Science.gov

Sample records for agent-based simulation framework

  1. A Multi Agent-Based Framework for Simulating Household PHEV Distribution and Electric Distribution Network Impact

    SciTech Connect

    Cui, Xiaohui; Liu, Cheng; Kim, Hoe Kyoung; Kao, Shih-Chieh; Tuttle, Mark A; Bhaduri, Budhendra L

    2011-01-01

    The variation of household attributes such as income, travel distance, age, number of household members, and education across residential areas may generate different market penetration rates for plug-in hybrid electric vehicles (PHEVs). Residential areas with higher PHEV ownership could increase peak electric demand locally and require utilities to upgrade the electric distribution infrastructure even though the capacity of the regional power grid is under-utilized. Estimating the future PHEV ownership distribution at the residential household level can help us understand the impact of the PHEV fleet on power line congestion, transformer overload, and other unforeseen problems at the local residential distribution network level. It can also help utilities manage the timing of recharging demand to maximize load factors and the utilization of existing distribution resources. This paper presents a multi agent-based simulation framework for 1) modeling the spatial distribution of PHEV ownership at the local residential household level, 2) discovering PHEV hot zones where PHEV ownership may quickly increase in the near future, and 3) estimating the impacts of increasing PHEV ownership on the local electric distribution network under different charging strategies. In this paper, we use Knox County, TN as a case study to show the simulation results of the agent-based model (ABM) framework. However, the framework can be easily applied to other local areas in the US.
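
    The concrete mechanics of such a framework can be illustrated with a short, hypothetical sketch (Python is used here; this is not the authors' code): household agents adopt PHEVs with a probability driven by income and commute distance, and the added evening load on a local feeder is compared under uncontrolled versus delayed charging. All weights, energy figures, and charger parameters are illustrative assumptions.

        import random

        random.seed(1)

        # Hypothetical household agents; the attribute weights below are illustrative only.
        def adoption_probability(income, commute_km):
            return min(0.6, 0.02 + 0.15 * (income / 100_000) + 0.05 * (commute_km / 50))

        def make_households(n=200):
            homes = []
            for _ in range(n):
                income = random.uniform(30_000, 150_000)
                commute = random.uniform(5, 60)
                homes.append({"commute_km": commute,
                              "has_phev": random.random() < adoption_probability(income, commute)})
            return homes

        def added_load_by_hour(households, charger_kw=6.6, strategy="uncontrolled"):
            """Extra feeder load (kW) per hour of one evening for a given charging strategy."""
            load = [0.0] * 24
            for h in (x for x in households if x["has_phev"]):
                start = 18 if strategy == "uncontrolled" else 22           # plug in on arrival vs. delayed
                hours = max(1, round(h["commute_km"] * 0.2 / charger_kw))  # ~0.2 kWh per km driven
                for t in range(start, min(24, start + hours)):
                    load[t] += charger_kw
            return load

        homes = make_households()
        for strategy in ("uncontrolled", "delayed"):
            peak = max(added_load_by_hour(homes, strategy=strategy))
            print(f"{strategy:>12}: evening peak += {peak:.0f} kW")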

  2. A Scaffolding Framework to Support Learning of Emergent Phenomena Using Multi-Agent-Based Simulation Environments

    NASA Astrophysics Data System (ADS)

    Basu, Satabdi; Sengupta, Pratim; Biswas, Gautam

    2015-04-01

    Students from middle school to college have difficulties in interpreting and understanding complex systems such as ecological phenomena. Researchers have suggested that students experience difficulties in reconciling the relationships between individuals, populations, and species, as well as the interactions between organisms and their environment in the ecosystem. Multi-agent-based computational models (MABMs) can explicitly capture agents and their interactions by representing individual actors as computational objects with assigned rules. As a result, the collective aggregate-level behavior of the population dynamically emerges from simulations that generate the aggregation of these interactions. Past studies have used a variety of scaffolds to help students learn ecological phenomena. Yet, there is no theoretical framework that supports the systematic design of scaffolds to aid students' learning in MABMs. Our paper addresses this issue by proposing a comprehensive framework for the design, analysis, and evaluation of scaffolding to support students' learning of ecology in a MABM. We present a study in which middle school students used a MABM to investigate and learn about a desert ecosystem. We identify the different types of scaffolds needed to support inquiry learning activities in this simulation environment and use our theoretical framework to demonstrate the effectiveness of our scaffolds in helping students develop a deep understanding of the complex ecological behaviors represented in the simulation.

  3. Agent-Based Spatiotemporal Simulation of Biomolecular Systems within the Open Source MASON Framework

    PubMed Central

    Pérez-Rodríguez, Gael; Pérez-Pérez, Martín; Glez-Peña, Daniel; Azevedo, Nuno F.; Lourenço, Anália

    2015-01-01

    Agent-based modelling is being used to represent biological systems with increasing frequency and success. This paper presents the implementation of a new tool for biomolecular reaction modelling in the open source Multiagent Simulator of Neighborhoods (MASON) framework. The rationale behind this new tool is the necessity to describe interactions at the molecular level in order to grasp emergent and meaningful biological behaviour. We are particularly interested in characterising and quantifying the various effects that facilitate biocatalysis. Enzymes may display high specificity for their substrates, and this information is crucial to the engineering and optimisation of bioprocesses. Simulation results demonstrate that molecule distributions, reaction rate parameters, and structural parameters can be adjusted separately in the simulation, allowing a comprehensive study of individual effects in the context of realistic cell environments. While a higher percentage of collisions resulting in reaction increases the affinity of the enzyme for the substrate, a faster reaction (i.e., a higher turnover number) leads to a smaller number of time steps. Slower diffusion rates and molecular crowding (physical hurdles) decrease the collision rate of reactants, hence reducing the reaction rate, as expected. The random distribution of molecules also affects the results significantly. PMID:25874228
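
    A minimal sketch of the individual effects the abstract separates, written in Python rather than in the MASON/Java tool itself: molecules diffuse in a continuous 2-D space, an enzyme-substrate collision triggers a reaction with a tunable probability standing in for affinity, and the step length can be shrunk to mimic crowding or slower diffusion. All parameter values are illustrative assumptions.

        import math, random

        random.seed(0)
        SIZE, RADIUS = 100.0, 1.0         # arena size and collision radius (arbitrary units)
        P_REACT, DIFF_STEP = 0.3, 1.5     # reaction probability per collision; diffusion step length

        def new_molecule(kind):
            return {"kind": kind, "x": random.uniform(0, SIZE), "y": random.uniform(0, SIZE)}

        molecules = [new_molecule("E") for _ in range(40)] + [new_molecule("S") for _ in range(200)]
        products = 0

        for step in range(500):
            # Diffusion: every molecule takes a random step (crowding would shrink DIFF_STEP).
            for m in molecules:
                angle = random.uniform(0, 2 * math.pi)
                m["x"] = (m["x"] + DIFF_STEP * math.cos(angle)) % SIZE
                m["y"] = (m["y"] + DIFF_STEP * math.sin(angle)) % SIZE
            # Collision handling: an enzyme-substrate pair within RADIUS reacts with probability P_REACT.
            substrates = [m for m in molecules if m["kind"] == "S"]
            for e in (m for m in molecules if m["kind"] == "E"):
                for s in substrates:
                    if (s["kind"] == "S"
                            and math.dist((e["x"], e["y"]), (s["x"], s["y"])) < RADIUS
                            and random.random() < P_REACT):
                        s["kind"] = "P"   # substrate converted to product
                        products += 1
                        break             # one catalytic event per enzyme per step

        print("products formed:", products)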

  4. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    DOE PAGES Beta

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-01-01

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  5. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    SciTech Connect

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-06-23

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  6. An agent-based framework for fuel cycle simulation with recycling

    SciTech Connect

    Gidden, M.J.; Wilson, P.P.H.; Huff, K.D.; Carlsen, R.W.

    2013-07-01

    Simulation of the nuclear fuel cycle is an established field with multiple players. Prior development work has utilized techniques such as system dynamics to provide a solution structure for the matching of supply and demand in these simulations. In general, however, simulation infrastructure development has occurred in relatively closed circles, with each effort having unique considerations as to the cases it is intended to model. Accordingly, individual simulators tend to have their design decisions driven by specific use cases. Presented in this work is a proposed supply and demand matching algorithm that leverages the techniques of the well-studied field of mathematical programming. A generic approach is achieved by treating facilities as individual entities and actors in the supply-demand market that express preferences among commodities. Using such a framework allows for varying levels of interaction fidelity, ranging from low-fidelity, quick solutions to high-fidelity solutions that model individual transactions (e.g. at the fuel-assembly level). The power of the technique is that it allows such flexibility while still treating the problem in a generic manner, encapsulating simulation engine design decisions in such a way that future simulation requirements can be relatively easily added when needed. (authors)
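
    The supply-demand matching idea can be sketched as a tiny transportation problem: facilities become the rows and columns, preferences are mapped to costs, and a linear program allocates the flows. The sketch below (using scipy's linprog) is an illustration under assumed numbers, not the authors' simulator or its actual formulation.

        import numpy as np
        from scipy.optimize import linprog

        # Two supplier facilities and two consumer facilities (illustrative numbers).
        supply = np.array([3.0, 2.0])              # e.g. tonnes of fuel each supplier can provide
        demand = np.array([2.5, 1.5])              # tonnes each consumer requests
        preference = np.array([[3.0, 1.0],         # preference[i, j]: supplier i -> consumer j
                               [1.0, 2.0]])
        cost = preference.max() + 1 - preference   # higher preference -> lower cost

        # Decision variables x[i, j], flattened row-major: x00, x01, x10, x11.
        c = cost.flatten()
        A_ub = [[1, 1, 0, 0],                      # supplier 0 cannot ship more than it has
                [0, 0, 1, 1]]                      # supplier 1 likewise
        A_eq = [[1, 0, 1, 0],                      # consumer 0 demand must be met exactly
                [0, 1, 0, 1]]                      # consumer 1 demand must be met exactly
        res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand, bounds=(0, None))
        print("flows (supplier x consumer):")
        print(res.x.reshape(2, 2))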

  7. An Agent-based Framework for Web Query Answering.

    ERIC Educational Resources Information Center

    Wang, Huaiqing; Liao, Stephen; Liao, Lejian

    2000-01-01

    Discusses discrepancies between user queries on the Web and the answers provided by information sources; proposes an agent-based framework for Web mining tasks; introduces an object-oriented deductive data model and a flexible query language; and presents a cooperative mechanism for query answering. (Author/LRW)

  8. SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling

    PubMed Central

    Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi

    2013-01-01

    Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster. PMID:24163721

  9. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
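
    A toy illustration of the resource-based idea contrasted with CPM, under assumptions that are entirely hypothetical: task duration is an output that emerges from remaining work, assigned staff, and productivity, and a mid-course corrective action (adding a person at a small productivity cost) changes that emergent duration.

        def simulate_task(work_hours=400.0, staff=2, hours_per_day=8.0,
                          productivity=1.0, deadline_days=20, corrective=True):
            """Duration is an OUTPUT: it emerges from work, staffing, and productivity."""
            day, remaining = 0, work_hours
            while remaining > 0:
                day += 1
                remaining -= staff * hours_per_day * productivity
                # Hypothetical management corrective action: add a person when the task is
                # trending late, at the cost of a small productivity dip from onboarding.
                if corrective and day == deadline_days // 2 and remaining > work_hours / 2:
                    staff += 1
                    productivity *= 0.9
            return day

        print("no corrective action  :", simulate_task(corrective=False), "days")
        print("with corrective action:", simulate_task(corrective=True), "days")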

  10. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    NASA Astrophysics Data System (ADS)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  11. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena that have specific properties, especially those that operate at large scales and exhibit dynamic, complex behaviours. Studying such phenomena in the laboratory is costly and in most cases impossible. Miniaturizing them within the framework of a model in order to simulate the real phenomena is therefore a reasonable, scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in many areas, for instance geographic information systems (GIS), biology, economics, social science, and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools) for geospatial modelling indicates growing interest among users in the special capabilities of ABMS. Because ABMS is inherently similar to human cognition, models can be built more easily and applied to a wider range of applications than traditional simulations. A key challenge for ABMS, however, is the difficulty of validation and verification: because of frequent emergent patterns, strong system dynamics, and the complex nature of ABMS, it is hard to validate and verify such models with conventional validation methods. Finding appropriate validation techniques for ABM is therefore necessary. In this paper, after reviewing the principles, concepts, and applications of ABM, the validation techniques and challenges of ABM validation are discussed.

  12. Tutorial on agent-based modeling and simulation.

    SciTech Connect

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2005-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS is a third way of doing science besides deductive and inductive reasoning. Computational advances have made possible a growing number of agent-based applications in a variety of fields. Applications range from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, to modeling consumer behavior and understanding the fall of ancient civilizations, to name a few. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing ABMS models, and provides some thoughts on the relationship between ABMS and traditional modeling techniques.

  13. Agent-based modeling and simulation Part 3 : desktop ABMS.

    SciTech Connect

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2007-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS 'is a third way of doing science,' in addition to traditional deductive and inductive reasoning (Axelrod 1997b). Computational advances have made possible a growing number of agent-based models across a variety of application domains. Applications range from modeling agent behavior in the stock market, supply chains, and consumer markets, to predicting the spread of epidemics, the threat of bio-warfare, and the factors responsible for the fall of ancient civilizations. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing agent models, and illustrates the development of a simple agent-based model of shopper behavior using spreadsheets.
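
    The tutorial's spreadsheet shopper model is not reproduced here, but a rough Python analogue of a desktop-scale ABM is sketched below under simple assumed rules: each shopper agent keeps a preferred store out of habit and occasionally switches to the cheaper one.

        import random

        random.seed(42)

        class Shopper:
            """Minimal agent: remembers a preferred store, occasionally shops on price."""
            def __init__(self):
                self.preferred = random.choice(["A", "B"])

            def choose(self, prices, loyalty=0.7):
                if random.random() < loyalty:                  # habit: keep the familiar store
                    return self.preferred
                self.preferred = min(prices, key=prices.get)   # otherwise pick the cheaper store
                return self.preferred

        shoppers = [Shopper() for _ in range(1000)]
        prices = {"A": 10.0, "B": 9.0}
        for week in range(1, 6):
            visits = {"A": 0, "B": 0}
            for s in shoppers:
                visits[s.choose(prices)] += 1
            print(f"week {week}: store A {visits['A']}, store B {visits['B']}")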

  14. An Agent-Based Modeling Framework and Application for the Generic Nuclear Fuel Cycle

    NASA Astrophysics Data System (ADS)

    Gidden, Matthew J.

    Key components of a novel methodology and implementation of an agent-based, dynamic nuclear fuel cycle simulator, Cyclus, are presented. The nuclear fuel cycle is a complex, physics-dependent supply chain. To date, existing dynamic simulators have not treated constrained fuel supply, time-dependent, isotopic-quality based demand, or fuel fungibility particularly well. Utilizing an agent-based methodology that incorporates sophisticated graph theory and operations research techniques can overcome these deficiencies. This work describes a simulation kernel and agents that interact with it, highlighting the Dynamic Resource Exchange (DRE), the supply-demand framework at the heart of the kernel. The key agent-DRE interaction mechanisms are described, which enable complex entity interaction through the use of physics and socio-economic models. The translation of an exchange instance to a variant of the Multicommodity Transportation Problem, which can be solved feasibly or optimally, follows. An extensive investigation of solution performance and fidelity is then presented. Finally, recommendations for future users of Cyclus and the DRE are provided.

  15. Agent-Based Modeling and Simulation on Emergency Evacuation

    NASA Astrophysics Data System (ADS)

    Ren, Chuanjun; Yang, Chenghui; Jin, Shiyao

    Crowd stampedes and panic-induced evacuations during emergencies often lead to fatalities as people are crushed, injured, or trampled. Such phenomena may be triggered in life-threatening situations such as fires or explosions in crowded buildings. Emergency evacuation simulation has recently attracted the interest of a rapidly increasing number of scientists. This paper presents an agent-based modeling and simulation, built with the Repast software, of crowd evacuation for emergency response from an area threatened by fire. Various types of agents and different agent attributes are designed, in contrast to traditional modeling. The attributes that govern the characteristics of the people are studied and tested by iterative simulations. Simulations are also conducted to demonstrate the effect of various agent parameters. Some interesting results were observed, such as "faster is slower" and the ignorance of available exits. Finally, the simulation results suggest practical ways of minimizing the harmful consequences of such events and the existence of an optimal escape strategy.

  16. Agent-based simulation of a financial market

    NASA Astrophysics Data System (ADS)

    Raberto, Marco; Cincotti, Silvano; Focardi, Sergio M.; Marchesi, Michele

    2001-10-01

    This paper introduces an agent-based artificial financial market in which heterogeneous agents trade one single asset through a realistic trading mechanism for price formation. Agents are initially endowed with a finite amount of cash and a given finite portfolio of assets. There is no money-creation process; the total available cash is conserved in time. In each period, agents make random buy and sell decisions that are constrained by available resources, subject to clustering, and dependent on the volatility of previous periods. The model proposed herein is able to reproduce the leptokurtic shape of the probability density of log price returns and the clustering of volatility. Implemented using extreme programming and object-oriented technology, the simulator is a flexible computational experimental facility that can find applications in both academic and industrial research projects.
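
    A compressed sketch of this class of model, with assumed rules rather than the authors' actual mechanism: agents with finite cash and shares submit random, resource-constrained orders whose size grows with recent volatility, and a crude price-impact rule stands in for the paper's realistic price-formation (and cash-conserving) trading mechanism.

        import math, random, statistics

        random.seed(7)
        N_AGENTS, STEPS = 100, 300
        price, prices = 100.0, [100.0]
        agents = [{"cash": 1000.0, "shares": 10} for _ in range(N_AGENTS)]

        for t in range(STEPS):
            # Recent volatility makes agents trade in larger blocks (the exact rule is made up).
            recent = prices[-20:]
            vol = statistics.pstdev(recent) / recent[-1] if len(recent) > 1 else 0.005
            max_qty = 1 + int(200 * vol)
            buy_qty = sell_qty = 0
            for a in agents:
                if random.random() < 0.5:                                  # buy attempt
                    qty = min(int(a["cash"] // price), random.randint(0, max_qty))
                    a["cash"] -= qty * price
                    a["shares"] += qty
                    buy_qty += qty
                else:                                                      # sell attempt
                    qty = min(a["shares"], random.randint(0, max_qty))
                    a["cash"] += qty * price
                    a["shares"] -= qty
                    sell_qty += qty
            # Crude stand-in for the paper's realistic price-formation mechanism:
            # excess demand pushes the (strictly positive) price up, excess supply down.
            price *= math.exp((buy_qty - sell_qty) / (N_AGENTS * 20))
            prices.append(price)

        log_returns = [math.log(prices[i + 1] / prices[i]) for i in range(len(prices) - 1)]
        print("final price %.2f   log-return std %.4f" % (price, statistics.pstdev(log_returns)))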

  17. Using Agent Based Modeling (ABM) to Develop Cultural Interaction Simulations

    NASA Technical Reports Server (NTRS)

    Drucker, Nick; Jones, Phillip N.

    2012-01-01

    Today, most cultural training is based on or built around "cultural engagements" or discrete interactions between the individual learner and one or more cultural "others". Often, success in the engagement is the end objective. In reality, these interactions usually involve secondary and tertiary effects with potentially wide-ranging consequences. The concern is that learning culture within a strict engagement context might lead to "checklist" cultural thinking that will not empower learners to understand the full consequences of their actions. We propose the use of agent-based modeling (ABM) to collect and store engagement effects and, by simulating social networks, propagate them over time, distance, and consequence. The ABM development allows for rapid modification to re-create any number of population types, extending the applicability of the model to any requirement for social modeling.

  18. Simulation of convoy of unmanned vehicles using agent based modeling

    NASA Astrophysics Data System (ADS)

    Sharma, Sharad; Singh, Harpreet; Gerhart, G. R.

    2007-10-01

    There has been increasing interest in unmanned vehicles, given their importance for defense and security. A few models of unmanned-vehicle convoys exist in the literature. The objective of this paper is to exploit an agent-based modeling technique for a convoy of unmanned vehicles in which each vehicle is an agent. Using this approach, the convoy of vehicles reaches a specified goal from a starting point. Each agent is associated with a number of sensors. The agents make intelligent decisions based on sensor inputs while maintaining their group capability and behavior. The simulation is done for a battlefield environment from a single starting point to a single goal. The approach can be extended to multiple starting points and multiple goals. The simulation gives the time taken by the convoy to reach a goal from its initial position. In the battlefield environment, commanders make various tactical decisions depending upon the location of an enemy outpost, minefields, the number of soldiers in platoons, and barriers. The simulation can help the commander make effective decisions, depending on the battlefield, the convoy, and obstacles, to reach a particular goal. The paper describes the proposed approach and gives the simulation results. The paper also gives problems for future research in this area.

  19. Agent-Based Modeling of the Immune System: NetLogo, a Promising Framework

    PubMed Central

    Chiacchio, Ferdinando; Russo, Giulia; Pappalardo, Francesco

    2014-01-01

    Several components that interact with each other to produce complex and, in some cases, unexpected behavior represent one of the main and most fascinating features of the mammalian immune system. Agent-based modeling and cellular automata belong to a class of discrete mathematical approaches in which entities (agents) sense local information and undertake actions over time according to predefined rules. The strength of this approach lies in the appearance of a global behavior that emerges from interactions among agents. This behavior is unpredictable, as it does not follow linear rules. Many works have investigated the immune system with agent-based modeling and cellular automata, and they have shown an ability to look clearly and intuitively into the nature of immunological processes. NetLogo is a multiagent programming language and modeling environment for simulating complex phenomena. It is designed for both research and education and is used across a wide range of disciplines and education levels. In this paper, we summarize NetLogo applications to immunology and, particularly, how this framework can help in the development and formulation of hypotheses that might drive further experimental investigations of disease mechanisms. PMID:24864263

  20. Agent-based modeling of the immune system: NetLogo, a promising framework.

    PubMed

    Chiacchio, Ferdinando; Pennisi, Marzio; Russo, Giulia; Motta, Santo; Pappalardo, Francesco

    2014-01-01

    Several components that interact with each other to produce complex and, in some cases, unexpected behavior represent one of the main and most fascinating features of the mammalian immune system. Agent-based modeling and cellular automata belong to a class of discrete mathematical approaches in which entities (agents) sense local information and undertake actions over time according to predefined rules. The strength of this approach lies in the appearance of a global behavior that emerges from interactions among agents. This behavior is unpredictable, as it does not follow linear rules. Many works have investigated the immune system with agent-based modeling and cellular automata, and they have shown an ability to look clearly and intuitively into the nature of immunological processes. NetLogo is a multiagent programming language and modeling environment for simulating complex phenomena. It is designed for both research and education and is used across a wide range of disciplines and education levels. In this paper, we summarize NetLogo applications to immunology and, particularly, how this framework can help in the development and formulation of hypotheses that might drive further experimental investigations of disease mechanisms. PMID:24864263

  1. Agent-based modeling to simulate the dengue spread

    NASA Astrophysics Data System (ADS)

    Deng, Chengbin; Tao, Haiyan; Ye, Zhiwei

    2008-10-01

    In this paper, we introduce agent-based modeling (ABM) as a novel method for simulating the unique process of dengue spread. Dengue is an acute infectious disease with a history of over 200 years. Unlike diseases that can be transmitted directly from person to person, dengue spreads only through its mosquito vector. There is still no specific effective medicine or vaccine for dengue. The best way to prevent dengue spread is to take precautions beforehand. Thus, it is crucial to detect and study the dynamic process of dengue spread, which closely relates to the human-environment interactions that ABM handles effectively. The model attempts to simulate dengue spread in a more realistic, bottom-up way and to overcome a limitation of ABM, namely the tendency to overlook geographic and environmental factors. Considering the influence of the environment, Aedes aegypti ecology, and other epidemiological characteristics of dengue, ABM can be regarded as a useful way to simulate the whole process so as to disclose the essence of the evolution of dengue spread.

  2. Agent based modeling of blood coagulation system: implementation using a GPU based high speed framework.

    PubMed

    Chen, Wenan; Ward, Kevin; Li, Qi; Kecman, Vojislav; Najarian, Kayvan; Menke, Nathan

    2011-01-01

    The coagulation and fibrinolytic systems are complex, inter-connected biological systems with major physiological roles. The complex, nonlinear, multi-point relationships between the molecular and cellular constituents of the two systems render a comprehensive and simultaneous study of the system at the microscopic and macroscopic levels a significant challenge. We have created an Agent Based Modeling and Simulation (ABMS) approach for simulating these complex interactions. As the number of agents increases, the time complexity and cost of the resulting simulations present a significant challenge. As such, in this paper we also present a high-speed framework for the coagulation simulation that utilizes the computing power of graphics processing units (GPUs). For comparison, we also implemented the simulations in NetLogo, Repast, and a direct C version. As our experiments demonstrate, at the million-agent scale the GPU implementation is over 10 times faster than the C version, over 100 times faster than the Repast version, and over 300 times faster than the NetLogo simulation. PMID:22254271

  3. Patient-centered appointment scheduling using agent-based simulation.

    PubMed

    Turkcan, Ayten; Toscos, Tammy; Doebbeling, Brad N

    2014-01-01

    Enhanced access and continuity are key components of patient-centered care. Existing studies show that several interventions, such as providing same-day appointments, walk-in services, after-hours care, and group appointments, have been used to redesign healthcare systems for improved access to primary care. However, an intervention focusing on a single component of care delivery (e.g., improving access to acute care) might have a negative impact on other components of the system (e.g., reduced continuity of care for chronic patients). Therefore, primary care clinics should consider implementing multiple interventions tailored to their patient population's needs. We collected rapid ethnography and observations to better understand clinic workflow and key constraints. We then developed an agent-based simulation model that includes all access modalities (appointments, walk-ins, and after-hours access), incorporates resources and key constraints, and determines the best appointment scheduling method for improving access and continuity of care. This paper demonstrates the value of simulation models for testing a variety of alternative strategies to improve access to care through scheduling. PMID:25954423

  4. Serious games experiment toward agent-based simulation

    USGS Publications Warehouse

    Wein, Anne; Labiosa, William

    2013-01-01

    We evaluate the potential for serious games to be used as a scientifically based decision-support product that supports the United States Geological Survey’s (USGS) mission--to provide integrated, unbiased scientific information that can make a substantial contribution to societal well-being for a wide variety of complex environmental challenges. Serious or pedagogical games are an engaging way to educate decisionmakers and stakeholders about environmental challenges that are usefully informed by natural and social scientific information and knowledge and can be designed to promote interactive learning and exploration in the face of large uncertainties, divergent values, and complex situations. We developed two serious games that use challenging environmental-planning issues to demonstrate and investigate the potential contributions of serious games to inform regional-planning decisions. Delta Skelta is a game emulating long-term integrated environmental planning in the Sacramento-San Joaquin Delta, California, that incorporates natural hazards (flooding and earthquakes) and consequences for California water supplies amidst conflicting water interests. Age of Ecology is a game that simulates interactions between economic and ecologic processes, as well as natural hazards while implementing agent-based modeling. The content of these games spans the USGS science mission areas related to water, ecosystems, natural hazards, land use, and climate change. We describe the games, reflect on design and informational aspects, and comment on their potential usefulness. During the process of developing these games, we identified various design trade-offs involving factual information, strategic thinking, game-winning criteria, elements of fun, number and type of players, time horizon, and uncertainty. We evaluate the two games in terms of accomplishments and limitations. Overall, we demonstrated the potential for these games to usefully represent scientific information

  5. Agent-Based Simulation for Interconnection-Scale Renewable Integration and Demand Response Studies

    SciTech Connect

    Chassin, David P.; Behboodi, Sahand; Crawford, Curran; Djilali, Ned

    2015-12-23

    This paper collects and synthesizes the technical requirements, implementation, and validation methods for quasi-steady agent-based simulations of interconnection-scale models with particular attention to the integration of renewable generation and controllable loads. Approaches for modeling aggregated controllable loads are presented and placed in the same control and economic modeling framework as generation resources for interconnection planning studies. Model performance is examined with system parameters that are typical for an interconnection approximately the size of the Western Electricity Coordinating Council (WECC) and a control area about 1/100 the size of the system. These results are used to demonstrate and validate the methods presented.

  6. Identifying Evacuees' Demand of Tsunami Shelters using Agent Based Simulation

    NASA Astrophysics Data System (ADS)

    Mas, E.; Adriano, B.; Koshimura, S.; Imamura, F.; Kuroiwa, J.; Yamazaki, F.; Zavala, C.; Estrada, M.

    2012-12-01

    Amongst the lessons learned from tsunami events such as the 2004 Indian Ocean and 2011 Great Tohoku Japan earthquakes is that nature sometimes exceeds structural countermeasures like seawalls, breakwaters, or tsunami gates. In such situations it is a challenging task for people in plain areas to find sheltering places. Vertical evacuation to multistory buildings is one alternative for providing sheltering areas in a complex evacuation environment. However, if the spatial distribution and the available capacity of these structures are not well displayed, conditions of evacuee over-demand or under-demand might be observed in several structures. In this study, we present the integration of tsunami numerical modeling and agent-based simulation of evacuation as a method to estimate the sheltering demand of evacuees in an emergent behavior approach. The case study is set in the La Punta district in Peru. Here, the tsunami simulation used a slip-distribution seismic source model (Pulido et al., 2011; Chlieh et al., 2011) for a possible future tsunami scenario in the central Andes. We modeled three evacuation alternatives. First, a horizontal evacuation scenario was analyzed to establish the need for a sheltering-in-place option in the district; second, a vertical evacuation scenario was conducted; and third, a combined vertical and horizontal evacuation scenario of pedestrians and vehicles was conducted. In the last two alternatives, the demand of evacuees was measured at each official tsunami evacuation building and compared to the sheltering capacity of the structure. Results showed that, out of twenty tsunami evacuation buildings, thirteen were over-subscribed and seven still had available space. It was also confirmed that in this case horizontal evacuation might lead to a high number of casualties due to traffic congestion at the neck of the district. Finally, vertical evacuation would be a suitable solution for this area

  7. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407

  8. Agent-Based Knowledge Discovery for Modeling and Simulation

    SciTech Connect

    Haack, Jereme N.; Cowell, Andrew J.; Marshall, Eric J.; Fligg, Alan K.; Gregory, Michelle L.; McGrath, Liam R.

    2009-09-15

    This paper describes an approach to using agent technology to extend the automated discovery mechanism of the Knowledge Encapsulation Framework (KEF). KEF is a suite of tools to enable the linking of knowledge inputs (relevant, domain-specific evidence) to modeling and simulation projects, as well as other domains that require an effective collaborative workspace for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a semantic wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  9. A Framework for Model-Based Inquiry Through Agent-Based Programming

    NASA Astrophysics Data System (ADS)

    Xiang, Lin; Passmore, Cynthia

    2015-04-01

    There has been increased recognition in the past decades that model-based inquiry (MBI) is a promising approach for cultivating deep understandings by helping students unite phenomena and underlying mechanisms. Although multiple technology tools have been used to improve the effectiveness of MBI, there are not enough detailed examinations of how agent-based programmable modeling (ABPM) tools influence students' MBI learning. The present collective case study sought to contribute by closely investigating ABPM-supported MBI processes for 8th grade students learning about natural selection and adaptation. Eight 8th grade students in groups of 2-3 spent 15 h during a span of 4 weeks collaboratively programming simulations of adaptation based on the natural selection model, using an ABPM tool named NetLogo. The entire programming processes of these learning groups, up to 50 h, were videotaped and then analyzed using mixed methods. Our analysis revealed that the programming task created a context that calls for nine types of MBI actions. These MBI actions were related to both phenomena and the underlying model. Results also showed that students' programming processes took place in consecutive programming cycles and aligned with iterative MBI cycles. A framework for ABPM-supported MBI learning is proposed based upon the findings. Implications in developing MBI instruction involving ABPM tools are discussed.

  10. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    ERIC Educational Resources Information Center

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  11. Integrated Agent-Based and Production Cost Modeling Framework for Renewable Energy Studies: Preprint

    SciTech Connect

    Gallo, Giulia

    2015-10-07

    The agent-based framework for renewable energy studies (ARES) is an integrated approach that adds an agent-based model of industry actors to PLEXOS and combines the strengths of the two to overcome their individual shortcomings. It can examine existing and novel wholesale electricity markets under high penetrations of renewables. ARES is demonstrated by studying how increasing levels of wind will impact the operations and the exercise of market power of generation companies that exploit an economic withholding strategy. The analysis is carried out on a test system that represents the Electric Reliability Council of Texas energy-only market in the year 2020. The results more realistically reproduce the operations of an energy market under different and increasing penetrations of wind, and ARES can be extended to address pressing issues in current and future wholesale electricity markets.

  12. Agent-Based Simulation for Interconnection-Scale Renewable Integration and Demand Response Studies

    DOE PAGES Beta

    Chassin, David P.; Behboodi, Sahand; Crawford, Curran; Djilali, Ned

    2015-12-23

    This paper collects and synthesizes the technical requirements, implementation, and validation methods for quasi-steady agent-based simulations of interconnection-scale models with particular attention to the integration of renewable generation and controllable loads. Approaches for modeling aggregated controllable loads are presented and placed in the same control and economic modeling framework as generation resources for interconnection planning studies. Model performance is examined with system parameters that are typical for an interconnection approximately the size of the Western Electricity Coordinating Council (WECC) and a control area about 1/100 the size of the system. These results are used to demonstrate and validate the methods presented.

  13. Agent-Based Crowd Simulation Considering Emotion Contagion for Emergency Evacuation Problem

    NASA Astrophysics Data System (ADS)

    Faroqi, H.; Mesgari, M.-S.

    2015-12-01

    During emergencies, emotions greatly affect human behaviour. For more realistic multi-agent systems in simulations of emergency evacuations, it is important to incorporate emotions and their effects on the agents. In a few words, emotional contagion is a process in which a person or group influences the emotions or behavior of another person or group through the conscious or unconscious induction of emotion states and behavioral attitudes. In this study, we simulate an emergency situation in an open square area with three exits, considering Adult and Child agents with different behaviors. Security agents are also included to guide Adults and Children to the exits and to calm them. Six emotion levels are considered for each agent across different scenarios and situations. The agent-based model is initialized by randomly scattering the agent populations; when an alarm occurs, each agent reacts to the situation based on its own and its neighbors' current circumstances. The main goal of each agent is first to find an exit, and then to help other agents find theirs. The numbers of evacuated and injured agents, along with their emotion levels, are compared across scenarios with different initializations in order to evaluate the results of the simulated model. NetLogo 5.2 is used as the multi-agent simulation framework, with the R language used for development.
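
    A stripped-down sketch of the emotion-contagion rule described above: the six-level scale and the agent types come from the abstract, while the update weights, interaction radius, and calming effect of Security agents are assumptions made for illustration.

        import random

        random.seed(3)
        STEPS, RADIUS = 30, 10.0

        def make_agent(kind):
            # Six discrete emotion (panic) levels: 0 = calm ... 5 = panicked.
            level = 0 if kind == "Security" else random.randint(1, 5)
            return {"kind": kind, "x": random.uniform(0, 100), "y": random.uniform(0, 100),
                    "emotion": level}

        agents = ([make_agent("Adult") for _ in range(40)] +
                  [make_agent("Child") for _ in range(15)] +
                  [make_agent("Security") for _ in range(5)])

        for step in range(STEPS):
            updated = []
            for a in agents:
                neigh = [b for b in agents if b is not a
                         and abs(a["x"] - b["x"]) < RADIUS and abs(a["y"] - b["y"]) < RADIUS]
                if a["kind"] == "Security" or not neigh:
                    updated.append(a["emotion"])
                    continue
                mean = sum(b["emotion"] for b in neigh) / len(neigh)
                # Contagion: drift toward the neighbourhood mean; children are assumed more
                # susceptible, and a nearby Security agent calms the agent by one level.
                rate = 0.5 if a["kind"] == "Child" else 0.3
                new = a["emotion"] + rate * (mean - a["emotion"])
                if any(b["kind"] == "Security" for b in neigh):
                    new -= 1.0
                updated.append(min(5, max(0, round(new))))
            for a, e in zip(agents, updated):
                a["emotion"] = e

        for kind in ("Adult", "Child", "Security"):
            levels = [a["emotion"] for a in agents if a["kind"] == kind]
            print(kind, "mean emotion:", round(sum(levels) / len(levels), 2))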

  14. An Agent-Based Optimization Framework for Engineered Complex Adaptive Systems with Application to Demand Response in Electricity Markets

    NASA Astrophysics Data System (ADS)

    Haghnevis, Moeed

    The main objective of this research is to develop an integrated method to study emergent behavior and consequences of evolution and adaptation in engineered complex adaptive systems (ECASs). A multi-layer conceptual framework and modeling approach including behavioral and structural aspects is provided to describe the structure of a class of engineered complex systems and predict their future adaptive patterns. The approach allows the examination of complexity in the structure and the behavior of components as a result of their connections and in relation to their environment. This research describes and uses the major differences between natural complex adaptive systems (CASs) and artificial/engineered CASs to build a framework and platform for ECAS. While this framework focuses on the critical factors of an engineered system, it also enables one to synthetically employ engineering and mathematical models to analyze and measure complexity in such systems. In this way concepts of complex systems science are adapted to management science and system of systems engineering. In particular an integrated consumer-based optimization and agent-based modeling (ABM) platform is presented that enables managers to predict and partially control patterns of behaviors in ECASs. Demonstrated on the U.S. electricity markets, ABM is integrated with normative and subjective decision behavior recommended by the U.S. Department of Energy (DOE) and Federal Energy Regulatory Commission (FERC). The approach integrates social networks, social science, complexity theory, and diffusion theory. Furthermore, it makes a unique and significant contribution by exploring and representing concrete managerial insights for ECASs and offering new optimized actions and modeling paradigms in agent-based simulation.

  15. An extensible simulation environment and movement metrics for testing walking behavior in agent-based models

    SciTech Connect

    Paul M. Torrens; Atsushi Nara; Xun Li; Haojie Zhu; William A. Griffin; Scott B. Brown

    2012-01-01

    Human movement is a significant ingredient of many social, environmental, and technical systems, yet the importance of movement is often discounted in considering systems complexity. Movement is commonly abstracted in agent-based modeling (which is perhaps the methodological vehicle for modeling complex systems), despite the influence of movement upon information exchange and adaptation in a system. In particular, agent-based models of urban pedestrians often treat movement in proxy form at the expense of faithfully treating movement behavior with realistic agency. There exists little consensus about which method is appropriate for representing movement in agent-based schemes. In this paper, we examine popularly-used methods to drive movement in agent-based models, first by introducing a methodology that can flexibly handle many representations of movement at many different scales and second, introducing a suite of tools to benchmark agent movement between models and against real-world trajectory data. We find that most popular movement schemes do a relatively poor job of representing movement, but that some schemes may well be 'good enough' for some applications. We also discuss potential avenues for improving the representation of movement in agent-based frameworks.

  16. Efficient Allocation of Resources for Defense of Spatially Distributed Networks Using Agent-Based Simulation.

    PubMed

    Kroshl, William M; Sarkani, Shahram; Mazzuchi, Thomas A

    2015-09-01

    This article presents ongoing research that focuses on the efficient allocation of defense resources to minimize the damage inflicted by an active adversary on a spatially distributed physical network such as a pipeline, water system, or power distribution system. The work recognizes the fundamental difference between preparing for natural disasters such as hurricanes, earthquakes, or even accidental system failures and allocating resources to defend against an opponent who is aware of, and anticipating, the defender's efforts to mitigate the threat. Our approach is to utilize a combination of integer programming and agent-based modeling to allocate the defensive resources. We conceptualize the problem as a Stackelberg "leader-follower" game in which the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, which then informs an evolutionary agent-based simulation. The evolutionary agent-based simulation is used to determine the allocation of resources for attackers and defenders that results in evolutionarily stable strategies, where actions by either side alone cannot increase its share of victories. We demonstrate these techniques on an example network, comparing the evolutionary agent-based results to a more traditional, probabilistic risk analysis (PRA) approach. Our results show that the agent-based approach results in a greater percentage of defender victories than does the PRA-based approach. PMID:25683347
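
    The Stackelberg structure can be illustrated with a toy minimax sketch (this is not the article's interdiction model or its evolutionary simulation): the defender commits to protecting a subset of arcs, the attacker then destroys the most damaging unprotected arc, and the defender picks the protection set that minimizes that best response. The arc damages and budget are made-up numbers.

        from itertools import combinations

        # Hypothetical network arcs and the damage inflicted if each one is destroyed.
        arc_damage = {"A-B": 8, "B-C": 5, "C-D": 9, "B-E": 3, "E-D": 6}
        DEFENSE_BUDGET = 2            # number of arcs the defender can protect

        def attacker_best_response(protected):
            """Follower: destroy the unprotected arc with the highest damage (0 if none left)."""
            exposed = {arc: dmg for arc, dmg in arc_damage.items() if arc not in protected}
            return max(exposed.values(), default=0)

        # Leader: enumerate every defense of the allowed size and keep the one that
        # minimises the damage of the attacker's best response (a minimax choice).
        best = min(combinations(arc_damage, DEFENSE_BUDGET), key=attacker_best_response)
        print("protect arcs:", best, "-> worst-case damage:", attacker_best_response(best))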

  17. Developing Framework for Agent- Based Diabetes Disease Management System: User Perspective

    PubMed Central

    Mohammadzadeh, Niloofar; Safdari, Reza; Rahimi, Azin

    2014-01-01

    Background: One of the characteristics of agents is mobility, which makes them very suitable for remote electronic health and telemedicine. The aim of this study is to develop a framework for agent-based diabetes information management at the national level by identifying the required agents. Methods: The main tool is a questionnaire designed in three sections, based on a study of library resources, the performance of major organizations in the field of diabetes inside and outside the country, and interviews with experts in the medical, health information management, and software fields. Questionnaires based on Delphi methods were distributed among 20 experts. In order to design and identify the agents required in health information management for the prevention and appropriate, rapid treatment of diabetes, the results were analyzed using SPSS 17 and plotted with the FREEPLANE mind map software. Results: Access to data technology in the proposed framework, in order of priority, is: mobile (mean 1.80), SMS and email (mean 2.80), internet and web (mean 3.30), phone (mean 3.60), and WiFi (mean 4.60). Conclusions: In delivering health care to diabetic patients, considering social and human aspects is essential. A systematic view of the implementation of agent systems is necessary, paying attention to all aspects such as feedback, user acceptance, budget, motivation, hierarchy, useful standards, affordability for individuals, and the identification of barriers and opportunities. PMID:24757407

  18. Evaluation of wholesale electric power market rules and financial risk management by agent-based simulations

    NASA Astrophysics Data System (ADS)

    Yu, Nanpeng

    In this dissertation, basic financial risk management concepts relevant to wholesale electric power markets are carefully explained and illustrated. In addition, the financial risk management problem in wholesale electric power markets is generalized as a four-stage process. Within the proposed financial risk management framework, the critical problem of financial bilateral contract negotiation is addressed. This dissertation analyzes a financial bilateral contract negotiation process between a generating company and a load-serving entity in a wholesale electric power market with congestion managed by locational marginal pricing. Nash bargaining theory is used to model a Pareto-efficient settlement point. The model predicts negotiation results under varied conditions and identifies circumstances in which the two parties might fail to reach an agreement. Both analysis and agent-based simulation are used to gain insight regarding how relative risk aversion and biased price estimates influence negotiated outcomes. These results should provide useful guidance to market participants in their bilateral contract negotiation processes.
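
    A minimal numerical sketch of the Nash bargaining settlement point referred to above: the contract price that maximizes the product of the two parties' utility gains over their disagreement payoffs. The utility form, risk-aversion coefficients, quantities, and price grid below are assumptions for illustration, not the author's model.

        import math

        # Hypothetical setup: a GenCo sells Q MWh to an LSE at a contract price p ($/MWh).
        Q, COST, RETAIL = 100.0, 30.0, 70.0      # quantity, generation cost, LSE resale value
        D_GEN, D_LSE = 500.0, 500.0              # disagreement (no-contract) payoffs
        A_GEN, A_LSE = 0.002, 0.004              # risk-aversion coefficients (LSE more averse)

        def utility(profit, aversion):
            """Simple concave (risk-averse) utility of profit."""
            return 1 - math.exp(-aversion * profit)

        def nash_product(p):
            gain_gen = utility(Q * (p - COST), A_GEN) - utility(D_GEN, A_GEN)
            gain_lse = utility(Q * (RETAIL - p), A_LSE) - utility(D_LSE, A_LSE)
            if gain_gen <= 0 or gain_lse <= 0:   # no agreement unless both parties gain
                return -1.0
            return gain_gen * gain_lse

        prices = [30 + 0.1 * i for i in range(401)]          # search $30..$70 in $0.10 steps
        settlement = max(prices, key=nash_product)
        print("negotiated price: $%.2f/MWh  (Nash product %.4f)" % (settlement, nash_product(settlement)))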

  19. An agent-based simulation of extirpation of Ceratitis capitata applied to invasions in California

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We describe and validate an Agent-Based Simulation(ABS) of invasive insects and use it to investigate the time to extirpation of Ceratitis capitata using data from seven outbreaks that occurred in California from 2008-2010. Results are compared with the length of intervention and quarantine imposed ...

  20. Simulating tissue mechanics with agent-based models: concepts, perspectives and some novel results

    NASA Astrophysics Data System (ADS)

    Van Liedekerke, P.; Palm, M. M.; Jagiella, N.; Drasdo, D.

    2015-12-01

    In this paper we present an overview of agent-based models that are used to simulate mechanical and physiological phenomena in cells and tissues, and we discuss the underlying concepts, limitations, and future perspectives of these models. As interest in cell and tissue mechanics increases, agent-based models are becoming more common in the modeling community. We overview the physical aspects, complexity, shortcomings, and capabilities of the major agent-based model categories: lattice-based models (cellular automata, lattice gas cellular automata, cellular Potts models), off-lattice models (center-based models, deformable cell models, vertex models), and hybrid discrete-continuum models. In this way, we hope to assist future researchers in choosing a model for the phenomenon they want to model and understand. The article also contains some novel results.

  1. Collaborative Multi-Agent Based Simulations: Stakeholder-Focused Innovation in Water Resources Management and Decision-Support Modeling

    NASA Astrophysics Data System (ADS)

    Kock, B. E.

    2006-12-01

    The combined use of multi-agent based simulations and collaborative modeling approaches is emerging as a highly effective tool for representing complex coupled social-biophysical water resource systems. A collaboratively-designed, multi-agent based simulation can be used both as a decision-support tool and as a didactic method for improving stakeholder understanding and engagement with water resources policymaking and management. Major technical and non-technical obstacles remain to the efficient and effective development of multi-agent models of human society, to integrating these models with GIS and other numerical models, and to building a process for engaging stakeholders with model design, implementation and use. It is proposed here to tackle some of these obstacles through a collaborative multi-agent based simulation process framework, intended for practical use in resolving disputes and environmental challenges over sustainable irrigated agriculture in the Western United States. A practical implementation of this framework will be conducted in collaboration with a diverse stakeholder group representing farmers and local, state and federal water managers. Through the use of simulation gaming, interviewing and computer-based knowledge elicitation, a multi-agent model representing local and regional social dynamics will be developed to support the acceptable and sustainable implementation of management alternatives for reducing regional problems of salinization and high selenium concentrations in soils and irrigation water. The development of a socially and scientifically credible simulation platform in this setting can make a significant contribution to ensuring the non-adversarial use of high quality science, enhance the engagement of stakeholders with policymaking, and help meet the challenges of integrating dynamic models of human society with more traditional biophysical systems models.

  2. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    NASA Technical Reports Server (NTRS)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

    Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.

  3. A Participatory Agent-Based Simulation for Indoor Evacuation Supported by Google Glass.

    PubMed

    Sánchez, Jesús M; Carrera, Álvaro; Iglesias, Carlos Á; Serrano, Emilio

    2016-01-01

    Indoor evacuation systems are needed for rescue and safety management. One of the challenges is to provide users with personalized evacuation routes in real time. To this end, this project aims at exploring the possibilities of Google Glass technology for participatory multiagent indoor evacuation simulations. Participatory multiagent simulation combines scenario-guided agents and humans equipped with Google Glass that coexist in a shared virtual space and jointly perform simulations. The paper proposes an architecture for participatory multiagent simulation in order to combine devices (Google Glass and/or smartphones) with an agent-based social simulator and indoor tracking services. PMID:27563911

  4. Use of agent-based simulations to design and interpret HIV clinical trials.

    PubMed

    Cuadros, Diego F; Abu-Raddad, Laith J; Awad, Susanne F; García-Ramos, Gisela

    2014-07-01

    In this study, we illustrate the utility of an agent-based simulation to inform a trial design and how this supports outcome interpretation of randomized controlled trials (RCTs). We developed agent-based Monte Carlo models to simulate existing landmark HIV RCTs, such as the Partners in Prevention HSV/HIV Transmission Study. We simulated a variation of this study using valacyclovir therapy as the intervention, and we used a male circumcision RCT based on the Rakai Male Circumcision Trial. Our results indicate that a small fraction (20%) of the simulated Partners in Prevention HSV/HIV Transmission Study realizations rejected the null hypothesis, which was no effect from the intervention. Our results also suggest that an RCT designed to evaluate the effectiveness of a more potent drug regimen for HSV-2 suppression (valacyclovir therapy) is more likely to identify the efficacy of the intervention. For the male circumcision RCT simulation, the greater biological effect of male circumcision yielded a large fraction (81%) of RCT realizations that rejected the null hypothesis, which was no effect from the intervention. Our study highlights how agent-based simulations synthesize individual variation in the epidemiological context of the RCT. This methodology will be particularly useful for designing RCTs aimed at evaluating combination prevention interventions in community-based RCTs, wherein an intervention's effectiveness is challenging to predict. PMID:24792492

  5. Comparing stochastic differential equations and agent-based modelling and simulation for early-stage cancer.

    PubMed

    Figueredo, Grazziela P; Siebers, Peer-Olaf; Owen, Markus R; Reps, Jenna; Aickelin, Uwe

    2014-01-01

    There is great potential to be explored regarding the use of agent-based modelling and simulation as an alternative paradigm to investigate early-stage cancer interactions with the immune system. It does not suffer from some limitations of ordinary differential equation models, such as the lack of stochasticity, of representation of individual behaviours rather than aggregates, and of individual memory. In this paper we investigate the potential contribution of agent-based modelling and simulation when contrasted with stochastic versions of ODE models using early-stage cancer examples. We seek answers to the following questions: (1) Does this new stochastic formulation produce similar results to the agent-based version? (2) Can these methods be used interchangeably? (3) Do agent-based model outcomes reveal any benefit when compared to the Gillespie results? To answer these research questions we investigate three well-established mathematical models describing interactions between tumour cells and immune elements. These case studies were re-conceptualised under an agent-based perspective and also converted to the Gillespie algorithm formulation. Our interest in this work, therefore, is to establish a methodological discussion regarding the usability of different simulation approaches, rather than provide further biological insights into the investigated case studies. Our results show that it is possible to obtain equivalent models that implement the same mechanisms; however, the incapacity of the Gillespie algorithm to retain individual memory of past events affects the similarity of some results. Furthermore, the emergent behaviour of ABMS produces extra patterns of behaviour in the system, which were not obtained by the Gillespie algorithm. PMID:24752131
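    For readers unfamiliar with the Gillespie formulation contrasted in this record, the following is a minimal, generic stochastic simulation of a one-species birth-death process (an illustrative sketch with assumed rates, not the tumour-immune case studies of the paper):

```python
import random

def gillespie_birth_death(n0=10, birth=0.5, death=0.4, t_end=20.0):
    """Exact stochastic simulation of a birth-death process with rates birth*N and death*N."""
    t, n, trajectory = 0.0, n0, [(0.0, n0)]
    while t < t_end and n > 0:
        rates = [birth * n, death * n]          # propensities of the two reactions
        total = sum(rates)
        t += random.expovariate(total)          # exponentially distributed time to next event
        if random.random() < rates[0] / total:  # choose which reaction fires
            n += 1
        else:
            n -= 1
        trajectory.append((t, n))
    return trajectory

print(gillespie_birth_death()[-1])              # final (time, population) of one realisation
```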

  6. Agent-based simulation of building evacuation using a grid graph-based model

    NASA Astrophysics Data System (ADS)

    Tan, L.; Lin, H.; Hu, M.; Che, W.

    2014-02-01

    Shifting from macroscopic models to microscopic models, the agent-based approach has been widely used to model crowd evacuation as more attention is paid to individualized behaviour. Since indoor evacuation behaviour is closely related to spatial features of the building, effective representation of indoor space is essential for the simulation of building evacuation. The traditional cell-based representation has limitations in reflecting spatial structure and is not suitable for topology analysis. Aiming at incorporating powerful topology analysis functions of GIS to facilitate agent-based simulation of building evacuation, we used a grid graph-based model in this study to represent the indoor space. Such a model allows us to establish an evacuation network at a micro level. Potential escape routes from each node thus could be analysed through GIS functions of network analysis considering both the spatial structure and route capacity. This would better support agent-based modelling of evacuees' behaviour including route choice and local movements. As a case study, we conducted a simulation of emergency evacuation from the second floor of an office building using Agent Analyst as the simulation platform. The results demonstrate the feasibility of the proposed method, as well as the potential of GIS in visualizing and analysing simulation results.
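    To make the grid graph idea concrete, here is a minimal sketch (illustrative only; the floor layout, exits, and breadth-first search stand in for the authors' GIS-based network analysis) that computes a shortest escape route over a grid of walkable and blocked cells:

```python
from collections import deque

# 0 = walkable cell, 1 = wall/obstacle; exits are assumed grid nodes
floor = [[0, 0, 0, 1, 0],
         [1, 1, 0, 1, 0],
         [0, 0, 0, 0, 0],
         [0, 1, 1, 1, 0]]
exits = {(0, 0), (3, 4)}

def escape_route(start):
    """Breadth-first search over the grid graph; returns a shortest path to any exit."""
    queue, parent = deque([start]), {start: None}
    while queue:
        node = queue.popleft()
        if node in exits:                       # reconstruct the route back to the start
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        i, j = node
        for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            ok = 0 <= a < len(floor) and 0 <= b < len(floor[0]) and floor[a][b] == 0
            if ok and (a, b) not in parent:
                parent[(a, b)] = node
                queue.append((a, b))
    return None                                 # no exit reachable from this node

print(escape_route((3, 0)))
```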

  7. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    SciTech Connect

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of diabetes type 2 was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. In this model hierarchy, the system dynamics model, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. Moreover, the four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor, and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translate complex system dynamics models into agent-based model alternatives that are both conceptually simpler and capable of capturing main effects of complex local agent-agent interactions.
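    The behavioural rule summarised above can be pictured with a small illustrative sketch in which an agent's intention is a weighted sum of its own attitude and the share of its social network already performing the behaviour; all names, weights, and thresholds below are assumptions, not the published model:

```python
import random

random.seed(1)
N = 100
attitude = [random.random() for _ in range(N)]             # individual attitude in [0, 1]
behavior = [a > 0.7 for a in attitude]                     # initially only strong attitudes act
network = [random.sample([j for j in range(N) if j != i], 5) for i in range(N)]  # 5 contacts each

W_ATTITUDE, W_NORM, THRESHOLD = 0.6, 0.4, 0.5               # assumed weights and cutoff

for step in range(20):
    new_behavior = []
    for i in range(N):
        norm = sum(behavior[j] for j in network[i]) / len(network[i])  # share of contacts acting
        intention = W_ATTITUDE * attitude[i] + W_NORM * norm
        new_behavior.append(intention > THRESHOLD)
    behavior = new_behavior

print("agents adopting the healthy behavior:", sum(behavior))
```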

  8. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE PAGES Beta

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of diabetes type 2 was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. In this model hierarchy, the system dynamics model, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. Moreover, the four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor, and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translate complex system dynamics models into agent-based model alternatives that are both conceptually simpler and capable of capturing main effects of complex local agent-agent interactions.

  9. An Agent-Based Model of New Venture Creation: Conceptual Design for Simulating Entrepreneurship

    NASA Technical Reports Server (NTRS)

    Provance, Mike; Collins, Andrew; Carayannis, Elias

    2012-01-01

    There is a growing debate over the means by which regions can foster the growth of entrepreneurial activity in order to stimulate recovery and growth of their economies. On one side, agglomeration theory suggests that regions grow because of strong clusters that foster knowledge spillover locally; on the other side, the entrepreneurial action camp argues that innovative business models are generated by entrepreneurs with unique market perspectives who draw on knowledge from more distant domains. We present the design of a novel agent-based model of new venture creation that will demonstrate the relationship between agglomeration and action. The primary focus of this model is information exchange as the medium for these agent interactions. Our modeling and simulation study proposes to reveal interesting relationships in these perspectives, offer a foundation on which these disparate theories from economics and sociology can find common ground, and expand the use of agent-based modeling into entrepreneurship research.

  10. Tutorial on agent-based modeling and simulation. Part 2 : how to model with agents.

    SciTech Connect

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2006-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of interacting autonomous agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to do research. Some have gone so far as to contend that ABMS is a new way of doing science. Computational advances make possible a growing number of agent-based applications across many fields. Applications range from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, from modeling the growth and decline of ancient civilizations to modeling the complexities of the human immune system, and many more. This tutorial describes the foundations of ABMS, identifies ABMS toolkits and development methods illustrated through a supply chain example, and provides thoughts on the appropriate contexts for ABMS versus conventional modeling techniques.
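    As a complement to the tutorial's supply chain example, a minimal generic agent-plus-scheduler skeleton might look like the following sketch (an assumed toy ordering rule, not the toolkit code the tutorial describes):

```python
import random

class Agent:
    """A minimal autonomous agent with local state and a step rule."""
    def __init__(self, name):
        self.name = name
        self.stock = 10

    def step(self, world):
        # toy behaviour: order from a random other agent when stock runs low
        if self.stock < 5:
            supplier = random.choice([a for a in world if a is not self])
            shipped = min(3, supplier.stock)
            supplier.stock -= shipped
            self.stock += shipped
        self.stock = max(0, self.stock - random.randint(0, 2))  # daily demand drains stock

agents = [Agent(f"firm-{i}") for i in range(5)]
for tick in range(30):                           # simple round-robin scheduler
    for agent in agents:
        agent.step(agents)

print({a.name: a.stock for a in agents})
```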

  11. Graceful Failure and Societal Resilience Analysis Via Agent-Based Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Schopf, P. S.; Cioffi-Revilla, C.; Rogers, J. D.; Bassett, J.; Hailegiorgis, A. B.

    2014-12-01

    Agent-based social modeling is opening up new methodologies for the study of societal response to weather and climate hazards, and providing measures of resiliency that can be studied in many contexts, particularly in coupled human and natural-technological systems (CHANTS). Since CHANTS are complex adaptive systems, societal resiliency may or may not occur, depending on dynamics that lack closed form solutions. Agent-based modeling has been shown to provide a viable theoretical and methodological approach for analyzing and understanding disasters and societal resiliency in CHANTS. Our approach advances the science of societal resilience through computational modeling and simulation methods that complement earlier statistical and mathematical approaches. We present three case studies of social dynamics modeling that demonstrate the use of these agent based models. In Central Asia, we examine multiple ensemble simulations with varying climate statistics to see how droughts and zuds affect populations, transmission of wealth across generations, and the overall structure of the social system. In Eastern Africa, we explore how successive episodes of drought events affect the adaptive capacity of rural households. Human displacement, mainly rural-to-urban migration, and livelihood transitions, particularly from pastoral to farming, are observed as rural households interact dynamically with the biophysical environment and continually adjust their behavior to accommodate changes in climate. In the far north case we demonstrate one of the first successful attempts to model the complete climate-permafrost-infrastructure-societal interaction network as a complex adaptive system/CHANTS, implemented as a "federated" agent-based model using evolutionary computation. Analysis of population changes resulting from extreme weather across these and other cases provides evidence for the emergence of new steady states and shifting patterns of resilience.

  12. A Unified Multiscale Field/Network/Agent Based Modeling Framework for Human and Ecological Health Risk Analysis

    PubMed Central

    Georgopoulos, Panos G.; Isukapalli, Sastry S.

    2011-01-01

    A conceptual framework is presented for multiscale field/network/agent-based modeling to support human and ecological health risk assessments. This framework is based on the representation of environmental dynamics in terms of interacting networks, agents that move across different networks, fields representing spatiotemporal distributions of physical properties, rules governing constraints and interactions, and actors that make decisions affecting the state of the system. Different deterministic and stochastic modeling case studies focusing on environmental exposures and associated risks are provided as examples, utilizing the bidirectional mapping between discrete, agent based approaches and continuous, equation based approaches. These examples include problems describing human health risk assessment, ecological risk assessment, and environmentally caused disease. PMID:19964423

  13. An Agent-Based Framework for Building Decision Support System in Supply Chain Management

    NASA Astrophysics Data System (ADS)

    Kazemi, A.; Fazel Zarandi, M. H.

    In this study, two scenarios are presented for solving the Production-Distribution Planning Problem (PDPP) in a Decision Support System (DSS) framework. In the first scenario, a Traditional Decision Support System (TDSS) is presented for the PDPP and a Genetic Algorithm (GA) is used to solve it. In the second scenario, a Multi-agent Decision Support System (MADSS) is considered for the PDPP and three algorithms are used to solve it: Genetic Algorithm (GA), Tabu Search (TS), and Simulated Annealing (SA). An algorithm is then suggested using a multi-agent system and the A-Teams concept. The results reveal that the MADSS delivers better solutions.
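    To illustrate one of the metaheuristics listed in this record, here is a generic simulated annealing loop with a made-up one-dimensional cost function (a sketch only, not the authors' PDPP formulation):

```python
import math, random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995, steps=5000):
    """Generic SA: accept worse moves with probability exp(-delta/T)."""
    x, best, t = x0, x0, t0
    for _ in range(steps):
        y = neighbor(x)
        delta = cost(y) - cost(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = y                                # accept the candidate move
        if cost(x) < cost(best):
            best = x                             # keep track of the best solution seen
        t *= cooling                             # geometric cooling schedule
    return best

# toy example: minimise a one-dimensional "production cost" curve (purely illustrative)
cost = lambda q: (q - 42) ** 2 + 10 * math.sin(q)
neighbor = lambda q: q + random.uniform(-1, 1)
print(round(simulated_annealing(cost, neighbor, x0=0.0), 2))
```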

  14. Using an agent-based model to simulate children’s active travel to school

    PubMed Central

    2013-01-01

    Background Despite the multiple advantages of active travel to school, only a small percentage of US children and adolescents walk or bicycle to school. Intervention studies are in a relatively early stage and evidence of their effectiveness over long periods is limited. The purpose of this study was to illustrate the utility of agent-based models in exploring how various policies may influence children’s active travel to school. Methods An agent-based model was developed to simulate children’s school travel behavior within a hypothetical city. The model was used to explore the plausible implications of policies targeting two established barriers to active school travel: long distance to school and traffic safety. The percent of children who walk to school was compared for various scenarios. Results To maximize the percent of children who walk to school, school locations should be evenly distributed over space and children should be assigned to the closest school. In the case of interventions to improve traffic safety, targeting a smaller area around the school with greater intensity may be more effective than targeting a larger area with less intensity. Conclusions Despite the challenges they present, agent-based models are a useful complement to other analytical strategies in studying the plausible impact of various policies on active travel to school. PMID:23705953

  15. Model reduction for agent-based social simulation: coarse-graining a civil violence model.

    PubMed

    Zou, Yu; Fonoberov, Vladimir A; Fonoberova, Maria; Mezic, Igor; Kevrekidis, Ioannis G

    2012-06-01

    Agent-based modeling (ABM) constitutes a powerful computational tool for the exploration of phenomena involving emergent dynamic behavior in the social sciences. This paper demonstrates a computer-assisted approach that bridges the significant gap between the single-agent microscopic level and the macroscopic (coarse-grained population) level, where fundamental questions must be rationally answered and policies guiding the emergent dynamics devised. Our approach will be illustrated through an agent-based model of civil violence. This spatiotemporally varying ABM incorporates interactions between a heterogeneous population of citizens [active (insurgent), inactive, or jailed] and a population of police officers. Detailed simulations exhibit an equilibrium punctuated by periods of social upheavals. We show how to effectively reduce the agent-based dynamics to a stochastic model with only two coarse-grained degrees of freedom: the number of jailed citizens and the number of active ones. The coarse-grained model captures the ABM dynamics while drastically reducing the computation time (by a factor of approximately 20). PMID:23005161
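    Civil-violence ABMs of this type typically use an Epstein-style threshold rule in which a citizen turns active when grievance exceeds perceived risk; the following non-spatial sketch uses assumed parameter values and a crude global arrest-probability estimate rather than the paper's spatial model:

```python
import math, random

random.seed(0)
N_CITIZENS, N_COPS = 500, 40
VISION_K, RISK_AVERSION, THRESHOLD, LEGITIMACY = 2.3, 0.5, 0.1, 0.6   # assumed values

citizens = [{"hardship": random.random(), "state": "inactive", "jail": 0}
            for _ in range(N_CITIZENS)]

for step in range(100):
    active = sum(c["state"] == "active" for c in citizens)
    # crude global estimate of arrest probability from the cop/active ratio
    p_arrest = 1 - math.exp(-VISION_K * N_COPS / max(active, 1))
    for c in citizens:
        if c["jail"] > 0:                        # jailed agents just serve their term
            c["jail"] -= 1
            continue
        grievance = c["hardship"] * (1 - LEGITIMACY)
        net_risk = RISK_AVERSION * p_arrest
        c["state"] = "active" if grievance - net_risk > THRESHOLD else "inactive"
    # each cop arrests at most one active citizen per step
    rebels = [c for c in citizens if c["state"] == "active"]
    for c in random.sample(rebels, min(N_COPS, len(rebels))):
        c["state"], c["jail"] = "jailed", random.randint(1, 15)

print(sum(c["state"] == "active" for c in citizens), "active citizens at the end")
```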

  16. Model reduction for agent-based social simulation: Coarse-graining a civil violence model

    NASA Astrophysics Data System (ADS)

    Zou, Yu; Fonoberov, Vladimir A.; Fonoberova, Maria; Mezic, Igor; Kevrekidis, Ioannis G.

    2012-06-01

    Agent-based modeling (ABM) constitutes a powerful computational tool for the exploration of phenomena involving emergent dynamic behavior in the social sciences. This paper demonstrates a computer-assisted approach that bridges the significant gap between the single-agent microscopic level and the macroscopic (coarse-grained population) level, where fundamental questions must be rationally answered and policies guiding the emergent dynamics devised. Our approach will be illustrated through an agent-based model of civil violence. This spatiotemporally varying ABM incorporates interactions between a heterogeneous population of citizens [active (insurgent), inactive, or jailed] and a population of police officers. Detailed simulations exhibit an equilibrium punctuated by periods of social upheavals. We show how to effectively reduce the agent-based dynamics to a stochastic model with only two coarse-grained degrees of freedom: the number of jailed citizens and the number of active ones. The coarse-grained model captures the ABM dynamics while drastically reducing the computation time (by a factor of approximately 20).

  17. Quantitative agent-based firm dynamics simulation with parameters estimated by financial and transaction data analysis

    NASA Astrophysics Data System (ADS)

    Ikeda, Yuichi; Souma, Wataru; Aoyama, Hideaki; Iyetomi, Hiroshi; Fujiwara, Yoshi; Kaizoji, Taisei

    2007-03-01

    Firm dynamics on a transaction network is considered from the standpoint of econophysics, agent-based simulations, and game theory. In this model, interacting firms rationally invest in a production facility to maximize net present value. We estimate parameters used in the model through empirical analysis of financial and transaction data. We propose two different methods (an analytical method and a regression method) to obtain an interaction matrix of firms. On a subset of a real transaction network, we simulate firms' revenue, cost, and fixed assets, the last being the accumulated investment in the production facility. The simulation reproduces the quantitative behavior of past revenues and costs within a standard error when we use the interaction matrix estimated by the regression method, in which only transaction pairs are taken into account. Furthermore, the simulation qualitatively reproduces past data of fixed assets.

  18. Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks.

    PubMed

    Walpole, J; Chappell, J C; Cluceru, J G; Mac Gabhann, F; Bautch, V L; Peirce, S M

    2015-09-01

    Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes, a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods. PMID:26158406

  19. An Agent-Based Labor Market Simulation with Endogenous Skill-Demand

    NASA Astrophysics Data System (ADS)

    Gemkow, S.

    This paper considers an agent-based labor market simulation to examine the influence of skills on wages and unemployment rates. To this end, less skilled and highly skilled workers as well as less productive and highly productive vacancies are implemented. The skill distribution is exogenous, whereas the distribution of the less and highly productive vacancies is endogenous. The different opportunities of the skill groups on the labor market are established by skill requirements. This means that a highly productive vacancy can only be filled by a highly skilled unemployed worker. Different skill distributions, which can also be interpreted as skill-biased technological change, are simulated by incrementing the skill level of highly skilled persons exogenously. This simulation also provides a microeconomic foundation of the matching function often used in theoretical approaches.

  20. Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation

    SciTech Connect

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

    2014-01-01

    Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has currently documented 29 failure scenarios. The strategy for the game was developed by analyzing five electric sector representative failure scenarios contained in the AMI functional domain. We characterize these five selected scenarios into three specific threat categories affecting confidentiality, integrity, and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA.

  1. Recent Advances in Agent-Based Tsunami Evacuation Simulations: Case Studies in Indonesia, Thailand, Japan and Peru

    NASA Astrophysics Data System (ADS)

    Mas, Erick; Koshimura, Shunichi; Imamura, Fumihiko; Suppasri, Anawat; Muhari, Abdul; Adriano, Bruno

    2015-12-01

    As confirmed by the extreme tsunami events over the last decade (the 2004 Indian Ocean, 2010 Chile and 2011 Japan tsunami events), mitigation measures and effective evacuation planning are needed to reduce disaster risks. Modeling tsunami evacuations is an alternative means to analyze evacuation plans and possible scenarios of evacuees' behaviors. In this paper, practical applications of an agent-based tsunami evacuation model are presented to demonstrate the contributions that agent-based modeling has added to tsunami evacuation simulations and tsunami mitigation efforts. A brief review of previous agent-based evacuation models in the literature is given to highlight recent progress in agent-based methods. Finally, challenges are noted for bridging gaps between geoscience and social science within the agent-based approach for modeling tsunami evacuations.

  2. Modeling the Information Age Combat Model: An Agent-Based Simulation of Network Centric Operations

    NASA Technical Reports Server (NTRS)

    Deller, Sean; Rabadi, Ghaith A.; Bell, Michael I.; Bowling, Shannon R.; Tolk, Andreas

    2010-01-01

    The Information Age Combat Model (IACM) was introduced by Cares in 2005 to contribute to the development of an understanding of the influence of connectivity on force effectiveness that can eventually lead to quantitative prediction and guidelines for design and employment. The structure of the IACM makes it clear that the Perron-Frobenius Eigenvalue is a quantifiable metric with which to measure the organization of a networked force. The results of recent experiments presented in Deller et al. (2009) indicate that the value of the Perron-Frobenius Eigenvalue is a significant measurement of the performance of an Information Age combat force. This was accomplished through the innovative use of an agent-based simulation to model the IACM and represents an initial contribution towards a new generation of combat models that are net-centric instead of using the current platform-centric approach. This paper describes the intent, challenges, design, and initial results of this agent-based simulation model.
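    The Perron-Frobenius eigenvalue referred to above is the largest non-negative eigenvalue, i.e. the spectral radius, of the adjacency matrix describing the networked force; a minimal sketch of computing it for a small hypothetical network (illustrative only, not the authors' code):

```python
import numpy as np

# adjacency matrix of a small, hypothetical sensor -> decider -> influencer -> target cycle
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)

eigenvalues = np.linalg.eigvals(A)
# for a non-negative matrix the Perron-Frobenius eigenvalue equals the spectral radius
spectral_radius = max(abs(z) for z in eigenvalues)
print("Perron-Frobenius eigenvalue:", round(spectral_radius, 3))
```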

  3. Multi-Agent-Based Simulation of a Complex Ecosystem of Mental Health Care.

    PubMed

    Kalton, Alan; Falconer, Erin; Docherty, John; Alevras, Dimitris; Brann, David; Johnson, Kyle

    2016-02-01

    This paper discusses the creation of an Agent-Based Simulation that modeled the introduction of care coordination capabilities into a complex system of care for patients with Serious and Persistent Mental Illness. The model describes the engagement between patients and the medical, social and criminal justice services they interact with in a complex ecosystem of care. We outline the challenges involved in developing the model, including process mapping and the collection and synthesis of data to support parametric estimates, and describe the controls built into the model to support analysis of potential changes to the system. We also describe the approach taken to calibrate the model to an observable level of system performance. Preliminary results from application of the simulation are provided to demonstrate how it can provide insights into potential improvements deriving from introduction of care coordination technology. PMID:26590977

  4. Promoting Conceptual Change for Complex Systems Understanding: Outcomes of an Agent-Based Participatory Simulation

    NASA Astrophysics Data System (ADS)

    Rates, Christopher A.; Mulvey, Bridget K.; Feldon, David F.

    2016-08-01

    Components of complex systems apply across multiple subject areas, and teaching these components may help students build unifying conceptual links. Students, however, often have difficulty learning these components, and limited research exists to understand what types of interventions may best help improve understanding. We investigated 32 high school students' understandings of complex systems components and whether an agent-based simulation could improve their understandings. Pretest and posttest essays were coded for changes in six components to determine whether students showed more expert thinking about the complex system of the Chesapeake Bay watershed. Results showed significant improvement for the components Emergence (r = .26, p = .03), Order (r = .37, p = .002), and Tradeoffs (r = .44, p = .001). Implications include that the experiential nature of the simulation has the potential to support conceptual change for some complex systems components, presenting a promising option for complex systems instruction.

  5. AN AGENT-BASED SIMULATION STUDY OF A COMPLEX ADAPTIVE COLLABORATION NETWORK

    SciTech Connect

    Ozmen, Ozgur; Smith, Jeffrey; Yilmaz, Levent

    2013-01-01

    One of the most significant problems in organizational scholarship is to discern how social collectives govern, organize, and coordinate the actions of individuals to achieve collective outcomes. The collectives are usually interpreted as complex adaptive systems (CAS). The understanding of CAS is more likely to arise with the help of computer-based simulations. In this tutorial, using agent-based modeling approach, a complex adaptive social communication network model is introduced. The objective is to present the underlying dynamics of the system in a form of computer simulation that enables analyzing the impacts of various mechanisms on network topologies and emergent behaviors. The ultimate goal is to further our understanding of the dynamics in the system and facilitate developing informed policies for decision-makers.

  6. Promoting Conceptual Change for Complex Systems Understanding: Outcomes of an Agent-Based Participatory Simulation

    NASA Astrophysics Data System (ADS)

    Rates, Christopher A.; Mulvey, Bridget K.; Feldon, David F.

    2016-03-01

    Components of complex systems apply across multiple subject areas, and teaching these components may help students build unifying conceptual links. Students, however, often have difficulty learning these components, and limited research exists to understand what types of interventions may best help improve understanding. We investigated 32 high school students' understandings of complex systems components and whether an agent-based simulation could improve their understandings. Pretest and posttest essays were coded for changes in six components to determine whether students showed more expert thinking about the complex system of the Chesapeake Bay watershed. Results showed significant improvement for the components Emergence (r = .26, p = .03), Order (r = .37, p = .002), and Tradeoffs (r = .44, p = .001). Implications include that the experiential nature of the simulation has the potential to support conceptual change for some complex systems components, presenting a promising option for complex systems instruction.

  7. Understanding the Dynamics of Violent Political Revolutions in an Agent-Based Framework

    PubMed Central

    Moro, Alessandro

    2016-01-01

    This paper develops an agent-based computational model of violent political revolutions in which a subjugated population of citizens and an armed revolutionary organisation attempt to overthrow a central authority and its loyal forces. The model replicates several patterns of rebellion consistent with major historical revolutions, and provides an explanation for the multiplicity of outcomes that can arise from an uprising. The relevance of the heterogeneity of scenarios predicted by the model can be understood by considering the recent experience of the Arab Spring involving several rebellions that arose in an apparently similar way, but resulted in completely different political outcomes: the successful revolution in Tunisia, the failed protests in Saudi Arabia and Bahrain, and civil war in Syria and Libya. PMID:27104855

  8. Understanding the Dynamics of Violent Political Revolutions in an Agent-Based Framework.

    PubMed

    Moro, Alessandro

    2016-01-01

    This paper develops an agent-based computational model of violent political revolutions in which a subjugated population of citizens and an armed revolutionary organisation attempt to overthrow a central authority and its loyal forces. The model replicates several patterns of rebellion consistent with major historical revolutions, and provides an explanation for the multiplicity of outcomes that can arise from an uprising. The relevance of the heterogeneity of scenarios predicted by the model can be understood by considering the recent experience of the Arab Spring involving several rebellions that arose in an apparently similar way, but resulted in completely different political outcomes: the successful revolution in Tunisia, the failed protests in Saudi Arabia and Bahrain, and civil war in Syria and Libya. PMID:27104855

  9. Changing crops in response to climate: virtual Nang Rong, Thailand in an agent based simulation

    PubMed Central

    Malanson, George P.; Verdery, Ashton M.; Walsh, Stephen J.; Sawangdee, Yothin; Heumann, Benjamin W.; McDaniel, Philip M.; Frizzelle, Brian G.; Williams, Nathalie E.; Yao, Xiaozheng; Entwisle, Barbara; Rindfuss, Ronald R.

    2014-01-01

    The effects of extended climatic variability on agricultural land use were explored for the type of system found in villages of northeastern Thailand. An agent-based model developed for the Nang Rong district was used to simulate land allotted to jasmine rice, heavy rice, cassava, and sugar cane. The land use choices in the model depended on likely economic outcomes, but included elements of bounded rationality through their dependence on household demography. The socioeconomic dynamics are endogenous in the system, and climate changes were added as exogenous drivers. Villages changed their agricultural effort in many different ways. Most villages reduced the amount of land under cultivation, primarily through a reduction in jasmine rice, but others did not. The variation in responses to climate change indicates potential sensitivity to initial conditions and path dependence for this type of system. The differences between our virtual villages and the real villages of the region indicate effects of bounded rationality and limits on model applications. PMID:25061240

  10. Method for distributed agent-based non-expert simulation of manufacturing process behavior

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2004-11-30

    A method for distributed agent based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each said process; and programming each said agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
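    The claimed method amounts to an event-driven message loop; the following rough, hypothetical sketch shows agents reacting to the three message types named in the claim (not the patented implementation):

```python
from collections import deque

class ProcessAgent:
    """Agent for one manufacturing process, driven purely by discrete-event messages."""
    def __init__(self, name):
        self.name, self.resources, self.output = name, 0, 0

    def handle(self, event):
        if event == "clock_tick" and self.resources > 0:
            self.resources -= 1                  # consume a resource for each tick of work
            self.output += 1
        elif event == "resources_received":
            self.resources += 5
        elif event == "request_output":
            print(f"{self.name}: produced {self.output} units")

agents = [ProcessAgent("casting"), ProcessAgent("machining")]
events = deque(["resources_received"] + ["clock_tick"] * 7 + ["request_output"])

while events:                                    # single-processor message loop
    event = events.popleft()
    for agent in agents:                         # every agent sees every discrete event
        agent.handle(event)
```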

  11. Prediction Markets and Beliefs about Climate: Results from Agent-Based Simulations

    NASA Astrophysics Data System (ADS)

    Gilligan, J. M.; John, N. J.; van der Linden, M.

    2015-12-01

    Climate scientists have long been frustrated by persistent doubts a large portion of the public expresses toward the scientific consensus about anthropogenic global warming. The political and ideological polarization of this doubt led Vandenbergh, Raimi, and Gilligan [1] to propose that prediction markets for climate change might influence the opinions of those who mistrust the scientific community but do trust the power of markets.We have developed an agent-based simulation of a climate prediction market in which traders buy and sell future contracts that will pay off at some future year with a value that depends on the global average temperature at that time. The traders form a heterogeneous population with different ideological positions, different beliefs about anthropogenic global warming, and different degrees of risk aversion. We also vary characteristics of the market, including the topology of social networks among the traders, the number of traders, and the completeness of the market. Traders adjust their beliefs about climate according to the gains and losses they and other traders in their social network experience. This model predicts that if global temperature is predominantly driven by greenhouse gas concentrations, prediction markets will cause traders' beliefs to converge toward correctly accepting anthropogenic warming as real. This convergence is largely independent of the structure of the market and the characteristics of the population of traders. However, it may take considerable time for beliefs to converge. Conversely, if temperature does not depend on greenhouse gases, the model predicts that traders' beliefs will not converge. We will discuss the policy-relevance of these results and more generally, the use of agent-based market simulations for policy analysis regarding climate change, seasonal agricultural weather forecasts, and other applications.[1] MP Vandenbergh, KT Raimi, & JM Gilligan. UCLA Law Rev. 61, 1962 (2014).

  12. An agent-based simulation model to study accountable care organizations.

    PubMed

    Liu, Pai; Wu, Shinyi

    2016-03-01

    Creating accountable care organizations (ACOs) has been widely discussed as a strategy to control rapidly rising healthcare costs and improve quality of care; however, building an effective ACO is a complex process involving multiple stakeholders (payers, providers, patients) with their own interests. Also, implementation of an ACO is costly in terms of time and money. Immature design could cause safety hazards. Therefore, there is a need for analytical model-based decision-support tools that can predict the outcomes of different strategies to facilitate ACO design and implementation. In this study, an agent-based simulation model was developed to study ACOs that considers payers, healthcare providers, and patients as agents under the shared saving payment model of care for congestive heart failure (CHF), one of the most expensive causes of sometimes preventable hospitalizations. The agent-based simulation model has identified the critical determinants for the payment model design that can motivate provider behavior changes to achieve maximum financial and quality outcomes of an ACO. The results show nonlinear provider behavior change patterns corresponding to changes in payment model designs. The outcomes vary by providers with different quality or financial priorities, and are most sensitive to the cost-effectiveness of CHF interventions that an ACO implements. This study demonstrates an increasingly important method to construct a healthcare system analytics model that can help inform health policy and healthcare management decisions. The study also points out that the likely success of an ACO is interdependent with payment model design, provider characteristics, and cost and effectiveness of healthcare interventions. PMID:24715674

  13. Design of a Mobile Agent-Based Adaptive Communication Middleware for Federations of Critical Infrastructure Simulations

    NASA Astrophysics Data System (ADS)

    Görbil, Gökçe; Gelenbe, Erol

    The simulation of critical infrastructures (CI) can involve the use of diverse domain specific simulators that run on geographically distant sites. These diverse simulators must then be coordinated to run concurrently in order to evaluate the performance of critical infrastructures which influence each other, especially in emergency or resource-critical situations. We therefore describe the design of an adaptive communication middleware that provides reliable and real-time one-to-one and group communications for federations of CI simulators over a wide-area network (WAN). The proposed middleware is composed of mobile agent-based peer-to-peer (P2P) overlays, called virtual networks (VNets), to enable resilient, adaptive and real-time communications over unreliable and dynamic physical networks (PNets). The autonomous software agents comprising the communication middleware monitor their performance and the underlying PNet, and dynamically adapt the P2P overlay and migrate over the PNet in order to optimize communications according to the requirements of the federation and the current conditions of the PNet. Reliable communications is provided via redundancy within the communication middleware and intelligent migration of agents over the PNet. The proposed middleware integrates security methods in order to protect the communication infrastructure against attacks and provide privacy and anonymity to the participants of the federation. Experiments with an initial version of the communication middleware over a real-life networking testbed show that promising improvements can be obtained for unicast and group communications via the agent migration capability of our middleware.

  14. Agent-based computer simulation and sirs: building a bridge between basic science and clinical trials.

    PubMed

    An, G

    2001-10-01

    The management of Systemic Inflammatory Response Syndrome (SIRS)/Multiple Organ Failure (MOF) remains the greatest challenge in the field of critical care. There has been uniform difficulty in translating the results of basic science research into effective therapeutic regimes. We propose that this is due in part to a failure to account for the complex, nonlinear nature of the inflammatory process of which SIRS/MOF represents a disordered state. Attempts to manipulate this process without an understanding of the dynamics of the system may potentially produce unintended consequences. Agent-Based Computer Simulation (ABCS) provides a means to synthesize the information acquired from the linear analysis of basic science into a model that preserves the complexity of the inflammatory system. We have constructed an abstracted version of the inflammatory process using an ABCS that is based at the cellular level. Despite its abstraction, the simulation produces non-linear behavior and reproduces the dynamic structure of the inflammatory response. Furthermore, adjustment of the simulation to model one of the unsuccessful initial anti-inflammatory trials of the 1990's demonstrates the adverse outcome that was observed in those clinical trials. It must be emphasized that the current model is extremely abstract and simplified. However, it is hoped that future ABCSs of sufficient sophistication eventually may provide an important bridging tool to translate basic science discoveries into clinical applications. Creating these simulations will require a large collaborative effort, and it is hoped that this paper will stimulate interest in this form of analysis. PMID:11580108

  15. Investigating the role of water in the Diffusion of Cholera using Agent-Based simulation

    NASA Astrophysics Data System (ADS)

    Augustijn, Ellen-Wien; Doldersum, Tom; Augustijn, Denie

    2014-05-01

    Traditionally, cholera was considered to be a waterborne disease. Currently we know that many other factors can contribute to the spread of this disease, including human mobility and human behavior. However, the hydrological component in cholera diffusion is significant. The interplay between cholera and water includes bacteria (V. cholerae) that survive in the aquatic environment, the possibility that run-off water from dumpsites carries the bacteria to surface water (rivers and lakes), and when the bacteria reach streams they can be carried downstream to infect new locations. Modelling is a very important tool to build theory on the interplay between different types of transmission mechanisms that together are responsible for the spread of cholera. Agent-based simulation models are very suitable for incorporating behavior at the individual level and for reproducing emergence. However, it is more difficult to incorporate the hydrological components in this type of model. In this research we present the hydrological component of an agent-based cholera model developed to study a cholera epidemic in Kumasi (Ghana) in 2005. The model was calibrated on the relative contribution of each community to the distributed pattern of cholera rather than the absolute number of incidences. Analysis of the results shows that water plays an important role in the diffusion of cholera: 75% of the cholera cases were infected via river water that was contaminated by runoff from the dumpsites. To initiate infections upstream, the probability of environment-to-human transmission seemed to be overestimated compared to what may be expected from the literature. Scenario analyses show that there is a strong relation between the epidemic curve and the rainfall. Removing dumpsites that are situated close to the river resulted in a strong decrease in the number of cholera cases. Results are sensitive to the scheduling of the daily activities and the survival time of the cholera bacteria.
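    The downstream-transport component described above can be pictured with a very small sketch: bacterial load at each river node decays, is advected to the next node downstream each step, and receives runoff from dumpsite nodes when it rains; the node layout and coefficients below are assumptions, not the calibrated Kumasi model:

```python
# toy one-dimensional river with 6 nodes ordered upstream -> downstream (illustrative values)
DECAY, RAIN_WASH = 0.3, 50.0          # per-step die-off fraction; load washed in per rainy step
dumpsite_nodes = {1, 4}               # river nodes adjacent to dumpsites
load = [0.0] * 6                      # bacterial load at each node

def step(load, raining):
    new_load = [0.0] * len(load)
    for i, amount in enumerate(load):
        survives = amount * (1 - DECAY)
        if i + 1 < len(load):
            new_load[i + 1] += survives        # advect surviving bacteria downstream
    if raining:
        for i in dumpsite_nodes:
            new_load[i] += RAIN_WASH           # runoff from dumpsites during rain
    return new_load

rain_series = [True, True, False, True, False, False, True, False]
for raining in rain_series:
    load = step(load, raining)
print([round(x, 1) for x in load])
```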

  16. Agent-based Modeling to Simulate the Diffusion of Water-Efficient Innovations and the Emergence of Urban Water Sustainability

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Giacomoni, M.; Shafiee, M. E.; Berglund, E.

    2014-12-01

    The sustainability of water resources is threatened by urbanization, as increasing demands deplete water availability, and changes to the landscape alter runoff and the flow regime of receiving water bodies. Utility managers typically manage urban water resources through the use of centralized solutions, such as large reservoirs, which may be limited in their ability balance the needs of urbanization and ecological systems. Decentralized technologies, on the other hand, may improve the health of the water resources system and deliver urban water services. For example, low impact development technologies, such as rainwater harvesting, and water-efficient technologies, such as low-flow faucets and toilets, may be adopted by households to retain rainwater and reduce demands, offsetting the need for new centralized infrastructure. Decentralized technologies may create new complexities in infrastructure and water management, as decentralization depends on community behavior and participation beyond traditional water resources planning. Messages about water shortages and water quality from peers and the water utility managers can influence the adoption of new technologies. As a result, feedbacks between consumers and water resources emerge, creating a complex system. This research develops a framework to simulate the diffusion of water-efficient innovations and the sustainability of urban water resources, by coupling models of households in a community, hydrologic models of a water resources system, and a cellular automata model of land use change. Agent-based models are developed to simulate the land use and water demand decisions of individual households, and behavioral rules are encoded to simulate communication with other agents and adoption of decentralized technologies, using a model of the diffusion of innovation. The framework is applied for an illustrative case study to simulate water resources sustainability over a long-term planning horizon.
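    The diffusion-of-innovation component described above is often implemented as a threshold rule over a household's social network; a minimal, hypothetical sketch (seed counts, network size, and threshold are assumptions) follows:

```python
import random

random.seed(2)
N_HOUSEHOLDS, NEIGHBORS, ADOPTION_THRESHOLD, SEED_ADOPTERS = 200, 6, 0.15, 5

adopted = [False] * N_HOUSEHOLDS
for i in random.sample(range(N_HOUSEHOLDS), SEED_ADOPTERS):
    adopted[i] = True                           # a few early adopters of rainwater harvesting
network = [random.sample(range(N_HOUSEHOLDS), NEIGHBORS) for _ in range(N_HOUSEHOLDS)]

history = []
for year in range(15):
    for i in range(N_HOUSEHOLDS):
        if not adopted[i]:
            peer_share = sum(adopted[j] for j in network[i]) / NEIGHBORS
            # a household adopts once enough of its contacts have adopted
            if peer_share >= ADOPTION_THRESHOLD:
                adopted[i] = True
    history.append(sum(adopted))

print("adopters per year:", history)
```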

  17. Simulation of avascular tumor growth by agent-based game model involving phenotype-phenotype interactions.

    PubMed

    Chen, Yong; Wang, Hengtong; Zhang, Jiangang; Chen, Ke; Li, Yumin

    2015-01-01

    All tumors, both benign and metastatic, undergo an avascular growth stage with nutrients supplied by the surrounding tissue. This avascular growth process is much easier to carry out in more qualitative and quantitative experiments starting from tumor spheroids in vitro with reliable reproducibility. Essentially, this tumor progression would be described as a sequence of phenotypes. Using agent-based simulation in a two-dimensional spatial lattice, we constructed a composite growth model in which the phenotypic behavior of tumor cells depends on not only the local nutrient concentration and cell count but also the game among cells. Our simulation results demonstrated that in silico tumors are qualitatively similar to those observed in tumor spheroid experiments. We also found that the payoffs in the game between two living cell phenotypes can influence the growth velocity and surface roughness of tumors at the same time. Finally, this current model is flexible and can be easily extended to discuss other situations, such as environmental heterogeneity and mutation. PMID:26648395

  18. Simulation of avascular tumor growth by agent-based game model involving phenotype-phenotype interactions

    PubMed Central

    Chen, Yong; Wang, Hengtong; Zhang, Jiangang; Chen, Ke; Li, Yumin

    2015-01-01

    All tumors, both benign and metastatic, undergo an avascular growth stage with nutrients supplied by the surrounding tissue. This avascular growth process is much easier to carry out in more qualitative and quantitative experiments starting from tumor spheroids in vitro with reliable reproducibility. Essentially, this tumor progression would be described as a sequence of phenotypes. Using agent-based simulation in a two-dimensional spatial lattice, we constructed a composite growth model in which the phenotypic behavior of tumor cells depends on not only the local nutrient concentration and cell count but also the game among cells. Our simulation results demonstrated that in silico tumors are qualitatively similar to those observed in tumor spheroid experiments. We also found that the payoffs in the game between two living cell phenotypes can influence the growth velocity and surface roughness of tumors at the same time. Finally, this current model is flexible and can be easily extended to discuss other situations, such as environmental heterogeneity and mutation. PMID:26648395

  19. Parallel Agent-Based Simulations on Clusters of GPUs and Multi-Core Processors

    SciTech Connect

    Aaby, Brandon G; Perumalla, Kalyan S; Seal, Sudip K

    2010-01-01

    An effective latency-hiding mechanism is presented in the parallelization of agent-based model simulations (ABMS) with millions of agents. The mechanism is designed to accommodate the hierarchical organization as well as heterogeneity of current state-of-the-art parallel computing platforms. We use it to explore the computation vs. communication trade-off continuum available with the deep computational and memory hierarchies of extant platforms and present a novel analytical model of the tradeoff. We describe our implementation and report preliminary performance results on two distinct parallel platforms suitable for ABMS: CUDA threads on multiple, networked graphical processing units (GPUs), and pthreads on multi-core processors. Message Passing Interface (MPI) is used for inter-GPU as well as inter-socket communication on a cluster of multiple GPUs and multi-core processors. Results indicate the benefits of our latency-hiding scheme, delivering as much as over 100-fold improvement in runtime for certain benchmark ABMS application scenarios with several million agents. This speed improvement is obtained on our system that is already two to three orders of magnitude faster on one GPU than an equivalent CPU-based execution in a popular simulator in Java. Thus, the overall execution of our current work is over four orders of magnitude faster when executed on multiple GPUs.

  20. [Research on multi-agent based modeling and simulation of hospital system].

    PubMed

    Zhao, Junping; Yang, Hongqiao; Guo, Huayuan; Li, Yi; Zhang, Zhenjiang; Li, Shuzhang

    2010-12-01

    In this paper, the theory of complex adaptive systems (CAS) and its modeling method are introduced. The complex characteristics of the hospital system are analyzed. Agile manufacturing and cell reconstruction technologies are used to reconstruct the hospital system. We then set forth research on the simulation of the hospital system based on multi-agent technology and the High Level Architecture (HLA). Finally, an HLA-based simulation framework for the hospital system is presented. PMID:21374992

  1. An agent-based simulation of extirpation of Ceratitis capitata applied to invasions in California.

    PubMed

    Manoukis, Nicholas C; Hoffman, Kevin

    2014-01-01

    We present an agent-based simulation (ABS) of Ceratitis capitata ("Medfly") developed for estimating the time to extirpation of this pest in areas where quarantines and eradication treatments were immediately imposed. We use the ABS, implemented in the program MED-FOES, to study seven different outbreaks that occurred in Southern California from 2008 to 2010. Results are compared with the length of intervention and quarantine imposed by the State, based on a linear developmental model (thermal unit accumulation, or "degree-day"). MED-FOES is a useful tool for invasive species managers as it incorporates more information from the known biology of the Medfly, and includes the important feature of being demographically explicit, providing significant improvements over simple degree-day calculations. While there was general agreement between the length of quarantine by degree-day and the time to extirpation indicated by MED-FOES, the ABS suggests that the margin of safety varies among cases and that in two cases the quarantine may have been excessively long. We also examined changes in the number of individuals over time in MED-FOES and conducted a sensitivity analysis for one of the outbreaks to explore the role of various input parameters on simulation outcomes. While our implementation of the ABS in this work is motivated by C. capitata and takes extirpation as a postulate, the simulation is very flexible and can be used to study a variety of questions on the invasion biology of pest insects and methods proposed to manage or eradicate such species. PMID:24563646
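    The degree-day baseline mentioned in this record can be stated compactly: each day contributes the amount by which the mean temperature exceeds the insect's lower developmental threshold, and the quarantine runs until the accumulated units reach a target. The sketch below uses made-up temperatures and thresholds, not the California program's parameters:

```python
# Illustrative degree-day accumulation (assumed threshold and target values).
LOWER_THRESHOLD_C = 12.0          # assumed lower developmental threshold for the insect
TARGET_DEGREE_DAYS = 650.0        # assumed thermal units required before lifting quarantine

def daily_degree_days(t_min, t_max, base=LOWER_THRESHOLD_C):
    """Simple average method: mean temperature above the base, never negative."""
    return max(0.0, (t_min + t_max) / 2.0 - base)

# toy daily (min, max) temperatures in degrees C
weather = [(14, 26), (15, 28), (13, 24), (12, 22), (16, 30)] * 30

accumulated, days = 0.0, 0
for t_min, t_max in weather:
    accumulated += daily_degree_days(t_min, t_max)
    days += 1
    if accumulated >= TARGET_DEGREE_DAYS:
        break

print(f"quarantine could be lifted after {days} days ({accumulated:.0f} degree-days)")
```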

  2. A spatial agent-based model for the simulation of adults' daily walking within a city.

    PubMed

    Yang, Yong; Diez Roux, Ana V; Auchincloss, Amy H; Rodriguez, Daniel A; Brown, Daniel G

    2011-03-01

    Environmental effects on walking behavior have received attention in recent years because of the potential for policy interventions to increase population levels of walking. Most epidemiologic studies describe associations of walking behavior with environmental features. These analyses ignore the dynamic processes that shape walking behaviors. A spatial agent-based model (ABM) was developed to simulate people's walking behaviors within a city. Each individual was assigned properties such as age, SES, walking ability, attitude toward walking and a home location. Individuals perform different activities on a regular basis such as traveling for work, for basic needs, and for leisure. Whether an individual walks and the amount she or he walks is a function of distance to different activities and her/his walking ability and attitude toward walking. An individual's attitude toward walking evolves over time as a function of past experiences, walking of others along the walking route, limits on distances walked per day, and attitudes toward walking of the other individuals within her/his social network. The model was calibrated and used to examine the contributions of land use and safety to socioeconomic differences in walking. With further refinement and validation, ABMs may help to better understand the determinants of walking and identify the most promising interventions to increase walking. PMID:21335269
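    The decision rule summarised in this record, walking as a function of distance, ability, and attitude, can be illustrated with a simple logistic choice sketch; the coefficients below are assumptions, not the calibrated values of the published ABM:

```python
import math, random

random.seed(3)

def walk_probability(distance_km, ability, attitude,
                     b0=1.0, b_dist=-1.2, b_abil=1.5, b_att=2.0):
    """Logistic probability of choosing to walk a given trip (illustrative coefficients)."""
    utility = b0 + b_dist * distance_km + b_abil * ability + b_att * attitude
    return 1.0 / (1.0 + math.exp(-utility))

# one simulated day for a small synthetic population
walk_trips = 0
for _ in range(1000):
    person = {"ability": random.random(), "attitude": random.random()}
    trip_distance = random.uniform(0.2, 5.0)          # km to the chosen activity
    if random.random() < walk_probability(trip_distance, person["ability"], person["attitude"]):
        walk_trips += 1

print(f"{walk_trips / 10:.1f}% of trips walked")
```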

  3. A Spatial Agent-Based Model for the Simulation of Adults’ Daily Walking Within a City

    PubMed Central

    Yang, Yong; Roux, Ana V. Diez; Auchincloss, Amy H.; Rodriguez, Daniel A.; Brown, Daniel G.

    2012-01-01

    Environmental effects on walking behavior have received attention in recent years because of the potential for policy interventions to increase population levels of walking. Most epidemiologic studies describe associations of walking behavior with environmental features. These analyses ignore the dynamic processes that shape walking behaviors. A spatial agent-based model (ABM) was developed to simulate people's walking behaviors within a city. Each individual was assigned properties such as age, SES, walking ability, attitude toward walking and a home location. Individuals perform different activities on a regular basis such as traveling for work, for shopping, and for recreation. Whether an individual walks and the amount she or he walks is a function of distance to different activities and her or his walking ability and attitude toward walking. An individual's attitude toward walking evolves over time as a function of past experiences, walking of others along the walking route, limits on distances walked per day, and attitudes toward walking of the other individuals within her/his social network. The model was calibrated and used to examine the contributions of land use and safety to socioeconomic differences in walking. With further refinement and validation, ABMs may help to better understand the determinants of walking and identify the most promising interventions to increase walking. PMID:21335269

  4. Biophysically Realistic Filament Bending Dynamics in Agent-Based Biological Simulation

    PubMed Central

    Alberts, Jonathan B.

    2009-01-01

    An appealing tool for studying the complex biological behaviors that can emerge from networks of simple molecular interactions is an agent-based, computational simulation that explicitly tracks small-scale local interactions, following thousands to millions of states through time. For many critical cell processes (e.g. cytokinetic furrow specification, nuclear centration, cytokinesis), the flexible nature of cytoskeletal filaments is likely to be critical. Any computer model that hopes to explain the complex emergent behaviors in these processes therefore needs to encode filament flexibility in a realistic manner. Here I present a numerically convenient and biophysically realistic method for modeling cytoskeletal filament flexibility in silico. Each cytoskeletal filament is represented by a series of rigid segments linked end-to-end, with a variable attachment point for the translational elastic element. This connection scheme allows an empirical tuning, for a wide range of segment sizes, viscosities, and time-steps, that endows any filament species with the experimentally observed (or theoretically expected) static force deflection, relaxation time-constant, and thermal writhing motions. I additionally employ a unique pair of elastic elements (one representing the axial and the other the bending rigidity) that formulates the restoring force in terms of single time-step constraint resolution. This method is highly local (adjacent rigid segments of a filament only interact with one another through constraint forces) and is thus well-suited to simulations in which arbitrary additional forces (e.g. those representing interactions of a filament with other bodies or cross-links/entanglements between filaments) may be present. Implementation in code is straightforward; Java source code is available at www.celldynamics.org. PMID:19283085
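
    As a rough illustration of representing a filament as rigid segments with a bending elastic element, the sketch below computes a linear restoring torque from the angle between adjacent segments of a 2-D chain. It is not the paper's constraint-resolution scheme (which resolves axial and bending forces within a single time step); k_bend and the coordinates are placeholder values.

```python
import math

def bend_torques(joints, k_bend=1.0):
    """Restoring torque at each interior joint of a 2-D chain of rigid
    segments given as joint coordinates [(x, y), ...]. The torque is
    proportional to the angle between adjacent segments (a linear
    bending spring); k_bend is an illustrative stiffness."""
    torques = []
    for i in range(1, len(joints) - 1):
        ax, ay = joints[i][0] - joints[i - 1][0], joints[i][1] - joints[i - 1][1]
        bx, by = joints[i + 1][0] - joints[i][0], joints[i + 1][1] - joints[i][1]
        # Signed angle between consecutive segment vectors
        angle = math.atan2(ax * by - ay * bx, ax * bx + ay * by)
        torques.append(-k_bend * angle)  # drives the filament back toward straight
    return torques

print(bend_torques([(0, 0), (1, 0), (2, 0.2), (3, 0.1)]))
```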

  5. An Agent-based Simulation Model for C. difficile Infection Control

    PubMed Central

    Codella, James; Safdar, Nasia; Heffernan, Rick; Alagoz, Oguzhan

    2014-01-01

    Background. Control of C. difficile infection (CDI) is an increasingly difficult problem for healthcare institutions. There are commonly recommended strategies to combat CDI transmission such as oral vancomycin for CDI treatment, increased hand hygiene with soap and water for healthcare workers, daily environmental disinfection of infected patient rooms, and contact isolation of diseased patients. However, the efficacy of these strategies, particularly for endemic CDI, has not been well studied. The objective of this research is to develop a valid agent-based simulation model (ABM) to study C. difficile transmission and control in a mid-sized hospital. Methods. We develop an ABM of a mid-sized hospital with agents such as patients, healthcare workers, and visitors. We model the natural progression of CDI in a patient using a Markov chain and the transmission of CDI through agent and environmental interactions. We derive input parameters from aggregate patient data from the 2007-2010 Wisconsin Hospital Association and published medical literature. We define a calibration process, which we use to estimate transition probabilities of the Markov model by comparing simulation results to benchmark values found in published literature. Results. Comparing CDI control strategies implemented individually, routine bleach disinfection of CDI+ patient rooms provides the largest reduction in nosocomial asymptomatic colonizations (21.8%) and nosocomial CDIs (42.8%). Additionally, vancomycin treatment provides the largest reduction in relapse CDIs (41.9%), CDI-related mortalities (68.5%), and total patient LOS (21.6%). Conclusion. We develop a generalized ABM for CDI control that can be customized and further expanded to specific institutions and/or scenarios. Additionally, we estimate transition probabilities for a Markov model of natural CDI progression in a patient through calibration. PMID:25112595
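
    A hedged sketch of the kind of Markov-chain progression the abstract describes is shown below. The states and daily transition probabilities are invented placeholders; in the study, the transition probabilities are estimated through calibration against published benchmark values.

```python
import random

# Hypothetical daily transition probabilities; the study estimates its
# own values by calibration against published benchmarks.
TRANSITIONS = {
    "susceptible": {"susceptible": 0.97, "colonized": 0.03},
    "colonized":   {"colonized": 0.90, "diseased": 0.05, "susceptible": 0.05},
    "diseased":    {"diseased": 0.80, "recovered": 0.18, "dead": 0.02},
    "recovered":   {"recovered": 0.95, "diseased": 0.05},  # relapse
    "dead":        {"dead": 1.0},
}

def step(state):
    """Sample the next state from the transition row of the current state."""
    r, cum = random.random(), 0.0
    for nxt, p in TRANSITIONS[state].items():
        cum += p
        if r < cum:
            return nxt
    return state

state, history = "colonized", []
for day in range(30):
    state = step(state)
    history.append(state)
print(history)
```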

  6. Agent-based evacuation simulation for spatial allocation assessment of urban shelters

    NASA Astrophysics Data System (ADS)

    Yu, Jia; Wen, Jiahong; Jiang, Yong

    2015-12-01

    The construction of urban shelters is one of the most important tasks in urban planning and disaster prevention. Spatial allocation assessment is a fundamental pre-step for the spatial location-allocation of urban shelters. This paper introduces a new method which makes use of agent-based technology to implement evacuation simulation so as to conduct dynamic spatial allocation assessment of urban shelters. The method can not only accomplish traditional geospatial evaluation of urban shelters, but also simulate the evacuation process of residents to shelters. The advantage of utilizing this method lies in three aspects: (1) the evacuation time of each citizen from a residential building to the shelter can be estimated more reasonably; (2) the total evacuation time of all the residents in a region can be obtained; (3) road congestion during evacuation to shelters can be detected so that precautionary measures can be taken to prevent potential risks. In this study, three types of agents are designed: shelter agents, government agents and resident agents. Shelter agents select specified land uses as shelter candidates for different disasters. Government agents delimit the service area of each shelter, in other words, regulate which shelter a person should use, in accordance with administrative boundaries and the road distance between the person's position and the location of the shelter. Resident agents have a series of attributes, such as age, position, walking speed, and so on. They also have several behaviors, such as reducing speed when walking in a crowd, helping old people and children, and so on. By integrating these three interrelated types of agents, evacuation procedures can be simulated and a dynamic allocation assessment of shelters achieved. A case study in Jing'an District, Shanghai, China, was conducted to demonstrate the feasibility of the method. A scenario of earthquake disaster which occurs in nighttime

  7. Age-correlated stress resistance improves fitness of yeast: support from agent-based simulations

    PubMed Central

    2014-01-01

    Background Resistance to stress is often heterogeneous among individuals within a population, which helps protect against intermittent stress (bet hedging). This is also the case for heat shock resistance in the budding yeast Saccharomyces cerevisiae. Interestingly, the resistance appears to be continuously distributed (vs. binary, switch-like) and correlated with replicative age (vs. random). Older, slower-growing cells are more resistant than younger, faster-growing ones. Is there a fitness benefit to age-correlated stress resistance? Results Here this hypothesis is explored using a simple agent-based model, which simulates a population of individual cells that grow and replicate. Cells age by accumulating damage, which lowers their growth rate. They synthesize trehalose at a metabolic cost, which helps protect against heat shock. Proteins Tsl1 and Tps3 (trehalose synthase complex regulatory subunit TSL1 and TPS3) represent the trehalose synthesis complex and they are expressed using constant, age-dependent and stochastic terms. The model was constrained by calibration and comparison to data from the literature, including individual-based observations obtained using high-throughput microscopy and flow cytometry. A heterogeneity network was developed, which highlights the predominant sources and pathways of resistance heterogeneity. To determine the best trehalose synthesis strategy, model strains with different Tsl1/Tps3 expression parameters were placed in competition in an environment with intermittent heat shocks. Conclusions For high severities and low frequencies of heat shock, the winning strain used an age-dependent bet hedging strategy, which shows that there can be a benefit to age-correlated stress resistance. The study also illustrates the utility of combining individual-based observations and modeling to understand mechanisms underlying population heterogeneity, and the effect on fitness. PMID:24529069
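
    A toy version of the cell agent described above (damage accumulation, trehalose synthesis at a metabolic cost, age-dependent expression) might look like the sketch below. All constants and functional forms are illustrative assumptions, not the calibrated parameters of the published model.

```python
import random

class Cell:
    """Toy cell agent: damage accumulates with divisions and slows growth;
    trehalose raises heat-shock survival at a growth cost. All constants
    are illustrative, not calibrated values from the study."""

    def __init__(self, age_dependent=True):
        self.age = 0          # replicative age (number of divisions)
        self.damage = 0.0
        self.age_dependent = age_dependent

    def trehalose(self):
        # Constant baseline plus an optional age-dependent term
        return 0.1 + (0.05 * self.age if self.age_dependent else 0.0)

    def growth_rate(self):
        # Growth slows with damage and with the metabolic cost of trehalose
        return max(0.0, 1.0 - self.damage) - 0.5 * self.trehalose()

    def divide(self):
        self.age += 1
        self.damage += random.uniform(0.0, 0.05)

    def survives_heat_shock(self, severity=0.5):
        return random.random() < min(1.0, self.trehalose() / severity)

cell = Cell()
for _ in range(5):
    cell.divide()
print(round(cell.growth_rate(), 3), cell.survives_heat_shock())
```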

  8. An operational epidemiological model for calibrating agent-based simulations of pandemic influenza outbreaks.

    PubMed

    Prieto, D; Das, T K

    2016-03-01

    Uncertainty about pandemic influenza viruses continues to cause major preparedness challenges for public health policymakers. Decisions to mitigate influenza outbreaks often involve a tradeoff between the social costs of interventions (e.g., school closure) and the cost of uncontrolled spread of the virus. To achieve a balance, policymakers must assess the impact of mitigation strategies once an outbreak begins and the virus characteristics are known. Agent-based (AB) simulation is a useful tool for building highly granular disease spread models incorporating the epidemiological features of the virus as well as the demographic and social behavioral attributes of tens of millions of affected people. Such disease spread models provide an excellent basis on which various mitigation strategies can be tested before they are adopted and implemented by policymakers. However, to serve as a testbed for mitigation strategies, the AB simulation models must be operational. A critical requirement for operational AB models is that they are amenable to quick and simple calibration. The calibration process works as follows: the AB model accepts information available from the field and uses it to update its parameters such that some of its outputs in turn replicate the field data. In this paper, we present our epidemiological model based calibration methodology, which has a low computational complexity and is easy to interpret. Our model accepts a field estimate of the basic reproduction number, and then uses it to update (calibrate) the infection probabilities in such a way that their effect, combined with the effects of the given virus epidemiology, demographics, and social behavior, results in an infection pattern yielding a similar value of the basic reproduction number. We evaluate the accuracy of the calibration methodology by applying it to an AB simulation model mimicking a regional outbreak in the US. The calibrated model is shown to yield infection patterns closely replicating
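
    The calibration idea, updating an infection probability until the simulated basic reproduction number matches a field estimate, can be sketched as a simple monotone search. The function run_simulation below is a hypothetical stand-in for the agent-based model; the bisection loop is an illustration, not the authors' exact procedure.

```python
def calibrate_infection_probability(run_simulation, r0_target,
                                    lo=0.0, hi=1.0, tol=0.05, max_iter=20):
    """Bisection on a single infection probability so that the simulated
    basic reproduction number matches a field estimate. run_simulation(p)
    is a hypothetical callable returning the R0 produced by the agent-based
    model for infection probability p, assumed monotone in p."""
    for _ in range(max_iter):
        mid = (lo + hi) / 2.0
        r0_sim = run_simulation(mid)
        if abs(r0_sim - r0_target) <= tol:
            return mid
        if r0_sim < r0_target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Usage with a toy surrogate standing in for the real agent-based model
print(calibrate_infection_probability(lambda p: 4.0 * p, r0_target=1.8))
```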

  9. Understanding coupled natural and human systems on fire prone landscapes: integrating wildfire simulation into an agent based planning system.

    NASA Astrophysics Data System (ADS)

    Barros, Ana; Ager, Alan; Preisler, Haiganoush; Day, Michelle; Spies, Tom; Bolte, John

    2015-04-01

    Agent-based models (ABM) allow users to examine the long-term effects of agent decisions in complex systems where multiple agents and processes interact. This framework has potential application to study the dynamics of coupled natural and human systems where multiple stimuli determine trajectories over both space and time. We used Envision, a landscape-based ABM, to analyze long-term wildfire dynamics in a heterogeneous, multi-owner landscape in Oregon, USA. Landscape dynamics are affected by land management policies, actors' decisions, and autonomous processes such as vegetation succession, wildfire, or, at a broader scale, climate change. Key questions include: (1) How are landscape dynamics influenced by policies and institutions? and (2) How do land management policies and actor decisions interact to produce intended and unintended consequences with respect to wildfire on fire-prone landscapes? Applying Envision to address these questions required the development of a wildfire module that could accurately simulate wildfires on the heterogeneous landscapes within the study area, in terms of replicating historical fire size distribution, spatial distribution and fire intensity. In this paper we describe the development and testing of a mechanistic fire simulation system within Envision and the application of the model on a 3.2 million fire-prone landscape in central Oregon, USA. The core fire spread equations use the Minimum Travel Time (MTT) algorithm developed by M. Finney. The model operates on a daily time step and uses a fire prediction system based on the relationship between energy release component and historical fires. Specifically, daily wildfire probabilities and sizes are generated from statistical analyses of historical fires in relation to daily ERC values. The MTT was coupled with the vegetation dynamics module in Envision to allow communication between the respective subsystems and effectively model fire effects and vegetation dynamics after a wildfire. Canopy and
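
    One plausible way to implement a daily fire-occurrence probability driven by the energy release component (ERC) is a logistic relationship fitted to historical fires; the sketch below uses invented coefficients purely for illustration and is not the statistical model fitted in the study.

```python
import math
import random

def daily_fire_probability(erc, b0=-8.0, b1=0.09):
    """Logistic link between the energy release component (ERC) and the
    probability of a fire day; b0 and b1 are invented coefficients, not
    values fitted to the historical record used in the study."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * erc)))

def simulate_season(daily_erc):
    """Return the indices of days on which a fire is ignited."""
    return [day for day, erc in enumerate(daily_erc)
            if random.random() < daily_fire_probability(erc)]

print(simulate_season([40, 55, 70, 85, 90, 75, 60]))
```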

  10. Modelling Temporal Schedule of Urban Trains Using Agent-Based Simulation and NSGA2-BASED Multiobjective Optimization Approaches

    NASA Astrophysics Data System (ADS)

    Sahelgozin, M.; Alimohammadi, A.

    2015-12-01

    Increasing distances between locations of residence and services lead to a large number of daily commutes in urban areas. Developing subway systems has drawn the attention of transportation managers as a response to this huge travel demand. In the development of subway infrastructure, producing a temporal schedule for trains is an important task, because an appropriately designed timetable decreases total passenger travel time, total operation cost and the energy consumption of trains. Since these variables are not positively correlated, subway scheduling is considered a multi-criteria optimization problem, and proposing a proper solution for subway scheduling has therefore always been a controversial issue. On the other hand, research on a phenomenon requires a summarized representation of the real world, known as a model. In this study, we attempt to model the temporal schedule of urban trains for use in Multi-Criteria Subway Schedule Optimization (MCSSO) problems. First, a conceptual framework is presented for MCSSO. Then, an agent-based simulation environment is implemented to perform a Sensitivity Analysis (SA) that is used to extract the interrelations between the framework components. These interrelations are then taken into account in constructing the proposed model. In order to evaluate the performance of the model in MCSSO problems, Tehran subway line no. 1 is considered as the case study. Results of the study show that the model was able to generate an acceptable distribution of Pareto-optimal solutions that are applicable in real situations when solving an MCSSO problem is the goal. The accuracy of the model in representing the operation of subway systems was also significant.

  11. An integrated modeling framework of socio-economic, biophysical, and hydrological processes in Midwest landscapes: Remote sensing data, agro-hydrological model, and agent-based model

    NASA Astrophysics Data System (ADS)

    Ding, Deng

    Intensive human-environment interactions are taking place in Midwestern agricultural systems. An integrated modeling framework is suitable for predicting the dynamics of key variables of the socio-economic, biophysical, and hydrological processes, as well as for exploring potential transitions of system states in response to changes in the driving factors. The purpose of this dissertation is to address issues concerning the interacting processes and consequent changes in land use, water balance, and water quality using an integrated modeling framework. This dissertation is composed of three studies in the same agricultural watershed, the Clear Creek watershed in East-Central Iowa. In the first study, a parsimonious hydrologic model, the Threshold-Exceedance-Lagrangian Model (TELM), is further developed into RS-TELM (Remote Sensing TELM) to integrate remote sensing vegetation data for estimating evapotranspiration. The goodness of fit of RS-TELM is comparable to a well-calibrated SWAT (Soil and Water Assessment Tool) and even slightly superior in capturing the intra-seasonal variability of stream flow. The integration of RS LAI (Leaf Area Index) data improves the model's performance, especially over agriculture-dominated landscapes. The input of rainfall datasets with spatially explicit information plays a critical role in increasing the model's goodness of fit. In the second study, an agent-based model is developed to simulate farmers' decisions on crop type and fertilizer application in response to commodity and biofuel crop prices. The comparison of simulated crop land percentages and crop rotations with satellite-based land cover data suggests that farmers may be underestimating the effects that continuous corn production has on yields (yield drag). The simulation results under alternative market scenarios, based on a survey of agricultural land owners and operators in the Clear Creek Watershed, show that farmers see cellulosic biofuel feedstock production in the form

  12. Proposal of Classification Method of Time Series Data in International Emissions Trading Market Using Agent-based Simulation

    NASA Astrophysics Data System (ADS)

    Nakada, Tomohiro; Takadama, Keiki; Watanabe, Shigeyoshi

    This paper proposes a classification method based on Bayesian analysis for time series data from an agent-based simulation of the international emissions trading market, and compares it with a Discrete Fourier Transform analytical method. The purpose is to demonstrate analytical methods for mapping time series data such as market prices. These analytical methods revealed the following results: (1) the classification methods map the time series data to a distance measure that is easier to understand and reason about than the raw time series; (2) these methods can analyze uncertain time series data, including stationary and non-stationary processes, using the distances obtained via agent-based simulation; and (3) the Bayesian analytical method can distinguish differences as small as 1% in the agents' emission reduction targets.
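
    For readers unfamiliar with the comparison method, the sketch below shows one common way to map a price time series to a compact feature vector with the Discrete Fourier Transform and to measure distances between series. It is an illustration of the general technique, not the authors' implementation.

```python
import numpy as np

def dft_features(series, n_coeffs=5):
    """Magnitudes of the lowest-frequency DFT coefficients as a compact
    descriptor of a price time series."""
    spectrum = np.fft.rfft(np.asarray(series, dtype=float))
    return np.abs(spectrum[:n_coeffs])

def series_distance(a, b, n_coeffs=5):
    """Euclidean distance between two series in DFT-feature space."""
    return float(np.linalg.norm(dft_features(a, n_coeffs) - dft_features(b, n_coeffs)))

prices_a = [10, 11, 12, 11, 10, 9, 10, 11]
prices_b = [10, 10, 11, 13, 14, 13, 11, 10]
print(series_distance(prices_a, prices_b))
```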

  13. Simulating Land-Use Change using an Agent-Based Land Transaction Model

    NASA Astrophysics Data System (ADS)

    Bakker, M. M.; van Dijk, J.; Alam, S. J.

    2013-12-01

    In the densely populated cultural landscapes of Europe, the vast majority of all land is owned by private parties, be they farmers (the majority), nature organizations, property developers, or citizens. As a result, the vast majority of all land-use change arises from land transactions between different owner types: successful farms expand at the expense of less successful farms, while property developers, individual citizens, and nature organizations also actively purchase land. These land transactions are driven by specific properties of the land, by governmental policies, and by the (economic) motives of both buyers and sellers. Climate/global change can affect these drivers at various scales: at the local scale, changes in hydrology can make certain land less or more desirable; at the global scale, agricultural markets will affect farmers' motives to buy or sell land; while at intermediate (e.g. provincial) scales, property developers and nature conservationists may be encouraged or discouraged from purchasing land. The cumulative result of all these transactions becomes manifest in changing land-use patterns and consequent environmental responses. Within the project Climate Adaptation for Rural Areas, an agent-based land-use model was developed that explores the future response of individual land users to climate change within the context of wider global change (i.e. policy and market change). It simulates the exchange of land among farmers and between farmers and nature organizations and property developers, for a specific case study area in the east of the Netherlands. Results show that local impacts of climate change can result in a relative stagnation of the land market in waterlogged areas. Furthermore, the increase in dairying at the expense of arable cultivation, as has been observed in the area in the past, is slowing down as arable produce shows a favourable trend in the world agricultural market. Furthermore, budgets for nature managers are

  14. Impact of Different Policies on Unhealthy Dietary Behaviors in an Urban Adult Population: An Agent-Based Simulation Model

    PubMed Central

    Giabbanelli, Philippe J.; Arah, Onyebuchi A.; Zimmerman, Frederick J.

    2014-01-01

    Objectives. Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. Methods. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Results. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Conclusions. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems. PMID:24832414

  15. Multi-Agent Based Simulation of Optimal Urban Land Use Allocation in the Middle Reaches of the Yangtze River, China

    NASA Astrophysics Data System (ADS)

    Zeng, Y.; Huang, W.; Jin, W.; Li, S.

    2016-06-01

    The optimization of land-use allocation is one of the important approaches to achieving regional sustainable development. This study selects the Chang-Zhu-Tan agglomeration as the study area and proposes a new land-use optimization allocation model. Using a multi-agent-based simulation model, future urban land-use allocation was simulated for 2020 and 2030 under three different scenarios. This kind of quantitative information about urban land-use allocation and future urban expansion would be of great interest to urban planning, water and land resource management, and climate change research.

  16. Evaluating environmental strategies in a textile printing and dyeing enterprise by an agent-based simulation model

    NASA Astrophysics Data System (ADS)

    Gao, Lei; Ding, Yongsheng; Li, Fang

    2013-05-01

    To improve the energy-saving and pollutant-emission-reduction capabilities of the textile printing and dyeing (PD) industry, this article presents a novel agent-based simulation model for assessing the impacts of environmental strategies on a PD enterprise. Two typical PD enterprises in China are simulated at different modelling granularities: one at the module level, the other at the enterprise level. The module-level simulation model depicts detailed production processes in a PD enterprise and evaluates five candidate strategies on their capabilities of improving energy usage and waste emission. The enterprise-level simulation model views a PD enterprise as an agent and assesses three tax strategies for waste discharge. The simulation results show that the proposed general model, once calibrated to a real case, could be a valuable tool for exploring potential solutions for saving energy and reducing waste emission in PD enterprises.

  17. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department.

    PubMed

    Kittipittayakorn, Cholada; Ying, Kuo-Ching

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department. PMID:27195606

  18. Applying GIS and high performance agent-based simulation for managing an Old World Screwworm fly invasion of Australia.

    PubMed

    Welch, M C; Kwan, P W; Sajeev, A S M

    2014-10-01

    Agent-based modelling has proven to be a promising approach for developing rich simulations of complex phenomena that provide decision support functions across a broad range of areas including the biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national-scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and the dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries. PMID:24705073

  19. Linking Bayesian and Agent-Based Models to Simulate Complex Social-Ecological Systems in the Sonoran Desert

    NASA Astrophysics Data System (ADS)

    Pope, A.; Gimblett, R.

    2013-12-01

    Interdependencies of ecologic, hydrologic, and social systems challenge traditional approaches to natural resource management in semi-arid regions. As a complex social-ecological system, water demands in the Sonoran Desert from agricultural and urban users often conflict with water needs for its ecologically significant riparian corridors. To explore this system, we developed an agent-based model to simulate complex feedbacks between human decisions and environmental conditions. Cognitive mapping in conjunction with stakeholder participation produced a Bayesian model of conditional probabilities for local human decision-making processes that result in changes in water demand. Probabilities created in the Bayesian model were incorporated into the agent-based model, so that each agent had a unique probability of making a positive decision based on its perceived environment at each point in time and space. By using a Bayesian approach, uncertainty in the human decision-making process could be incorporated. The spatially explicit agent-based model simulated changes in depth-to-groundwater caused by well pumping based on each agent's water demand. Depth-to-groundwater was then used as an indicator of unique vegetation guilds within the riparian corridor. Each vegetation guild provides varying levels of ecosystem services, changes in which, along with changes in depth-to-groundwater, feed back to influence agent behavior. Using this modeling approach allowed us to examine the resilience of semi-arid riparian corridors and agent behavior under various scenarios. The insight provided by the model contributes to understanding how specific interventions may alter the complex social-ecological system in the future.
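
    The coupling described above, agents drawing their decision probabilities from a stakeholder-built Bayesian model, can be sketched with a simple conditional probability table. The states and probabilities below are invented placeholders, not values from the study.

```python
import random

# Hypothetical conditional probability table (CPT) for a positive
# water-demand decision given the perceived environment; in the study
# these probabilities come from a stakeholder-built Bayesian model.
CPT = {
    ("shallow_groundwater", "wet_year"): 0.15,
    ("shallow_groundwater", "dry_year"): 0.40,
    ("deep_groundwater", "wet_year"): 0.55,
    ("deep_groundwater", "dry_year"): 0.85,
}

class WaterUser:
    def decide_to_pump_more(self, depth_class, climate_class):
        # Each agent samples its decision, carrying the CPT's uncertainty
        return random.random() < CPT[(depth_class, climate_class)]

agent = WaterUser()
print(agent.decide_to_pump_more("deep_groundwater", "dry_year"))
```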

  20. Can human-like Bots control collective mood: agent-based simulations of online chats

    NASA Astrophysics Data System (ADS)

    Tadić, Bosiljka; Šuvakov, Milovan

    2013-10-01

    Using an agent-based modeling approach, in this paper, we study self-organized dynamics of interacting agents in the presence of chat Bots. Different Bots with tunable ‘human-like’ attributes, which exchange emotional messages with agents, are considered, and the collective emotional behavior of agents is quantitatively analyzed. In particular, using detrended fractal analysis we determine persistent fluctuations and temporal correlations in time series of agent activity and statistics of avalanches carrying emotional messages of agents when Bots favoring positive/negative affects are active. We determine the impact of Bots and identify parameters that can modulate that impact. Our analysis suggests that, by these measures, the emotional Bots induce collective emotion among interacting agents by suitably altering the fractal characteristics of the underlying stochastic process. Positive emotion Bots are slightly more effective than negative emotion Bots. Moreover, Bots which periodically alternate between positive and negative emotion can enhance fluctuations in the system, leading to avalanches of agent messages that are reminiscent of self-organized critical states.
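
    The detrended fluctuation analysis used to characterize the agent activity time series can be illustrated with a textbook implementation; the scales chosen below are arbitrary and the code is not the authors' analysis pipeline.

```python
import numpy as np

def dfa_exponent(series, scales=(4, 8, 16, 32, 64)):
    """Plain detrended fluctuation analysis: returns the scaling exponent
    alpha of the fluctuation function F(n) ~ n**alpha."""
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())
    fluctuations = []
    for n in scales:
        n_windows = len(profile) // n
        sq_residuals = []
        for w in range(n_windows):
            seg = profile[w * n:(w + 1) * n]
            t = np.arange(n)
            coeffs = np.polyfit(t, seg, 1)  # linear detrending per window
            sq_residuals.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(sq_residuals)))
    return np.polyfit(np.log(scales), np.log(fluctuations), 1)[0]

print(dfa_exponent(np.random.randn(2048)))  # close to 0.5 for white noise
```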

  1. Evaluating the effect of human activity patterns on air pollution exposure using an integrated field-based and agent-based modelling framework

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; Beelen, Rob M. J.; de Bakker, Merijn P.; Karssenberg, Derek

    2015-04-01

    Constructing spatio-temporal numerical models to support risk assessment, such as assessing the exposure of humans to air pollution, often requires the integration of field-based and agent-based modelling approaches. Continuous environmental variables such as air pollution are best represented using the field-based approach, which considers phenomena as continuous fields having attribute values at all locations. When calculating human exposure to such pollutants it is, however, preferable to consider the population as a set of individuals, each with a particular activity pattern. This allows one to account for the spatio-temporal variation in a pollutant along the space-time paths travelled by individuals, determined, for example, by home and work locations, the road network, and travel times. Modelling this activity pattern requires an agent-based or individual-based modelling approach. In general, field- and agent-based models are constructed with separate software tools, although the two approaches should interact and preferably be combined into one modelling framework, which would allow efficient and effective implementation of models by domain specialists. To overcome this lack of integrated modelling frameworks, we aim at the development of concepts and software for an integrated field-based and agent-based modelling framework. Concepts merging field- and agent-based modelling were implemented by extending PCRaster (http://www.pcraster.eu), a field-based modelling library implemented in C++, with components for 1) representation of discrete, mobile agents, 2) spatial networks and algorithms, by integrating the NetworkX library (http://networkx.github.io), therefore allowing, for example, shortest routes or total transport costs between locations to be calculated, and 3) functions for field-network interactions, allowing field-based attribute values to be assigned to networks (i.e. as edge weights), such as aggregated or averaged
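
    As a small illustration of the field-network interaction the authors describe, the sketch below builds a toy road network with NetworkX, uses edge weights standing in for field-derived attributes (e.g. pollutant exposure aggregated along each segment), and finds the route minimising accumulated exposure. Node names and weights are hypothetical.

```python
import networkx as nx

# Toy road network; edge weights stand in for field-derived attributes
# (e.g. pollutant concentration aggregated along each road segment).
G = nx.Graph()
G.add_weighted_edges_from([
    ("home", "a", 3.0), ("a", "work", 2.0),
    ("home", "b", 1.0), ("b", "work", 5.0),
])

# Route that minimises accumulated exposure rather than distance
route = nx.shortest_path(G, "home", "work", weight="weight")
exposure = nx.shortest_path_length(G, "home", "work", weight="weight")
print(route, exposure)
```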

  2. Real-Time Agent-Based Modeling Simulation with in-situ Visualization of Complex Biological Systems

    PubMed Central

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y. K.

    2016-01-01

    We present an efficient and scalable scheme for implementing agent-based modeling (ABM) simulation with In Situ visualization of large complex systems on heterogeneous computing platforms. The scheme is designed to make optimal use of the resources available on a heterogeneous platform consisting of a multicore CPU and a GPU, resulting in minimal to no resource idle time. Furthermore, the scheme was implemented under a client-server paradigm that enables remote users to visualize and analyze simulation data as it is being generated at each time step of the model. Performance of a simulation case study of vocal fold inflammation and wound healing with 3.8 million agents shows 35× and 7× speedup in execution time over single-core and multi-core CPU respectively. Each iteration of the model took less than 200 ms to simulate, visualize and send the results to the client. This enables users to monitor the simulation in real-time and modify its course as needed. PMID:27547508

  3. Agent-Based Simulations of Malaria Transmissions with Applications to a Study Site in Thailand

    NASA Technical Reports Server (NTRS)

    Kiang, Richard K.; Adimi, Farida; Zollner, Gabriela E.; Coleman, Russell E.

    2006-01-01

    The dynamics of malaria transmission are driven by environmental, biotic and socioeconomic factors. Because of the geographic dependency of these factors and the complex interactions among them, it is difficult to generalize the key factors that perpetuate or intensify malaria transmission. Methods: Discrete event simulations were used for modeling the detailed interactions among the vector life cycle, sporogonic cycle and human infection cycle, under the explicit influences of selected extrinsic and intrinsic factors. Meteorological and environmental parameters may be derived from satellite data. The output of the model includes the individual infection status and the quantities normally observed in field studies, such as mosquito biting rates, sporozoite infection rates, gametocyte prevalence and incidence. Results were compared with mosquito vector and human malaria data acquired over 4.5 years (June 1999 - January 2004) in Kong Mong Tha, a remote village in Kanchanaburi Province, western Thailand. Results: Three years of transmissions of vivax and falciparum malaria were simulated for a hypothetical hamlet with approximately 1/7 of the study site population. The model generated results for a number of scenarios, including applications of larvicide and insecticide, asymptomatic cases receiving or not receiving treatment, blocking malaria transmission in mosquito vectors, and increasing the density of farm (host) animals in the hamlet. Transmission characteristics and trends in the simulated results are comparable to actual data collected at the study site.

  4. Investigation of Simulated Trading — A multi agent based trading system for optimization purposes

    NASA Astrophysics Data System (ADS)

    Schneider, Johannes J.

    2010-07-01

    Some years ago, Bachem, Hochstättler, and Malich proposed a heuristic algorithm called Simulated Trading for the optimization of vehicle routing problems. Computational agents place buy-orders and sell-orders for customers to be handled at a virtual financial market, the prices of the orders depending on the cost of inserting a customer into a tour or the saving obtained by removing him. According to a proposed rule set, the financial market creates a buy-and-sell graph for the various orders in the order book, intending to optimize the overall system. Here I present a thorough investigation of the application of this algorithm to the traveling salesman problem.
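
    A hedged reading of how order prices could be derived for the traveling salesman case: a buy-order's price is the cheapest-insertion cost of a customer, and a sell-order's price is the tour length saved by removing one. The sketch below illustrates these two quantities only; it omits the market's rule set and buy-and-sell graph.

```python
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def insertion_cost(tour, city):
    """Price of a buy-order: extra tour length incurred by inserting
    `city` at its cheapest position."""
    best = float("inf")
    for i in range(len(tour)):
        a, b = tour[i], tour[(i + 1) % len(tour)]
        best = min(best, dist(a, city) + dist(city, b) - dist(a, b))
    return best

def removal_gain(tour, idx):
    """Price of a sell-order: tour length saved by removing the city at idx."""
    a, b, c = tour[idx - 1], tour[idx], tour[(idx + 1) % len(tour)]
    return dist(a, b) + dist(b, c) - dist(a, c)

tour = [(0, 0), (4, 0), (4, 3), (0, 3)]
print(insertion_cost(tour, (2, 1)), removal_gain(tour, 2))
```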

  5. ActivitySim: large-scale agent based activity generation for infrastructure simulation

    SciTech Connect

    Gali, Emmanuel; Eidenbenz, Stephan; Mniszewski, Sue; Cuellar, Leticia; Teuscher, Christof

    2008-01-01

    The United States' Department of Homeland Security aims to model, simulate, and analyze critical infrastructure and their interdependencies across multiple sectors such as electric power, telecommunications, water distribution, transportation, etc. We introduce ActivitySim, an activity simulator for a population of millions of individual agents, each characterized by a set of demographic attributes based on US census data. ActivitySim generates daily schedules for each agent consisting of a sequence of activities, such as sleeping, shopping, and working, each scheduled at a geographic location, such as a business or private residence, that is appropriate for the activity type and for the personal situation of the agent. ActivitySim has been developed as part of a larger effort to understand the interdependencies among national infrastructure networks and the demand profiles that emerge from the different activities of individuals in baseline scenarios as well as emergency scenarios, such as hurricane evacuations. We present the scalable software engineering principles underlying ActivitySim, the socio-technical modeling paradigms that drive the activity generation, and proof-of-principle results for a 2.6 million-agent scenario in the Twin Cities, MN area.

  6. The contribution of agent-based simulations to conservation management on a Natura 2000 site.

    PubMed

    Dupont, Hélène; Gourmelon, Françoise; Rouan, Mathias; Le Viol, Isabelle; Kerbiriou, Christian

    2016-03-01

    The conservation of biodiversity today must include the participation and support of local stakeholders. Natura 2000 can be considered a conservation system that, in its application in most EU countries, relies on the participation of local stakeholders. Our study proposes a scientific method for participatory modelling, with the aim of contributing to the conservation management of habitats and species at a Natura 2000 site (Crozon Peninsula, Bretagne, France) that is representative of land-use changes in coastal areas. We make use of companion modelling and its associated tools (scenario planning, GIS, multi-agent modelling and simulations) to consider possible futures through the co-construction of management scenarios and the understanding of their consequences for different indicators of biodiversity status (habitats, avifauna, flora). Maintaining human activities as they have been carried out since the creation of the Natura 2000 zone allows the biodiversity values to remain stable. Extensive agricultural activities have been shown to be essential to this maintenance, whereas management sustained by the multiplication of conservation actions brings about variable results according to the indicators. None of the scenarios has a positive effect on the full set of indicators. However, an understanding of the modelling system and the results of the simulations allows for refining the selection of conservation actions in relation to the species to be preserved. PMID:26696603

  7. Evolutionary Agent-Based Simulation of the Introduction of New Technologies in Air Traffic Management

    NASA Technical Reports Server (NTRS)

    Yliniemi, Logan; Agogino, Adrian K.; Tumer, Kagan

    2014-01-01

    Accurate simulation of the effects of integrating new technologies into a complex system is critical to the modernization of our antiquated air traffic system, where there exist many layers of interacting procedures, controls, and automation, all designed to cooperate with human operators. Additions of even simple new technologies may result in unexpected emergent behavior due to complex human/machine interactions. One approach is to create high-fidelity human models, from the field of human factors, that can simulate a rich set of behaviors. However, such models are difficult to produce, especially when the goal is to capture unexpected emergent behavior arising from many human operators interacting simultaneously within a complex system. Instead of engineering complex human models, we directly model the emergent behavior by evolving goal-directed agents representing human users. Using evolution we can predict how the agent representing the human user reacts given his/her goals. In this paradigm, each autonomous agent in a system pursues individual goals, and the behavior of the system emerges from the interactions, foreseen or unforeseen, between the agents/actors. We show that this method reflects the integration of new technologies in a historical case, and we apply the same methodology to a possible future technology.

  8. An Agent-Based Simulation for Investigating the Impact of Stereotypes on Task-Oriented Group Formation

    NASA Astrophysics Data System (ADS)

    Maghami, Mahsa; Sukthankar, Gita

    In this paper, we introduce an agent-based simulation for investigating the impact of social factors on the formation and evolution of task-oriented groups. Task-oriented groups are created explicitly to perform a task, and all members derive benefits from task completion. However, even in cases when all group members act in a way that is locally optimal for task completion, social forces that have mild effects on choice of associates can have a measurable impact on task completion performance. In this paper, we show how our simulation can be used to model the impact of stereotypes on group formation. In our simulation, stereotypes are based on observable features, learned from prior experience, and only affect an agent's link formation preferences. Even without assuming stereotypes affect the agents' willingness or ability to complete tasks, the long-term modifications that stereotypes have on the agents' social network impair the agents' ability to form groups with sufficient diversity of skills, as compared to agents who form links randomly. An interesting finding is that this effect holds even in cases where stereotype preference and skill existence are completely uncorrelated.

  9. The Basic Immune Simulator: An agent-based model to study the interactions between innate and adaptive immunity

    PubMed Central

    Folcik, Virginia A; An, Gary C; Orosz, Charles G

    2007-01-01

    Background We introduce the Basic Immune Simulator (BIS), an agent-based model created to study the interactions between the cells of the innate and adaptive immune system. Innate immunity, the initial host response to a pathogen, generally precedes adaptive immunity, which generates immune memory for an antigen. The BIS simulates basic cell types, mediators and antibodies, and consists of three virtual spaces representing parenchymal tissue, secondary lymphoid tissue and the lymphatic/humoral circulation. The BIS includes a Graphical User Interface (GUI) to facilitate its use as an educational and research tool. Results The BIS was used to qualitatively examine the innate and adaptive interactions of the immune response to a viral infection. Calibration was accomplished via a parameter sweep of initial agent population size, and comparison of simulation patterns to those reported in the basic science literature. The BIS demonstrated that the degree of the initial innate response was a crucial determinant for an appropriate adaptive response. Deficiency or excess in innate immunity resulted in excessive proliferation of adaptive immune cells. Deficiency in any of the immune system components increased the probability of failure to clear the simulated viral infection. Conclusion The behavior of the BIS matches both normal and pathological behavior patterns in a generic viral infection scenario. Thus, the BIS effectively translates mechanistic cellular and molecular knowledge regarding the innate and adaptive immune response and reproduces the immune system's complex behavioral patterns. The BIS can be used both as an educational tool to demonstrate the emergence of these patterns and as a research tool to systematically identify potential targets for more effective treatment strategies for disease processes including hypersensitivity reactions (allergies, asthma), autoimmunity and cancer. We believe that the BIS can be a useful addition to the growing suite of in

  10. Multiobjective Decision Making Policies and Coordination Mechanisms in Hierarchical Organizations: Results of an Agent-Based Simulation

    PubMed Central

    2014-01-01

    This paper analyses how different coordination modes and different multiobjective decision making approaches interfere with each other in hierarchical organizations. The investigation is based on an agent-based simulation. We apply a modified NK-model in which we map multiobjective decision making as an adaptive walk on multiple performance landscapes, whereby each landscape represents one objective. We find that the impact of the coordination mode on the performance and the speed of performance improvement is critically affected by the selected multiobjective decision making approach. In certain setups, the performances achieved with the more complex multiobjective decision making approaches turn out to be less sensitive to the coordination mode than the performances achieved with the less complex multiobjective decision making approaches. Furthermore, we present results on the impact of the nature of interactions among decisions on the achieved performance in multiobjective setups. Our results give guidance on how to control the performance contribution of objectives to overall performance and answer the question of how effectively certain multiobjective decision making approaches perform under given circumstances (coordination mode and interdependencies among decisions). PMID:25152926

  11. Multiobjective decision making policies and coordination mechanisms in hierarchical organizations: results of an agent-based simulation.

    PubMed

    Leitner, Stephan; Wall, Friederike

    2014-01-01

    This paper analyses how different coordination modes and different multiobjective decision making approaches interfere with each other in hierarchical organizations. The investigation is based on an agent-based simulation. We apply a modified NK-model in which we map multiobjective decision making as an adaptive walk on multiple performance landscapes, whereby each landscape represents one objective. We find that the impact of the coordination mode on the performance and the speed of performance improvement is critically affected by the selected multiobjective decision making approach. In certain setups, the performances achieved with the more complex multiobjective decision making approaches turn out to be less sensitive to the coordination mode than the performances achieved with the less complex multiobjective decision making approaches. Furthermore, we present results on the impact of the nature of interactions among decisions on the achieved performance in multiobjective setups. Our results give guidance on how to control the performance contribution of objectives to overall performance and answer the question of how effectively certain multiobjective decision making approaches perform under given circumstances (coordination mode and interdependencies among decisions). PMID:25152926
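
    For readers unfamiliar with the NK framework, the sketch below builds a single NK performance landscape and runs an adaptive walk on it. The paper uses one landscape per objective and layers coordination modes on top; here a single-objective walk with arbitrary N, K and seeds is shown purely to illustrate the mechanism.

```python
import itertools
import random

def make_nk_landscape(N=6, K=2, seed=1):
    """One NK performance landscape: each decision's fitness contribution
    depends on itself and K randomly chosen other decisions."""
    rng = random.Random(seed)
    neighbours = [rng.sample([j for j in range(N) if j != i], K) for i in range(N)]
    tables = [{bits: rng.random() for bits in itertools.product((0, 1), repeat=K + 1)}
              for _ in range(N)]

    def fitness(config):
        return sum(tables[i][tuple([config[i]] + [config[j] for j in neighbours[i]])]
                   for i in range(N)) / N

    return fitness

def adaptive_walk(fitness, N=6, steps=50, seed=2):
    """Hill climbing by single-decision flips, accepting non-deteriorating moves."""
    rng = random.Random(seed)
    config = [rng.randint(0, 1) for _ in range(N)]
    for _ in range(steps):
        trial = config[:]
        trial[rng.randrange(N)] ^= 1
        if fitness(trial) >= fitness(config):
            config = trial
    return config, fitness(config)

f = make_nk_landscape()
print(adaptive_walk(f))
```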

  12. Modeling the 2014 Ebola Virus Epidemic - Agent-Based Simulations, Temporal Analysis and Future Predictions for Liberia and Sierra Leone.

    PubMed

    Siettos, Constantinos; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2015-01-01

    We developed an agent-based model to investigate the epidemic dynamics of Ebola virus disease (EVD) in Liberia and Sierra Leone from May 27 to December 21, 2014. The dynamics of the agent-based simulator evolve on small-world transmission networks of sizes equal to the population of each country, with adjustable densities to account for the effects of public health intervention policies and individual behavioral responses to the evolving epidemic. Based on time series of the official case counts from the World Health Organization (WHO), we provide estimates for key epidemiological variables by employing the so-called Equation-Free approach. The underlying transmission networks were characterized by rather random structures in the two countries with densities decreasing by ~19% from the early (May 27-early August) to the last period (mid October-December 21). Our estimates for the values of key epidemiological variables, such as the mean time to death, recovery and the case fatality rate, are very close to the ones reported by the WHO Ebola response team during the early period of the epidemic (until September 14) that were calculated based on clinical data. Specifically, regarding the effective reproductive number Re, our analysis suggests that until mid October, Re was above 2.3 in both countries; from mid October to December 21, Re dropped well below unity in Liberia, indicating a saturation of the epidemic, while in Sierra Leone it was around 1.9, indicating an ongoing epidemic. Accordingly, a ten-week projection from December 21 estimated that the epidemic would fade out in Liberia in early March; in contrast, our results sounded a note of caution for Sierra Leone since the cumulative number of cases could reach as high as 18,000, and the number of deaths might exceed 5,000, by early March 2015. However, by processing the reported data of the very last period (December 21, 2014-January 18, 2015), we obtained more optimistic estimates indicative of a remission of
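
    The transmission substrate described above is a small-world network whose density can be tuned to represent interventions and behavioral change. A minimal sketch using NetworkX is shown below; the population size, mean degree and rewiring probability are illustrative, not the country-scale values of the study.

```python
import networkx as nx

# Small-world contact network of the kind used as a transmission substrate;
# size, mean degree and rewiring probability here are illustrative, not the
# country-scale values of the study.
population = 10_000
mean_degree = 8     # each node initially linked to its k nearest neighbours
rewire_p = 0.2      # tunable randomness/density knob (e.g. to mimic interventions)

G = nx.watts_strogatz_graph(n=population, k=mean_degree, p=rewire_p, seed=42)
print(G.number_of_nodes(), G.number_of_edges(), nx.average_clustering(G))
```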

  13. Analysis of CDC social control measures using an agent-based simulation of an influenza epidemic in a city

    PubMed Central

    2011-01-01

    Background The transmission of infectious disease amongst the human population is a complex process which requires advanced, often individual-based, models to capture the space-time details observed in reality. Methods An Individual Space-Time Activity-based Model (ISTAM) was applied to simulate the effectiveness of non-pharmaceutical control measures including: (1) refraining from social activities, (2) school closure and (3) household quarantine, for a hypothetical influenza outbreak in an urban area. Results Amongst the set of control measures tested, refraining from social activities with various compliance levels was relatively ineffective. Household quarantine was very effective, especially for the peak number of cases and total number of cases, with large differences between compliance levels. Household quarantine resulted in a decrease in the peak number of cases from more than 300 to around 158 for a 100% compliance level, a decrease of about 48.7%. The delay in the outbreak peak was about 3 to 17 days. The total number of cases decreased to a range of 3635-5403, that is, 63.7%-94.7% of the baseline value. When coupling control measures, household quarantine together with school closure was the most effective strategy. The resulting space-time distribution of infection in different classes of activity bundles (AB) suggests that the epidemic outbreak is strengthened amongst children and then spreads to adults. By sensitivity analysis, this study demonstrated that earlier implementation of control measures leads to greater efficacy. Also, for infectious diseases with a larger basic reproduction number, the effectiveness of non-pharmaceutical measures was shown to be limited. Conclusions Simulated results showed that household quarantine was the most effective control measure, while school closure and household quarantine implemented together achieved the greatest benefit. Agent-based models should be applied in the future to evaluate the efficacy of control

  14. Agent Based Modeling Applications for Geosciences

    NASA Astrophysics Data System (ADS)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently it has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Some of the challenges associated with this modeling method include significant computational requirements for keeping track of thousands to millions of agents, and a lack of methods and strategies for model validation, as well as of a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications. A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in

  15. A Scaffolding Framework to Support Learning of Emergent Phenomena Using Multi-Agent-Based Simulation Environments

    ERIC Educational Resources Information Center

    Basu, Satabdi; Sengupta, Pratim; Biswas, Gautam

    2015-01-01

    Students from middle school to college have difficulties in interpreting and understanding complex systems such as ecological phenomena. Researchers have suggested that students experience difficulties in reconciling the relationships between individuals, populations, and species, as well as the interactions between organisms and their environment…

  16. An Economic Analysis of Strategies to Control Clostridium Difficile Transmission and Infection Using an Agent-Based Simulation Model

    PubMed Central

    Nelson, Richard E.; Jones, Makoto; Leecaster, Molly; Samore, Matthew H.; Ray, William; Huttner, Angela; Huttner, Benedikt; Khader, Karim; Stevens, Vanessa W.; Gerding, Dale; Schweizer, Marin L.; Rubin, Michael A.

    2016-01-01

    Background A number of strategies exist to reduce Clostridium difficile (C. difficile) transmission. We conducted an economic evaluation of “bundling” these strategies together. Methods We constructed an agent-based computer simulation of nosocomial C. difficile transmission and infection in a hospital setting. This model included the following components: interactions between patients and health care workers; room contamination via C. difficile shedding; C. difficile hand carriage and removal via hand hygiene; patient acquisition of C. difficile via contact with contaminated rooms or health care workers; and patient antimicrobial use. Six interventions were introduced alone and "bundled" together: (a) aggressive C. difficile testing; (b) empiric isolation and treatment of symptomatic patients; (c) improved adherence to hand hygiene and (d) contact precautions; (e) improved use of soap and water for hand hygiene; and (f) improved environmental cleaning. Our analysis compared these interventions using values representing 3 different scenarios: (1) base-case (BASE) values that reflect typical hospital practice, (2) intervention (INT) values that represent implementation of hospital-wide efforts to reduce C. diff transmission, and (3) optimal (OPT) values representing the highest expected results from strong adherence to the interventions. Cost parameters for each intervention were obtained from published literature. We performed our analyses assuming low, normal, and high C. difficile importation prevalence and transmissibility of C. difficile. Results INT levels of the “bundled” intervention were cost-effective at a willingness-to-pay threshold of $100,000/quality-adjusted life-year in all importation prevalence and transmissibility scenarios. OPT levels of intervention were cost-effective for normal and high importation prevalence and transmissibility scenarios. When analyzed separately, hand hygiene compliance, environmental decontamination, and empiric
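
    The cost-effectiveness judgement referred to above amounts to comparing an incremental cost-effectiveness ratio (ICER) with the willingness-to-pay threshold. The sketch below shows that arithmetic with invented placeholder costs and QALYs; it is not the study's model or data.

        # Hedged sketch of an ICER comparison against a $100,000/QALY threshold.
        # All cost and QALY figures below are invented placeholders.
        WTP = 100_000  # $ per QALY gained

        def icer(cost_new, qaly_new, cost_base, qaly_base):
            """Incremental cost per QALY gained of the new strategy."""
            return (cost_new - cost_base) / (qaly_new - qaly_base)

        base   = {"cost": 4_200_000, "qalys": 1_250.0}   # hypothetical BASE scenario
        bundle = {"cost": 4_550_000, "qalys": 1_256.0}   # hypothetical INT bundle

        ratio = icer(bundle["cost"], bundle["qalys"], base["cost"], base["qalys"])
        print(f"ICER = ${ratio:,.0f} per QALY "
              f"({'cost-effective' if ratio <= WTP else 'not cost-effective'} at ${WTP:,}/QALY)")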

  17. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    NASA Astrophysics Data System (ADS)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics we have developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network as well as the per-contact transmission probabilities by age group we exploited the so called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data of key epidemiological variables, such as the mean time to death, recovery and the case fatality rate.
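
    A hedged sketch of the general approach, not the authors' code: transmission unfolds on a small-world contact network (built here with networkx) with age-group-specific per-contact probabilities. The network size, age split, probabilities and recovery rate are placeholder assumptions.

        import random
        import networkx as nx

        # Illustrative small-world transmission sketch with age-specific
        # per-contact probabilities.  Parameters are placeholders, not estimates.
        random.seed(2)
        N = 2000
        G = nx.watts_strogatz_graph(N, k=6, p=0.1)          # small-world contacts
        age = {i: random.choice(["child", "adult", "elderly"]) for i in range(N)}
        p_contact = {"child": 0.04, "adult": 0.07, "elderly": 0.10}  # per contact/day

        state = {i: "S" for i in range(N)}
        state[0] = "I"
        for day in range(100):
            newly = []
            for i in (n for n in G if state[n] == "I"):
                for j in G.neighbors(i):
                    if state[j] == "S" and random.random() < p_contact[age[j]]:
                        newly.append(j)
                if random.random() < 0.1:                    # mean infectious ~10 days
                    state[i] = "R"
            for j in newly:
                state[j] = "I"
        print("final outbreak size:", sum(1 for s in state.values() if s != "S"))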

  18. Modeling the transmission of community-associated methicillin-resistant Staphylococcus aureus: a dynamic agent-based simulation

    PubMed Central

    2014-01-01

    Background Methicillin-resistant Staphylococcus aureus (MRSA) has been a deadly pathogen in healthcare settings since the 1960s, but MRSA epidemiology changed since 1990 with new genetically distinct strain types circulating among previously healthy people outside healthcare settings. Community-associated (CA) MRSA strains primarily cause skin and soft tissue infections, but may also cause life-threatening invasive infections. First seen in Australia and the U.S., it is a growing problem around the world. The U.S. has had the most widespread CA-MRSA epidemic, with strain type USA300 causing the great majority of infections. Individuals with either asymptomatic colonization or infection may transmit CA-MRSA to others, largely by skin-to-skin contact. Control measures have focused on hospital transmission. Limited public health education has focused on care for skin infections. Methods We developed a fine-grained agent-based model for Chicago to identify where to target interventions to reduce CA-MRSA transmission. An agent-based model allows us to represent heterogeneity in population behavior, locations and contact patterns that are highly relevant for CA-MRSA transmission and control. Drawing on nationally representative survey data, the model represents variation in sociodemographics, locations, behaviors, and physical contact patterns. Transmission probabilities are based on a comprehensive literature review. Results Over multiple 10-year runs with one-hour ticks, our model generates temporal and geographic trends in CA-MRSA incidence similar to Chicago from 2001 to 2010. On average, a majority of transmission events occurred in households, and colonized rather than infected agents were the source of the great majority (over 95%) of transmission events. The key findings are that infected people are not the primary source of spread. Rather, the far greater number of colonized individuals must be targeted to reduce transmission. Conclusions Our findings suggest

  19. Unified Simulation and Analysis Framework for Deep Space Navigation Design

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan; Chuang, Jason; Olsen, Carrie

    2013-01-01

    As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continue to grow, there is a clear need for a modular, expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine their effect on the sizing of the integrated vehicle. The development of such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.

  20. Thread Group Multithreading: Accelerating the Computation of an Agent-Based Power System Modeling and Simulation Tool -- GridLAB-D

    SciTech Connect

    Jin, Shuangshuang; Chassin, David P.

    2014-01-06

    GridLAB-D™ is an open-source, next-generation agent-based smart-grid simulator that provides unprecedented capability to model the performance of smart grid technologies. Over the past few years, GridLAB-D has been used to conduct important analyses of smart grid concepts, but it is still quite limited by its computational performance. In order to break through the performance bottleneck to meet the need for large-scale power grid simulations, we develop a thread group mechanism to implement highly granular multithreaded computation in GridLAB-D. We achieve close-to-linear speedups with the multithreaded version compared against the single-threaded version of the same code running on general-purpose multi-core commodity hardware for a benchmark simple house model. The multithreaded code shows favorable scalability properties and resource utilization, and much shorter execution times for large-scale power grid simulations.
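
    The thread-group idea can be sketched as follows. GridLAB-D itself is implemented in C/C++; this Python fragment only illustrates the structure (a static partition of simulation objects into groups, one worker per group, and a barrier at every time step), with a deliberately trivial "house" update and placeholder numbers. CPython's GIL means it illustrates the mechanism, not the speedup.

        from concurrent.futures import ThreadPoolExecutor

        # Conceptual sketch of thread-group multithreading: objects are split into
        # groups, each group is updated by one worker, and every time step ends
        # with an implicit barrier (the completed map).  Values are illustrative.
        N_OBJECTS, N_GROUPS, STEPS = 10_000, 4, 24

        houses = [{"temp": 20.0} for _ in range(N_OBJECTS)]
        groups = [houses[i::N_GROUPS] for i in range(N_GROUPS)]   # static partition

        def update_group(group, hour):
            # highly simplified "house" update: drift toward an outdoor temperature
            outdoor = 10.0 + 8.0 * (1 if 8 <= hour <= 18 else -1)
            for h in group:
                h["temp"] += 0.05 * (outdoor - h["temp"])

        with ThreadPoolExecutor(max_workers=N_GROUPS) as pool:
            for hour in range(STEPS):
                # one task per group; draining the map acts as the per-step barrier
                list(pool.map(update_group, groups, [hour] * N_GROUPS))

        print("mean temperature:", sum(h["temp"] for h in houses) / N_OBJECTS)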

  1. Ideal free distribution or dynamic game? An agent-based simulation study of trawling strategies with varying information

    NASA Astrophysics Data System (ADS)

    Beecham, J. A.; Engelhard, G. H.

    2007-10-01

    An ecological economic model of trawling is presented to demonstrate the effect of trawling location choice strategy on net input (rate of economic gain of fish caught per time spent less costs). Fishing location choice is considered to be a dynamic process whereby trawlers choose from among a repertoire of plastic strategies that they modify if their gains fall below a fixed proportion of the mean gains of the fleet as a whole. The distribution of fishing across different areas of a fishery follows an approximate ideal free distribution (IFD) with varying noise due to uncertainty. The least-productive areas are not utilised because initial net input never reaches the mean yield of better areas subject to competitive exploitation. In cases where there is weak temporal autocorrelation between fish stocks in a specific location, a plastic strategy of local translocation between trawls mixed with longer-range translocation increases realised input. The trawler can change its translocation strategy in the light of information about recent trawling success compared to its long-term average but, in contrast to predictions of the Marginal Value Theorem (MVT) model, does not know for certain what it will find by moving, so may need to sample new patches. The combination of the two types of translocation mirrors beam-trawling strategies used by the Dutch fleet, and the resultant distribution of trawling effort is confirmed by analysis of historical effort distribution of British otter trawling fleets in the North Sea. Fisheries exploitation represents an area where dynamic agent-based adaptive models may be a better representation of the economic dynamics of a fleet than classically inspired optimisation models.
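
    The switching rule described above (compare recent gain with the fleet mean, then relocate locally or over a longer range) can be sketched as follows; patch stocks, catch rates and the threshold are invented for illustration and this is not the authors' model.

        import random

        # Hypothetical sketch of gain-triggered strategy switching among trawler agents.
        random.seed(3)
        N_PATCHES, N_BOATS, STEPS, THRESHOLD = 20, 15, 200, 0.8
        stock = [random.uniform(5, 15) for _ in range(N_PATCHES)]
        boats = [{"patch": random.randrange(N_PATCHES), "gain": 0.0} for _ in range(N_BOATS)]

        for t in range(STEPS):
            # catch: each boat takes a share of its patch, split among co-located boats
            for b in boats:
                crowd = sum(1 for o in boats if o["patch"] == b["patch"])
                catch = 0.1 * stock[b["patch"]] / crowd
                stock[b["patch"]] -= catch
                b["gain"] = 0.9 * b["gain"] + 0.1 * catch            # running average
            fleet_mean = sum(b["gain"] for b in boats) / N_BOATS
            for b in boats:
                if b["gain"] < THRESHOLD * fleet_mean:
                    b["patch"] = random.randrange(N_PATCHES)         # long-range move
                elif random.random() < 0.3:
                    b["patch"] = (b["patch"] + random.choice([-1, 1])) % N_PATCHES  # local move
            stock = [s + 0.2 for s in stock]                         # stock renewal

        print("effort per patch:",
              [sum(1 for b in boats if b["patch"] == p) for p in range(N_PATCHES)])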

  2. Integrating the simulation of domestic water demand behaviour to an urban water model using agent based modelling

    NASA Astrophysics Data System (ADS)

    Koutiva, Ifigeneia; Makropoulos, Christos

    2015-04-01

    The urban water system's sustainable evolution requires tools that can analyse and simulate the complete cycle including both physical and cultural environments. One of the main challenges, in this regard, is the design and development of tools that are able to simulate the society's water demand behaviour and the way policy measures affect it. The effects of these policy measures are a function of personal opinions that subsequently lead to the formation of people's attitudes. These attitudes will eventually form behaviours. This work presents the design of an ABM tool for addressing the social dimension of the urban water system. The created tool, called the Urban Water Agents' Behaviour (UWAB) model, was implemented using the NetLogo agent programming language. The main aim of the UWAB model is to capture the effects of policies and environmental pressures on the water conservation behaviour of urban households. The model consists of agents representing urban households that are linked to each other, creating a social network that influences the water conservation behaviour of its members. Household agents are also influenced by policies and environmental pressures, such as drought. The UWAB model simulates behaviour resulting in the evolution of water conservation within an urban population. The final outcome of the model is the evolution of the distribution of different conservation levels (no, low, high) across the selected urban population. In addition, UWAB is implemented in combination with an existing urban water management simulation tool, the Urban Water Optioneering Tool (UWOT), in order to create a modelling platform aiming to facilitate an adaptive approach to water resources management. For the purposes of this proposed modelling platform, UWOT is used in a twofold manner: (1) to simulate domestic water demand evolution and (2) to simulate the response of the water system to the domestic water demand evolution. The main advantage of the UWAB - UWOT model
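
    A rough sketch of the mechanism described for UWAB (the actual model is written in NetLogo, not Python): household agents on a social network move between "no", "low" and "high" conservation under neighbour influence and a drought signal. The network, adoption probabilities and rules below are assumptions for illustration only.

        import random

        # Illustrative sketch of socially influenced conservation behaviour; all
        # probabilities, network structure and the drought timing are invented.
        random.seed(4)
        N, LEVELS = 200, ["no", "low", "high"]
        households = [{"level": "no"} for _ in range(N)]
        neighbours = {i: random.sample(range(N), 5) for i in range(N)}   # social links

        for month in range(48):
            drought = month >= 24                    # environmental pressure switches on
            for i, h in enumerate(households):
                peer_share = sum(households[j]["level"] != "no" for j in neighbours[i]) / 5
                p_adopt = 0.05 + 0.3 * peer_share + (0.2 if drought else 0.0)
                if h["level"] == "no" and random.random() < p_adopt:
                    h["level"] = "low"
                elif h["level"] == "low" and random.random() < p_adopt / 2:
                    h["level"] = "high"
            if month % 12 == 0 or month == 47:
                dist = {lv: sum(h["level"] == lv for h in households) for lv in LEVELS}
                print(f"month {month:2d}  {dist}")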

  3. Agent-based models in translational systems biology

    PubMed Central

    An, Gary; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram

    2013-01-01

    Effective translational methodologies for knowledge representation are needed in order to make strides against the constellation of diseases that affect the world today. These diseases are defined by their mechanistic complexity, redundancy, and nonlinearity. Translational systems biology aims to harness the power of computational simulation to streamline drug/device design, simulate clinical trials, and eventually to predict the effects of drugs on individuals. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for translational systems biology. This review describes agent-based modeling and gives examples of its translational applications in the context of acute inflammation and wound healing. PMID:20835989

  4. Flexible Residential Smart Grid Simulation Framework

    NASA Astrophysics Data System (ADS)

    Xiang, Wang

    Different scheduling and coordination algorithms controlling household appliances' operations can potentially lead to energy consumption reduction and/or load balancing in conjunction with different electricity pricing methods used in smart grid programs. In order to easily implement different algorithms and evaluate their efficiency against other ideas, a flexible simulation framework is desirable in both research and business fields. However, such a platform is currently lacking or underdeveloped. In this thesis, we provide a simulation framework to focus on demand-side residential energy consumption coordination in response to different pricing methods. This simulation framework, equipped with an appliance consumption library using realistic values, aims to closely represent the average usage of different types of appliances. Simulation results for traditional usage closely match surveyed real-life consumption records. Several sample coordination algorithms, pricing schemes, and communication scenarios are also implemented to illustrate the use of the simulation framework.
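
    One of the coordination ideas such a framework is meant to test can be sketched in a few lines: shift a deferrable appliance into the cheapest window of a time-of-use tariff. The tariff, appliance load and run length below are illustrative values, not part of the thesis' appliance library.

        # Hedged sketch: schedule a deferrable appliance (needing two consecutive
        # hours) into the cheapest window of an assumed time-of-use tariff.
        prices = [0.10] * 7 + [0.25] * 4 + [0.15] * 6 + [0.30] * 4 + [0.10] * 3  # $/kWh, 24 h
        load_kw, duration = 1.2, 2                     # appliance draw and run length

        def cheapest_start(prices, duration):
            costs = [sum(prices[t:t + duration]) for t in range(len(prices) - duration + 1)]
            return min(range(len(costs)), key=costs.__getitem__)

        start = cheapest_start(prices, duration)
        cost = load_kw * sum(prices[start:start + duration])
        print(f"run appliance at {start}:00-{start + duration}:00, energy cost ${cost:.2f}")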

  5. Use of an agent-based simulation model to evaluate a mobile-based system for supporting emergency evacuation decision making.

    PubMed

    Tian, Yu; Zhou, Tian-Shu; Yao, Qin; Zhang, Mao; Li, Jing-Song

    2014-12-01

    Recently, mass casualty incidents (MCIs) have been occurring frequently and have gained international attention. There is an urgent need for scientifically proven and effective emergency responses to MCIs, particularly as the severity of incidents is continuously increasing. The emergency response to MCIs is a multi-dimensional and multi-participant dynamic process that changes in real-time. The evacuation decisions that assign casualties to different hospitals in a region are very important and impact both the results of emergency treatment and the efficiency of medical resource utilization. Previously, decisions related to casualty evacuation were made by an incident commander with emergency experience and in accordance with macro emergency guidelines. There are few decision-supporting tools available to reduce the difficulty and psychological pressure associated with the evacuation decisions an incident commander must make. In this study, we have designed a mobile-based system to collect medical and temporal data produced during an emergency response to an MCI. Using this information, our system's decision-making model can provide personal evacuation suggestions that improve the overall outcome of an emergency response. The effectiveness of our system in reducing overall mortality has been validated by an agent-based simulation model established to simulate an emergency response to an MCI. PMID:25354665

  6. Framework for Network Co-Simulation

    SciTech Connect

    Daily, Jeff; Ciraci, PNNL Selim; Fuller, PNNL Jason; Marinovici, PNNL Laurentiu; Fisher, PNNL Andrew; Lo, PNNL Chaomei; Hauer, PNNL Matthew

    2014-01-09

    The Framework for Network Co-Simulation (FNCS) uses a federated approach to integrate simulations which may have differing time scales. Special consideration is given to integration with a communication network simulation such that inter-simulation messages may be optionally routed through and delayed by such a simulation. In addition, FNCS uses novel time synchronization algorithms to accelerate co-simulation including the application of speculative multithreading. FNCS accomplishes all of these improvements with minimal end user intervention. Simulations can be integrated using FNCS while maintaining their original model input files simply by linking with the FNCS library and making appropriate calls into the FNCS API.

  7. A Software Framework for Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.

    2008-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has a long history in developing simulations of experimental fixed-wing aircraft from gliders to suborbital vehicles on platforms ranging from desktop simulators to pilot-in-the-loop/aircraft-in-the-loop simulators. Regardless of the aircraft or simulator hardware, much of the software framework is common to all NASA Dryden simulators. Some of this software has withstood the test of time, but in recent years the push toward high-fidelity user-friendly simulations has resulted in some significant changes. This report presents an overview of the current NASA Dryden simulation software framework and capabilities with an emphasis on the new features that have permitted NASA to develop more capable simulations while maintaining the same staffing levels.

  8. Modeling the 2014 Ebola Virus Epidemic – Agent-Based Simulations, Temporal Analysis and Future Predictions for Liberia and Sierra Leone

    PubMed Central

    Siettos, Constantinos; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2015-01-01

    We developed an agent-based model to investigate the epidemic dynamics of Ebola virus disease (EVD) in Liberia and Sierra Leone from May 27 to December 21, 2014. The dynamics of the agent-based simulator evolve on small-world transmission networks of sizes equal to the population of each country, with adjustable densities to account for the effects of public health intervention policies and individual behavioral responses to the evolving epidemic. Based on time series of the official case counts from the World Health Organization (WHO), we provide estimates for key epidemiological variables by employing the so-called Equation-Free approach. The underlying transmission networks were characterized by rather random structures in the two countries with densities decreasing by ~19% from the early (May 27-early August) to the last period (mid October-December 21). Our estimates for the values of key epidemiological variables, such as the mean time to death, recovery and the case fatality rate, are very close to the ones reported by the WHO Ebola response team during the early period of the epidemic (until September 14) that were calculated based on clinical data. Specifically, regarding the effective reproductive number Re, our analysis suggests that until mid October, Re was above 2.3 in both countries; from mid October to December 21, Re dropped well below unity in Liberia, indicating a saturation of the epidemic, while in Sierra Leone it was around 1.9, indicating an ongoing epidemic. Accordingly, a ten-week projection from December 21 estimated that the epidemic will fade out in Liberia in early March; in contrast, our results flashed a note of caution for Sierra Leone since the cumulative number of cases could reach as high as 18,000, and the number of deaths might exceed 5,000, by early March 2015. However, by processing the reported data of the very last period (December 21, 2014-January 18, 2015), we obtained more optimistic estimates indicative of a remission of

  9. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.

  10. Framework for Network Co-Simulation

    2014-01-09

    The Framework for Network Co-Simulation (FNCS) uses a federated approach to integrate simulations which may have differing time scales. Special consideration is given to integration with a communication network simulation such that inter-simulation messages may be optionally routed through and delayed by such a simulation. In addition, FNCS uses novel time synchronization algorithms to accelerate co-simulation including the application of speculative multithreading. FNCS accomplishes all of these improvements with minimal end user intervention. Simulations can be integrated using FNCS while maintaining their original model input files simply by linking with the FNCS library and making appropriate calls into the FNCS API.

  11. Pattern-Oriented Modeling of Agent-Based Complex Systems: Lessons from Ecology

    NASA Astrophysics Data System (ADS)

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-11-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  12. Pattern-oriented modeling of agent-based complex systems: Lessons from ecology

    USGS Publications Warehouse

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-01-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  13. Pattern-oriented modeling of agent-based complex systems: lessons from ecology.

    PubMed

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M; Railsback, Steven F; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L

    2005-11-11

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity. PMID:16284171

  14. Investigating biocomplexity through the agent-based paradigm

    PubMed Central

    Kaul, Himanshu

    2015-01-01

    Capturing the dynamism that pervades biological systems requires a computational approach that can accommodate both the continuous features of the system environment as well as the flexible and heterogeneous nature of component interactions. This presents a serious challenge for the more traditional mathematical approaches that assume component homogeneity to relate system observables using mathematical equations. While the homogeneity condition does not lead to loss of accuracy while simulating various continua, it fails to offer detailed solutions when applied to systems with dynamically interacting heterogeneous components. As the functionality and architecture of most biological systems is a product of multi-faceted individual interactions at the sub-system level, continuum models rarely offer much beyond qualitative similarity. Agent-based modelling is a class of algorithmic computational approaches that rely on interactions between Turing-complete finite-state machines—or agents—to simulate, from the bottom-up, macroscopic properties of a system. In recognizing the heterogeneity condition, they offer suitable ontologies to the system components being modelled, thereby succeeding where their continuum counterparts tend to struggle. Furthermore, being inherently hierarchical, they are quite amenable to coupling with other computational paradigms. The integration of any agent-based framework with continuum models is arguably the most elegant and precise way of representing biological systems. Although in its nascence, agent-based modelling has been utilized to model biological complexity across a broad range of biological scales (from cells to societies). In this article, we explore the reasons that make agent-based modelling the most precise approach to model biological systems that tend to be non-linear and complex. PMID:24227161

  15. Onyx-Advanced Aeropropulsion Simulation Framework Created

    NASA Technical Reports Server (NTRS)

    Reed, John A.

    2001-01-01

    The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.

  16. An Active Learning Exercise for Introducing Agent-Based Modeling

    ERIC Educational Resources Information Center

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  17. The Umbra Simulation and Integration Framework Applied to Emergency Response Training

    NASA Technical Reports Server (NTRS)

    Hamilton, Paul Lawrence; Britain, Robert

    2010-01-01

    The Mine Emergency Response Interactive Training Simulation (MERITS) is intended to prepare personnel to manage an emergency in an underground coal mine. The creation of an effective training environment required realistic emergent behavior in response to simulation events and trainee interventions, exploratory modification of miner behavior rules, realistic physics, and incorporation of legacy code. It also required the ability to add rich media to the simulation without conflicting with normal desktop security settings. Our Umbra Simulation and Integration Framework facilitated agent-based modeling of miners and rescuers and made it possible to work with subject matter experts to quickly adjust behavior through script editing, rather than through lengthy programming and recompilation. Integration of Umbra code with the WebKit browser engine allowed the use of JavaScript-enabled local web pages for media support. This project greatly extended the capabilities of Umbra in support of training simulations and has implications for simulations that combine human behavior, physics, and rich media.

  18. A Computational Framework for Bioimaging Simulation

    PubMed Central

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher-level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508
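
    The "account for the imaging physics" step can be illustrated with a small sketch: simulated fluorophore positions are rendered into a digital image through a Gaussian point-spread function and Poisson photon-counting noise. The PSF width, photon budget and background level are placeholder assumptions, not the framework's calibrated models.

        import numpy as np

        # Illustrative rendering of simulated molecule positions into photon counts.
        rng = np.random.default_rng(1)
        npix, sigma, photons = 64, 1.5, 200            # pixels, PSF width, photons/molecule
        positions = rng.uniform(10, npix - 10, size=(20, 2))   # 20 molecules

        y, x = np.mgrid[0:npix, 0:npix]
        image = np.zeros((npix, npix))
        for px, py in positions:
            image += photons * np.exp(-((x - px) ** 2 + (y - py) ** 2) / (2 * sigma ** 2))
        image /= 2 * np.pi * sigma ** 2                # normalise the Gaussian PSF
        counts = rng.poisson(image + 5.0)              # shot noise plus flat background
        print("max photon count per pixel:", counts.max())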

  19. Monte Carlo simulation framework for TMT

    NASA Astrophysics Data System (ADS)

    Vogiatzis, Konstantinos; Angeli, George Z.

    2008-07-01

    This presentation describes a strategy for assessing the performance of the Thirty Meter Telescope (TMT). A Monte Carlo Simulation Framework has been developed to combine optical modeling with Computational Fluid Dynamics simulations (CFD), Finite Element Analysis (FEA) and controls to model the overall performance of TMT. The framework consists of a two-year record of observed environmental parameters such as atmospheric seeing, site wind speed and direction, ambient temperature and local sunset and sunrise times, along with telescope azimuth and elevation with a given sampling rate. The modeled optical, dynamic and thermal seeing aberrations are available in a matrix form for distinct values within the range of influencing parameters. These parameters are either part of the framework parameter set or can be derived from them at each time-step. As time advances, the aberrations are interpolated and combined based on the current value of their parameters. Different scenarios can be generated based on operating parameters such as venting strategy, optical calibration frequency and heat source control. Performance probability distributions are obtained and provide design guidance. The sensitivity of the system to design, operating and environmental parameters can be assessed in order to maximize the percentage of time the system meets the performance specifications.
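
    The interpolate-and-combine step can be sketched as follows, with wind speed as the single influencing parameter; the grids, the synthetic environmental record and the quadrature combination are illustrative stand-ins for the framework's aberration matrices.

        import numpy as np

        # Hedged sketch: precomputed aberrations on a coarse parameter grid are
        # interpolated along a (synthetic) environmental time series and combined
        # into a performance statistic.  All values are placeholders.
        rng = np.random.default_rng(0)
        wind_grid = np.array([0.0, 5.0, 10.0, 15.0, 20.0])          # m/s
        dome_seeing = np.array([0.05, 0.08, 0.14, 0.22, 0.35])      # arcsec at grid points
        wind_record = np.clip(rng.normal(8.0, 4.0, size=24 * 365), 0, 20)  # hourly, 1 year
        atm_seeing = rng.lognormal(np.log(0.6), 0.25, size=wind_record.size)

        dome = np.interp(wind_record, wind_grid, dome_seeing)       # interpolate per step
        total = np.sqrt(atm_seeing**2 + dome**2)                    # combine in quadrature
        spec = 0.8                                                  # arcsec requirement
        print(f"median image quality {np.median(total):.2f} arcsec; "
              f"meets spec {100 * np.mean(total <= spec):.1f}% of the time")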

  20. Simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Tentner, A.

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scalable to take advantage of emerging massively parallel processor (MPP) systems.
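
    The "vehicles as autonomous processes exchanging messages" design can be sketched with standard Python multiprocessing; the motion model, message format and advisory rule are invented for illustration, and the real simulator targets parallel machines rather than this toy setup.

        import multiprocessing as mp

        # Conceptual sketch: each vehicle process reports its position to a traffic
        # management centre (TMC) process and receives an advisory back.
        def vehicle(vid, to_tmc, from_tmc):
            position = 0.0
            for step in range(5):
                position += 1.0 + 0.1 * vid          # trivial motion model
                to_tmc.put(("report", vid, position))
                advisory = from_tmc.get()            # wait for TMC advisory
                if advisory == "congestion":
                    position -= 0.5                  # slow down
            to_tmc.put(("done", vid, position))

        def tmc(n_vehicles, to_tmc, replies):
            done = 0
            while done < n_vehicles:
                kind, vid, pos = to_tmc.get()
                if kind == "done":
                    done += 1
                    continue
                # issue an advisory based on the reported position
                replies[vid].put("congestion" if pos > 3.0 else "clear")

        if __name__ == "__main__":
            n = 3
            to_tmc = mp.Queue()
            replies = [mp.Queue() for _ in range(n)]
            vehicles = [mp.Process(target=vehicle, args=(i, to_tmc, replies[i])) for i in range(n)]
            center = mp.Process(target=tmc, args=(n, to_tmc, replies))
            center.start()
            for p in vehicles:
                p.start()
            for p in vehicles:
                p.join()
            center.join()
            print("simulation of", n, "vehicle processes finished")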

  1. An Agent-Based Cockpit Task Management System

    NASA Technical Reports Server (NTRS)

    Funk, Ken

    1997-01-01

    An agent-based program to facilitate Cockpit Task Management (CTM) in commercial transport aircraft is developed and evaluated. The program, called the AgendaManager (AMgr), is described and evaluated in a part-task simulator study using airline pilots.

  2. Agent-Based Modeling in Systems Pharmacology.

    PubMed

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling. PMID:26783498

  3. Architecting a Simulation Framework for Model Rehosting

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2004-01-01

    The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.

  4. Argonne simulation framework for intelligent transportation systems

    SciTech Connect

    Ewing, T.; Doss, E.; Hanebutte, U.; Canfield, T.; Brown-VanHoozer, A.; Tentner, A.

    1996-04-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a stand-alone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that include human-factors studies to support safety and operational research. Realistic modeling of variations of the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver's personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  5. Molecular Dynamics Simulations of Graphene Oxide Frameworks

    SciTech Connect

    Zhu, Pan; Sumpter, Bobby G; Meunier, V.; Nicolai, Adrien

    2013-01-01

    We use quantum mechanical calculations to develop a full set of force field parameters in order to perform molecular dynamics simulations to understand and optimize the molecular storage properties inside Graphene Oxide Frameworks (GOFs). A set of boron-related parameters for commonly used empirical force fields is determined to describe the non-bonded and bonded interactions between linear boronic acid linkers and graphene sheets of GOF materials. The transferability of the parameters is discussed and their validity is quantified by comparing quantum mechanical and molecular mechanical structural and vibrational properties. The application of the model to the dynamics of water inside the GOFs reveals significant variations in structural flexibility of GOF depending on the linker density, which is shown to be usable as a tuning parameter for desired diffusion properties.

  6. Agent-Based Literacy Theory

    ERIC Educational Resources Information Center

    McEneaney, John E.

    2006-01-01

    The purpose of this theoretical essay is to explore the limits of traditional conceptualizations of reader and text and to propose a more general theory based on the concept of a literacy agent. The proposed theoretical perspective subsumes concepts from traditional theory and aims to account for literacy online. The agent-based literacy theory…

  7. MCdevelop - a universal framework for Stochastic Simulations

    NASA Astrophysics Data System (ADS)

    Slawinska, M.; Jadach, S.

    2011-03-01

    We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. The efficient development, testing and parallel running of SS software require a convenient framework to develop software source code, deploy and monitor batch jobs, and merge and analyse results from multiple parallel jobs, even before the production runs are terminated. Throughout the years of development of stochastic simulations for HEP, a sophisticated framework featuring all the above-mentioned functionality has been implemented. MCdevelop represents its latest version, written mostly in C++ (GNU compiler gcc). It uses Autotools to build binaries (optionally managed within the KDevelop 3.5.3 Integrated Development Environment (IDE)). It uses the open-source ROOT package for histogramming, graphics and the mechanism of persistency for the C++ objects. MCdevelop helps to run multiple parallel jobs on any computer cluster with an NQS-type batch system. Program summary: Program title: MCdevelop. Catalogue identifier: AEHW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http

  8. Agent-based forward analysis

    SciTech Connect

    Kerekes, Ryan A.; Jiao, Yu; Shankar, Mallikarjun; Potok, Thomas E.; Lusk, Rick M.

    2008-01-01

    We propose software agent-based "forward analysis" for efficient information retrieval in a network of sensing devices. In our approach, processing is pushed to the data at the edge of the network via intelligent software agents rather than pulling data to a central facility for processing. The agents are deployed with a specific query and perform varying levels of analysis of the data, communicating with each other and sending only relevant information back across the network. We demonstrate our concept in the context of face recognition using a wireless test bed comprised of PDA cell phones and laptops. We show that agent-based forward analysis can provide a significant increase in retrieval speed while decreasing bandwidth usage and information overload at the central facility.

  9. A simulation framework for mapping risks in clinical processes: the case of in-patient transfers

    PubMed Central

    Ong, Mei-Sing; Westbrook, Johanna I; Magrabi, Farah; Coiera, Enrico; Wobcke, Wayne

    2011-01-01

    Objective To model how individual violations in routine clinical processes cumulatively contribute to the risk of adverse events in hospital using an agent-based simulation framework. Design An agent-based simulation was designed to model the cascade of common violations that contribute to the risk of adverse events in routine clinical processes. Clinicians and the information systems that support them were represented as a group of interacting agents using data from direct observations. The model was calibrated using data from 101 patient transfers observed in a hospital and results were validated for one of two scenarios (a misidentification scenario and an infection control scenario). Repeated simulations using the calibrated model were undertaken to create a distribution of possible process outcomes. The likelihood of end-of-chain risk is the main outcome measure, reported for each of the two scenarios. Results The simulations demonstrate end-of-chain risks of 8% and 24% for the misidentification and infection control scenarios, respectively. Over 95% of the simulations in both scenarios are unique, indicating that the in-patient transfer process diverges from prescribed work practices in a variety of ways. Conclusions The simulation allowed us to model the risk of adverse events in a clinical process, by generating the variety of possible work subject to violations, a novel prospective risk analysis method. The in-patient transfer process has a high proportion of unique trajectories, implying that risk mitigation may benefit from focusing on reducing complexity rather than augmenting the process with further rule-based protocols. PMID:21486883
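
    The prospective risk idea (accumulating violations along a chain of process steps and reading the end-of-chain risk off repeated simulation) can be sketched as follows; the step names, violation probabilities and the risk rule are placeholders, not the calibrated model.

        import random

        # Hedged Monte Carlo sketch of cascading violations in a clinical process.
        random.seed(5)
        steps = {"check wristband": 0.15, "verify chart": 0.10,
                 "confirm at handover": 0.20, "re-check on arrival": 0.25}

        def one_transfer():
            """Simulate one transfer; return True if every safeguard was violated."""
            return all(random.random() < p for p in steps.values())

        n_runs = 100_000
        risk = sum(one_transfer() for _ in range(n_runs)) / n_runs
        print(f"simulated end-of-chain risk: {risk:.3%}")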

  10. Exploring cooperation and competition using agent-based modeling

    PubMed Central

    Elliott, Euel; Kiel, L. Douglas

    2002-01-01

    Agent-based modeling enhances our capacity to model competitive and cooperative behaviors at both the individual and group levels of analysis. Models presented in these proceedings produce consistent results regarding the relative fragility of cooperative regimes among agents operating under diverse rules. These studies also show how competition and cooperation may generate change at both the group and societal level. Agent-based simulation of competitive and cooperative behaviors may reveal the greatest payoff to social science research of all agent-based modeling efforts because of the need to better understand the dynamics of these behaviors in an increasingly interconnected world. PMID:12011396

  11. Simulation framework for spatio-spectral anomalous change detection

    SciTech Connect

    Theiler, James P; Harvey, Neal R; Porter, Reid B; Wohlberg, Brendt E

    2009-01-01

    The authors describe the development of a simulation framework for anomalous change detection that considers both the spatial and spectral aspects of the imagery. A purely spectral framework has previously been introduced, but the extension to spatio-spectral requires attention to a variety of new issues, and requires more careful modeling of the anomalous changes. Using this extended framework, they evaluate the utility of spatial image processing operators to enhance change detection sensitivity in (simulated) remote sensing imagery.

  12. GIS and agent based spatial-temporal simulation modeling for assessing tourism social carrying capacity: a study on Mount Emei scenic area, China

    NASA Astrophysics Data System (ADS)

    Zhang, Renjun

    2007-06-01

    Each scenic area can sustain a specific level of tourist development and use, beyond which further development can result in socio-cultural deterioration or a decline in the quality of the experience gained by visitors. This specific level is called the carrying capacity. Social carrying capacity can be defined as the maximum level of use (in terms of numbers and activities) that can be absorbed by an area without an unacceptable decline in the quality of experience of visitors and without an unacceptable adverse impact on the society of the area. It is difficult to assess the carrying capacity because it is determined not only by the number of visitors, but also by the time, the type of recreation, the characteristics of each individual and the physical environment. The objective of this study is to build a spatial-temporal simulation model to simulate the spatial-temporal distribution of tourists. This model is a tourist spatial behaviors simulator (TSBS). Based on the TSBS, the changes in each visitor's travel patterns, such as location, cost, and other state data, are recorded in a state table. By analyzing this table, the intensity of tourist use in any area can be calculated, and changes in the quality of the tourism experience can be quantified and analyzed. Based on this micro-simulation method, the social carrying capacity can therefore be assessed more accurately, monitored proactively and managed adaptively. In this paper, the carrying capacity of the Mount Emei scenic area is analyzed as follows: the author selected crowding intensity as the monitoring indicator, on the basis that longer waiting times indicate greater crowding; the TSBS was used to simulate the spatial-temporal distribution of tourists; the average waiting time across all visitors was calculated; and the author then assessed the social carrying capacity of the Mount Emei scenic area and identified the key factors that affect it. The results show that the TSBS

  13. Semantic Extension of Agent-Based Control: The Packing Cell Case Study

    NASA Astrophysics Data System (ADS)

    Vrba, Pavel; Radakovič, Miloslav; Obitko, Marek; Mařík, Vladimír

    The paper reports on the latest R&D activities in the field of agent-based manufacturing control systems. It documents how this area is becoming strongly influenced by advancements in semantic technologies such as the Web Ontology Language. The application of ontologies provides the agents with much more effective means for handling, exchanging and reasoning about knowledge. An ontology dedicated to the semantic description of orders, production processes and material-handling tasks in the discrete manufacturing domain has been developed. In addition, a framework for integrating this ontology into distributed, agent-based control solutions is given. The Manufacturing Agent Simulation Tool (MAST) is used as the basis for a pilot implementation of the ontology-powered multi-agent control system; the packing cell environment is selected as a case study.

  14. Agent Based Modeling as an Educational Tool

    NASA Astrophysics Data System (ADS)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem-solving skills, is to have students build and study agent-based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model-based curriculum in the classroom given current and anticipated core and content standards. [Figure captions: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph over a terrain value map.]

  15. Dshell++: A Component Based, Reusable Space System Simulation Framework

    NASA Technical Reports Server (NTRS)

    Lim, Christopher S.; Jain, Abhinandan

    2009-01-01

    This paper describes the multi-mission Dshell++ simulation framework for high-fidelity, physics-based simulation of spacecraft, robotic manipulation and mobility systems. Dshell++ is a C++/Python library which uses modern script-driven object-oriented techniques to allow component reuse and a dynamic run-time interface for complex, high-fidelity simulation of spacecraft and robotic systems. The goal of the Dshell++ architecture is to manage the inherent complexity of physics-based simulations while supporting component model reuse across missions. The framework provides several features that support a large degree of simulation configurability and usability.

  16. Adding ecosystem function to agent-based land use models

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of this paper is to examine issues in the inclusion of simulations of ecosystem functions in agent-based models of land use decision-making. The reasons for incorporating these simulations include local interests in land fertility and global interests in carbon sequestration. Biogeoche...

  17. Development of an Agent-Based Model (ABM) to Simulate the Immune System and Integration of a Regression Method to Estimate the Key ABM Parameters by Fitting the Experimental Data

    PubMed Central

    Tong, Xuming; Chen, Jinghang; Miao, Hongyu; Li, Tingting; Zhang, Le

    2015-01-01

    Agent-based models (ABM) and differential equations (DE) are two commonly used methods for immune system simulation. However, it is difficult for ABM to estimate key parameters of the model by incorporating experimental data, whereas the differential equation model is incapable of describing the complicated immune system in detail. To overcome these problems, we developed an integrated ABM regression model (IABMR). It can combine the advantages of ABM and DE by employing ABM to mimic the multi-scale immune system with various phenotypes and types of cells as well as using the input and output of ABM to build up the Loess regression for key parameter estimation. Next, we employed the greedy algorithm to estimate the key parameters of the ABM with respect to the same experimental data set and used ABM to describe a 3D immune system similar to previous studies that employed the DE model. These results indicate that IABMR not only has the potential to simulate the immune system at various scales, phenotypes and cell types, but can also accurately infer the key parameters like DE model. Therefore, this study innovatively developed a complex system development mechanism that could simulate the complicated immune system in detail like ABM and validate the reliability and efficiency of model like DE by fitting the experimental data. PMID:26535589
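
    The IABMR loop can be sketched as: run the ABM at sampled parameter values, fit a smooth surrogate to the (parameter, output) pairs, then greedily pick the parameter whose prediction best matches the experimental observation. In the sketch below the toy "ABM", the nearest-neighbour smoother (standing in for the paper's Loess regression) and the target value are all assumptions.

        import random

        # Hedged sketch of surrogate-assisted parameter estimation for an ABM.
        random.seed(6)

        def toy_abm(rate):
            """Stand-in agent-based model: noisy cell count after 50 steps."""
            cells = 10.0
            for _ in range(50):
                cells += rate * cells * random.uniform(0.8, 1.2) - 0.05 * cells
            return cells

        observed = 250.0                                  # hypothetical experimental value
        samples = [(r / 100, toy_abm(r / 100)) for r in range(1, 15)]

        def surrogate(rate, k=3):
            """Local average of the k nearest sampled runs (a crude Loess stand-in)."""
            nearest = sorted(samples, key=lambda s: abs(s[0] - rate))[:k]
            return sum(out for _, out in nearest) / k

        candidates = [r / 200 for r in range(2, 29)]      # greedy scan over a fine grid
        best = min(candidates, key=lambda r: abs(surrogate(r) - observed))
        print(f"estimated proliferation rate: {best:.3f}")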

  18. Development of an Agent-Based Model (ABM) to Simulate the Immune System and Integration of a Regression Method to Estimate the Key ABM Parameters by Fitting the Experimental Data.

    PubMed

    Tong, Xuming; Chen, Jinghang; Miao, Hongyu; Li, Tingting; Zhang, Le

    2015-01-01

    Agent-based models (ABM) and differential equations (DE) are two commonly used methods for immune system simulation. However, it is difficult for an ABM to estimate key model parameters from experimental data, whereas a differential equation model cannot describe the complicated immune system in detail. To overcome these problems, we developed an integrated ABM regression model (IABMR). It combines the advantages of ABM and DE by employing an ABM to mimic the multi-scale immune system with various phenotypes and cell types, and by using the inputs and outputs of the ABM to build a Loess regression for key parameter estimation. Next, we employed a greedy algorithm to estimate the key parameters of the ABM with respect to the same experimental data set and used the ABM to describe a 3D immune system similar to previous studies that employed the DE model. These results indicate that IABMR not only has the potential to simulate the immune system across scales, phenotypes and cell types, but can also infer the key parameters as accurately as a DE model. Therefore, this study developed a complex-system modeling approach that can simulate the complicated immune system in detail, like an ABM, and can validate the model's reliability and efficiency against experimental data, like a DE model. PMID:26535589

  19. A Multiscale/Multifidelity CFD Framework for Robust Simulations

    NASA Astrophysics Data System (ADS)

    Lee, Seungjoon; Kevrekidis, Yannis; Karniadakis, George

    2015-11-01

    We develop a general CFD framework based on multifidelity simulations to target multiscale problems but also resilience in exascale simulations, where faulty processors may lead to gappy simulated fields. We combine approximation theory and domain decomposition together with machine learning techniques, e.g. co-Kriging, to estimate boundary conditions and minimize communications by performing independent parallel runs. To demonstrate this new simulation approach, we consider two benchmark problems. First, we solve the heat equation with different patches of the domain simulated by finite differences at fine resolution or very low resolution but also with Monte Carlo, hence fusing multifidelity and heterogeneous models to obtain the final answer. Second, we simulate the flow in a driven cavity by fusing finite difference solutions with solutions obtained by dissipative particle dynamics - a coarse-grained molecular dynamics method. In addition to its robustness and resilience, the new framework generalizes previous multiscale approaches (e.g. continuum-atomistic) in a unified parallel computational framework.

  20. A Simulation and Modeling Framework for Space Situational Awareness

    SciTech Connect

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.

  1. OpenMM: A Hardware Independent Framework for Molecular Simulations

    PubMed Central

    Eastman, Peter; Pande, Vijay S.

    2015-01-01

    The wide diversity of computer architectures today requires a new approach to software development. OpenMM is a framework for molecular mechanics simulations, allowing a single program to run efficiently on a variety of hardware platforms. PMID:26146490

  2. The Astrophysics Simulation Collaboratory portal: A framework for effective distributed research

    SciTech Connect

    Bondarescu, Ruxandra; Allen, Gabrielle; Daues, Gregory; Kelly, Ian; Russell, Michael; Seidel, Edward; Shalf, John; Tobias, Malcolm

    2003-03-03

    We describe the motivation, architecture, and implementation of the Astrophysics Simulation Collaboratory (ASC) portal. The ASC project provides a web-based problem solving framework for the astrophysics community that harnesses the capabilities of emerging computational grids.

  3. FACET: A simulation software framework for modeling complex societal processes and interactions

    SciTech Connect

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  4. Agent-based power sharing scheme for active hybrid power sources

    NASA Astrophysics Data System (ADS)

    Jiang, Zhenhua

    The active hybridization technique provides an effective approach to combining the best properties of a heterogeneous set of power sources to achieve higher energy density, power density and fuel efficiency. Active hybrid power sources can be used to power hybrid electric vehicles with selected combinations of internal combustion engines, fuel cells, batteries, and/or supercapacitors. They can be deployed in all-electric ships to build a distributed electric power system. They can also be used in a bulk power system to construct an autonomous distributed energy system. An important aspect in designing an active hybrid power source is to find a suitable control strategy that can manage the active power sharing and take advantage of the inherent scalability and robustness benefits of the hybrid system. This paper presents an agent-based power sharing scheme for active hybrid power sources. To demonstrate the effectiveness of the proposed agent-based power sharing scheme, simulation studies are performed for a hybrid power source that can be used in a solar car as the main propulsion power module. Simulation results clearly indicate that the agent-based control framework is effective to coordinate the various energy sources and manage the power/voltage profiles.

  5. Linking MODFLOW with an agent-based land-use model to support decision making.

    PubMed

    Reeves, Howard W; Zellner, Moira L

    2010-01-01

    The U.S. Geological Survey numerical groundwater flow model, MODFLOW, was integrated with an agent-based land-use model to yield a simulator for environmental planning studies. Ultimately, this integrated simulator will be used as a means to organize information, illustrate potential system responses, and facilitate communication within a participatory modeling framework. Initial results show the potential system response to different zoning policy scenarios in terms of the spatial patterns of development, which is referred to as urban form, and consequent impacts on groundwater levels. These results illustrate how the integrated simulator is capable of representing the complexity of the system. From a groundwater modeling perspective, the most important aspect of the integration is that the simulator generates stresses on the groundwater system within the simulation in contrast to the traditional approach that requires the user to specify the stresses through time. PMID:20132323

  6. Linking MODFLOW with an agent-based land-use model to support decision making

    USGS Publications Warehouse

    Reeves, H.W.; Zellner, M.L.

    2010-01-01

    The U.S. Geological Survey numerical groundwater flow model, MODFLOW, was integrated with an agent-based land-use model to yield a simulator for environmental planning studies. Ultimately, this integrated simulator will be used as a means to organize information, illustrate potential system responses, and facilitate communication within a participatory modeling framework. Initial results show the potential system response to different zoning policy scenarios in terms of the spatial patterns of development, which is referred to as urban form, and consequent impacts on groundwater levels. These results illustrate how the integrated simulator is capable of representing the complexity of the system. From a groundwater modeling perspective, the most important aspect of the integration is that the simulator generates stresses on the groundwater system within the simulation in contrast to the traditional approach that requires the user to specify the stresses through time. Copyright © 2010 The Author(s). Journal compilation © 2010 National Ground Water Association.

  7. Games and Simulations in Online Learning: Research and Development Frameworks

    ERIC Educational Resources Information Center

    Gibson, David; Aldrich, Clark; Prensky, Marc

    2007-01-01

    Games and Simulations in Online Learning: Research and Development Frameworks examines the potential of games and simulations in online learning, and how the future could look as developers learn to use the emerging capabilities of the Semantic Web. It presents a general understanding of how the Semantic Web will impact education and how games and…

  8. Simulation Framework for Teaching in Modeling and Simulation Areas

    ERIC Educational Resources Information Center

    De Giusti, Marisa Raquel; Lira, Ariel Jorge; Villarreal, Gonzalo Lujan

    2008-01-01

    Simulation is the process of executing a model that describes a system with enough detail; this model has its entities, an internal state, some input and output variables and a list of processes bound to these variables. Teaching a simulation language such as general purpose simulation system (GPSS) is always a challenge, because of the way it…

  9. FDPS: Framework for Developing Particle Simulators

    NASA Astrophysics Data System (ADS)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-04-01

    FDPS provides the necessary functions for efficient parallel execution of particle-based simulations as templates independent of the data structure of particles and the functional form of the interaction. It is used to develop particle-based simulation programs for large-scale distributed-memory parallel supercomputers. FDPS includes templates for domain decomposition, redistribution of particles, and gathering of particle information for interaction calculation. It uses algorithms such as the Barnes-Hut tree method for long-range interactions; methods that limit the calculation to neighbor particles are used for short-range interactions. With FDPS, the developer needs to write only a simple, sequential, unoptimized program with O(N^2) calculation cost; FDPS then produces a compiled program that runs efficiently on large-scale parallel supercomputers.
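
    As a rough illustration of the "simple, sequential, unoptimized O(N^2) program" a user supplies while FDPS handles the tree and parallel machinery: FDPS itself is a C++ template library, so the Python below is only a hypothetical stand-in for the user-written interaction kernel.

        import numpy as np

        def direct_sum_accel(pos, mass, eps=1e-3, G=1.0):
            # Naive O(N^2) softened gravitational accelerations: the user-written part.
            acc = np.zeros_like(pos)
            for i in range(len(mass)):
                d = pos - pos[i]                          # separations to all particles
                r2 = np.sum(d * d, axis=1) + eps * eps    # softened squared distances
                r2[i] = np.inf                            # exclude self-interaction
                acc[i] = G * np.sum((mass / r2**1.5)[:, None] * d, axis=0)
            return acc

        pos = np.random.default_rng(1).random((256, 3))
        mass = np.full(256, 1.0 / 256)
        print(direct_sum_accel(pos, mass).shape)          # (256, 3)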

  10. Software Framework for Advanced Power Plant Simulations

    SciTech Connect

    John Widmann; Sorin Munteanu; Aseem Jain; Pankaj Gupta; Mark Moales; Erik Ferguson; Lewis Collins; David Sloan; Woodrow Fiveland; Yi-dong Lang; Larry Biegler; Michael Locke; Simon Lingard; Jay Yun

    2010-08-01

    This report summarizes the work accomplished during the Phase II development effort of the Advanced Process Engineering Co-Simulator (APECS). The objective of the project is to develop the tools to efficiently combine high-fidelity computational fluid dynamics (CFD) models with process modeling software. During the course of the project, a robust integration controller was developed that can be used in any CAPE-OPEN compliant process modeling environment. The controller mediates the exchange of information between the process modeling software and the CFD software. Several approaches to reducing the time disparity between CFD simulations and process modeling have been investigated and implemented. These include enabling the CFD models to be run on a remote cluster and enabling multiple CFD models to be run simultaneously. Furthermore, computationally fast reduced-order models (ROMs) have been developed that can be 'trained' using the results from CFD simulations and then used directly within flowsheets. Unit operation models (both CFD and ROMs) can be uploaded to a model database and shared between multiple users.

  11. Introduction of a Framework for Dynamic Knowledge Representation of the Control Structure of Transplant Immunology: Employing the Power of Abstraction with a Solid Organ Transplant Agent-Based Model

    PubMed Central

    An, Gary

    2015-01-01

    Agent-based modeling has been used to characterize the nested control loops and non-linear dynamics associated with inflammatory and immune responses, particularly as a means of visualizing putative mechanistic hypotheses. This process is termed dynamic knowledge representation and serves a critical role in facilitating the ability to test and potentially falsify hypotheses in the current data- and hypothesis-rich biomedical research environment. Importantly, dynamic computational modeling aids in identifying useful abstractions, a fundamental scientific principle that pervades the physical sciences. Recognizing the critical scientific role of abstraction provides an intellectual and methodological counterweight to the tendency in biology to emphasize comprehensive description as the primary manifestation of biological knowledge. Transplant immunology represents yet another example of the challenge of identifying sufficient understanding of the inflammatory/immune response in order to develop and refine clinically effective interventions. Advances in immunosuppressive therapies have greatly improved solid organ transplant (SOT) outcomes, most notably by reducing and treating acute rejection. The end goal of these transplant immune strategies is to facilitate effective control of the balance between regulatory T cells and the effector/cytotoxic T-cell populations in order to generate, and ideally maintain, a tolerant phenotype. Characterizing the dynamics of immune cell populations and the interactive feedback loops that lead to graft rejection or tolerance is extremely challenging, but is necessary if rational modulation to induce transplant tolerance is to be accomplished. Herein is presented the solid organ agent-based model (SOTABM) as an initial example of an agent-based model (ABM) that abstractly reproduces the cellular and molecular components of the immune response to SOT. Despite its abstract nature, the SOTABM is able to qualitatively reproduce acute

  12. A Simulation Framework for Exploring Socioecological Dynamics and Sustainability of Settlement Systems Under Stress in Ancient Mesopotamia and Beyond

    NASA Astrophysics Data System (ADS)

    Christiansen, J. H.; Altaweel, M. R.

    2007-12-01

    The presentation will describe an object-oriented, agent-based simulation framework being used to help answer longstanding questions regarding the development trajectories and sustainability of ancient Mesopotamian settlement systems. This multidisciplinary, multi-model framework supports explicit, fine-scale representations of the dynamics of key natural processes such as crop growth, hydrology, and weather, operating concurrently with social processes such as kinship-driven behaviors, farming and herding practices, social stratification, and economic and political activities carried out by social agents that represent individual persons, households, and larger-scale organizations. The framework has allowed us to explore the inherently coupled dynamics of modeled settlements and landscapes that are undergoing diverse social and environmental stresses, both acute and chronic, across multi-generational time spans. The simulation framework was originally used to address single-settlement scenarios, but has recently been extended to begin to address settlement system sustainability issues at sub-regional to regional scale, by introducing a number of new dynamic mechanisms, such as the activities of nomadic communities, that manifest themselves at these larger spatial scales. The framework is flexible and scalable and has broad applicability. It has, for example, recently been adapted to address agroeconomic sustainability of settlement systems in modern rural Thailand, testing the resilience and vulnerability of settled landscapes in the face of such perturbations as large-scale political interventions, global economic shifts, and climate change.

  13. Applications of Agent Based Approaches in Business (A Three Essay Dissertation)

    ERIC Educational Resources Information Center

    Prawesh, Shankar

    2013-01-01

    The goal of this dissertation is to investigate the enabling role that agent based simulation plays in business and policy. The aforementioned issue has been addressed in this dissertation through three distinct, but related essays. The first essay is a literature review of different research applications of agent based simulation in various…

  14. Component Framework for Loosely Coupled High Performance Integrated Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Elwasif, W. R.; Bernholdt, D. E.; Shet, A. G.; Batchelor, D. B.; Foley, S.

    2010-11-01

    We present the design and implementation of a component-based simulation framework for the execution of coupled time-dependent plasma modeling codes. The Integrated Plasma Simulator (IPS) provides a flexible, lightweight component model that streamlines the integration of standalone codes into coupled simulations. Standalone codes are adapted to the IPS component interface specification using a thin wrapping layer implemented in the Python programming language. The framework provides services for inter-component method invocation, configuration, task and data management, asynchronous event management, simulation monitoring, and checkpoint/restart capabilities. Services are invoked, as needed, by the computational components to coordinate the execution of different aspects of coupled simulations on Massively Parallel Processing (MPP) machines. A common plasma state layer serves as the foundation for inter-component, file-based data exchange. The IPS design principles, implementation details, and execution model will be presented, along with an overview of several use cases.

  15. A Simulation Framework for Virtual Prototyping of Robotic Exoskeletons.

    PubMed

    Agarwal, Priyanshu; Neptune, Richard R; Deshpande, Ashish D

    2016-06-01

    A number of robotic exoskeletons are being developed to provide rehabilitation interventions for those with movement disabilities. We present a systematic framework that allows for virtual prototyping (i.e., design, control, and experimentation) of robotic exoskeletons. The framework merges computational musculoskeletal analyses with simulation-based design techniques, which allows for exoskeleton design and control algorithm optimization. We introduce biomechanical, morphological, and controller measures to optimize exoskeleton performance. A major advantage of the framework is that it provides a platform for carrying out hypothesis-driven virtual experiments to quantify device performance and rehabilitation progress. To illustrate the efficacy of the framework, we present a case study wherein the design and analysis of an index finger exoskeleton is carried out using the proposed framework. PMID:27018453

  16. Framework for Architecture Trade Study Using MBSE and Performance Simulation

    NASA Technical Reports Server (NTRS)

    Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas

    2012-01-01

    Increasing complexity in modern systems as well as cost and schedule constraints require a new paradigm of systems engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters), and requirements flowdown. A recent trend toward Model-Based Systems Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability, and systems engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example that includes a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements, and finally a weighted decision analysis to optimize system objectives.

  17. Next Generation Simulation Framework for Robotic and Human Space Missions

    NASA Technical Reports Server (NTRS)

    Cameron, Jonathan M.; Balaram, J.; Jain, Abhinandan; Kuo, Calvin; Lim, Christopher; Myint, Steven

    2012-01-01

    The Dartslab team at NASA's Jet Propulsion Laboratory (JPL) has a long history of developing physics-based simulations based on the Darts/Dshell simulation framework that have been used to simulate many planetary robotic missions, such as the Cassini spacecraft and the rovers that are currently driving on Mars. Recent collaboration efforts between the Dartslab team at JPL and the Mission Operations Directorate (MOD) at NASA Johnson Space Center (JSC) have led to significant enhancements to the Dartslab DSENDS (Dynamics Simulator for Entry, Descent and Surface landing) software framework. The new version of DSENDS is now being used for new planetary mission simulations at JPL. JSC is using DSENDS as the foundation for a suite of software known as COMPASS (Core Operations, Mission Planning, and Analysis Spacecraft Simulation) that is the basis for their new human space mission simulations and analysis. In this paper, we will describe the collaborative process with the JPL Dartslab and the JSC MOD team that resulted in the redesign and enhancement of the DSENDS software. We will outline the improvements in DSENDS that simplify creation of new high-fidelity robotic/spacecraft simulations. We will illustrate how DSENDS simulations are assembled and show results from several mission simulations.

  18. Adding ecosystem function to agent-based land use models

    PubMed Central

    Yadav, V.; Del Grosso, S.J.; Parton, W.J.; Malanson, G.P.

    2015-01-01

    The objective of this paper is to examine issues in the inclusion of simulations of ecosystem functions in agent-based models of land use decision-making. The reasons for incorporating these simulations include local interests in land fertility and global interests in carbon sequestration. Biogeochemical models are needed in order to calculate such fluxes. The Century model is described with particular attention to the land use choices that it can encompass. When Century is applied to a land use problem the combinatorial choices lead to a potentially unmanageable number of simulation runs. Century is also parameter-intensive. Three ways of including Century output in agent-based models, ranging from separately calculated look-up tables to agents running Century within the simulation, are presented. The latter may be most efficient, but it moves the computing costs to where they are most problematic. Concern for computing costs should not be a roadblock. PMID:26191077

  19. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    NASA Astrophysics Data System (ADS)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness, and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science, including theory, modeling, simulation, and experimentation, to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus on modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are

  20. Introducing FNCS: Framework for Network Co-Simulation

    SciTech Connect

    2014-10-23

    This video provides a basic overview of the PNNL Future Power Grid Initiative-developed Framework for Network Co-Simulation (FNCS). It discusses the increasing amounts of data coming from the power grid, and the need for a tool like FNCS that brings together data, transmission and distribution simulators. Included is a description of the FNCS architecture, and the advantages this new open source tool can bring to grid research and development efforts.

  1. Particle Tracking and Simulation on the .NET Framework

    SciTech Connect

    Nishimura, Hiroshi; Scarvie, Tom

    2006-06-19

    Particle tracking and simulation studies are becoming increasingly complex. In addition to the use of more sophisticated graphics, interactive scripting is becoming popular. Compatibility with different control systems requires network and database capabilities. It is not a trivial task to fulfill all the various requirements without sacrificing runtime performance. We evaluated the effectiveness of the .NET framework by converting a C++ simulation code to C#. Portability to other platforms is discussed in terms of Mono.

  2. Agent-based model for the h-index - exact solution

    NASA Astrophysics Data System (ADS)

    Żogała-Siudem, Barbara; Siudem, Grzegorz; Cena, Anna; Gagolewski, Marek

    2016-01-01

    Hirsch's h-index is perhaps the most popular citation-based measure of scientific excellence. In 2013, Ionescu and Chopard proposed an agent-based model describing a process for generating publications and citations in an abstract scientific community [G. Ionescu, B. Chopard, Eur. Phys. J. B 86, 426 (2013)]. Within such a framework, one may simulate a scientist's activity, and - by extension - investigate the whole community of researchers. Even though the Ionescu and Chopard model predicts the h-index quite well, the authors provided a solution based solely on simulations. In this paper, we complete their results with exact, analytic formulas. What is more, by considering a simplified version of the Ionescu-Chopard model, we obtained a compact, easy-to-compute formula for the h-index. The derived approximate and exact solutions are investigated on simulated and real-world data sets.
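
    For reference, the quantity being modeled is simple to compute from a citation record; the sketch below shows only the generic definition of the h-index, not the Ionescu-Chopard model or the exact solution derived in the paper.

        def h_index(citations):
            # Largest h such that h papers have at least h citations each.
            h = 0
            for rank, c in enumerate(sorted(citations, reverse=True), start=1):
                if c >= rank:
                    h = rank
                else:
                    break
            return h

        print(h_index([10, 8, 5, 4, 3]))   # -> 4
        print(h_index([25, 8, 5, 3, 3]))   # -> 3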

  3. Agent-based enterprise integration

    SciTech Connect

    N. M. Berry; C. M. Pancerella

    1998-12-01

    The authors are developing and deploying software agents in an enterprise information architecture such that the agents manage enterprise resources and facilitate user interaction with these resources. The enterprise agents are built on top of a robust software architecture for data exchange and tool integration across heterogeneous hardware and software. The resulting distributed multi-agent system serves as a method of enhancing enterprises in the following ways: providing users with knowledge about enterprise resources and applications; accessing the dynamically changing enterprise; locating enterprise applications and services; and improving search capabilities for applications and data. Furthermore, agents can access non-agents (i.e., databases and tools) through the enterprise framework. The ultimate target of the effort is the user; they are attempting to increase user productivity in the enterprise. This paper describes their design and early implementation and discusses the planned future work.

  4. Agent-based scheduling system to achieve agility

    NASA Astrophysics Data System (ADS)

    Akbulut, Muhtar B.; Kamarthi, Sagar V.

    2000-12-01

    Today's competitive enterprises need to design, develop, and manufacture their products rapidly and inexpensively. Agile manufacturing has emerged as a new paradigm to meet these challenges. Agility requires, among many other things, scheduling and control software systems that are flexible, robust, and adaptive. In this paper a new agent-based scheduling system (ABSS) is developed to meet the challenges of an agile manufacturing system. In the ABSS, unlike in traditional approaches, information and decision-making capabilities are distributed among the system entities, called agents. In contrast with most agent-based scheduling systems, which commonly use a bidding approach, the ABSS employs a global performance monitoring strategy. A production-rate-based global performance metric that effectively assesses system performance is developed to assist the agents' decision-making process. To test the architecture, agent-based discrete-event simulation software is developed. The experiments performed using the simulation software yielded encouraging results supporting the applicability of agent-based systems to address the scheduling and control needs of an agile manufacturing system.

  5. An example-based brain MRI simulation framework

    NASA Astrophysics Data System (ADS)

    He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L.

    2015-03-01

    The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms such as image segmentation, due to lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from the hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of the MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on the statistical model of training data. Results show that the example based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more than simulations produced by a physics-based model.
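
    A heavily simplified, hypothetical sketch of the example-based idea (learn intensity statistics per tissue label from an atlas, then synthesize an image for a new anatomical model); the actual method uses patch-based regression rather than the per-label Gaussians assumed here, and the label values and intensities below are invented.

        import numpy as np

        rng = np.random.default_rng(0)

        # Atlas: an MR image and its hard segmentation (0 = background, 1 = GM, 2 = WM).
        atlas_labels = rng.integers(0, 3, size=(64, 64))
        atlas_mri = np.array([5.0, 80.0, 120.0])[atlas_labels] + rng.normal(0, 4, (64, 64))

        # "Training": per-label intensity statistics stand in for the patch regression.
        stats = {lab: (atlas_mri[atlas_labels == lab].mean(),
                       atlas_mri[atlas_labels == lab].std())
                 for lab in np.unique(atlas_labels)}

        # Simulation: given a new anatomical model, synthesize intensities label by label.
        new_labels = rng.integers(0, 3, size=(64, 64))
        simulated = np.zeros(new_labels.shape)
        for lab, (mu, sigma) in stats.items():
            mask = new_labels == lab
            simulated[mask] = rng.normal(mu, sigma, mask.sum())
        print(round(float(simulated.mean()), 1))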

  6. Who's your neighbor? neighbor identification for agent-based modeling.

    SciTech Connect

    Macal, C. M.; Howe, T. R.; Decision and Information Sciences; Univ. of Chicago

    2006-01-01

    Agent-based modeling and simulation, based on the cellular automata paradigm, is an approach to modeling complex systems comprised of interacting autonomous agents. Open questions in agent-based simulation focus on scale-up issues encountered in simulating large numbers of agents. Specifically, how many agents can be included in a workable agent-based simulation? One of the basic tenets of agent-based modeling and simulation is that agents only interact and exchange locally available information with other agents located in their immediate proximity or neighborhood of the space in which the agents are situated. Generally, an agent's set of neighbors changes rapidly as a simulation proceeds through time and as the agents move through space. Depending on the topology defined for agent interactions, proximity may be defined by spatial distance for continuous space, adjacency for grid cells (as in cellular automata), or by connectivity in social networks. Identifying an agent's neighbors is a particularly time-consuming computational task and can dominate the computational effort in a simulation. Two challenges in agent simulation are (1) efficiently representing an agent's neighborhood and the neighbors in it and (2) efficiently identifying an agent's neighbors at any time in the simulation. These problems are addressed differently for different agent interaction topologies. While efficient approaches have been identified for agent neighborhood representation and neighbor identification for agents on a lattice with general neighborhood configurations, other techniques must be used when agents are able to move freely in space. Techniques for the analysis and representation of spatial data are applicable to the agent neighbor identification problem. This paper extends agent neighborhood simulation techniques from the lattice topology to continuous space, specifically R2. Algorithms based on hierarchical (quad trees) or non-hierarchical data structures (grid cells) are
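
    A minimal sketch of the grid-cell (binning) approach described here for agents moving freely in R2, with the cell size set equal to the interaction radius so that only the 3x3 block of surrounding cells needs to be scanned; the helper names and coordinates are hypothetical.

        import math
        from collections import defaultdict

        def build_grid(positions, cell):
            # Hash each agent id into the grid cell containing its (x, y) position.
            grid = defaultdict(list)
            for aid, (x, y) in positions.items():
                grid[(int(x // cell), int(y // cell))].append(aid)
            return grid

        def neighbors(aid, positions, grid, radius):
            # Agents within 'radius' of aid; only the 3x3 block of cells is scanned.
            x, y = positions[aid]
            cx, cy = int(x // radius), int(y // radius)   # assumes cell size == radius
            found = []
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    for other in grid.get((cx + dx, cy + dy), ()):
                        if other != aid and math.hypot(positions[other][0] - x,
                                                       positions[other][1] - y) <= radius:
                            found.append(other)
            return found

        pos = {0: (0.2, 0.3), 1: (0.4, 0.5), 2: (3.0, 3.0)}
        grid = build_grid(pos, cell=1.0)
        print(neighbors(0, pos, grid, radius=1.0))        # -> [1]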

  7. Modeling the Population Dynamics of Antibiotic-Resistant Bacteria:. AN Agent-Based Approach

    NASA Astrophysics Data System (ADS)

    Murphy, James T.; Walshe, Ray; Devocelle, Marc

    The response of bacterial populations to antibiotic treatment is often a function of a diverse range of interacting factors. In order to develop strategies to minimize the spread of antibiotic resistance in pathogenic bacteria, a sound theoretical understanding of the systems of interactions taking place within a colony must be developed. The agent-based approach to modeling bacterial populations is a useful tool for relating data obtained at the molecular and cellular level with the overall population dynamics. Here we demonstrate an agent-based model, called Micro-Gen, which has been developed to simulate the growth and development of bacterial colonies in culture. The model also incorporates biochemical rules and parameters describing the kinetic interactions of bacterial cells with antibiotic molecules. Simulations were carried out to replicate the development of methicillin-resistant S. aureus (MRSA) colonies growing in the presence of antibiotics. The model was explored to see how the properties of the system emerge from the interactions of the individual bacterial agents in order to achieve a better mechanistic understanding of the population dynamics taking place. Micro-Gen provides a good theoretical framework for investigating the effects of local environmental conditions and cellular properties on the response of bacterial populations to antibiotic exposure in the context of a simulated environment.

  8. An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures

    NASA Astrophysics Data System (ADS)

    Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.

    2009-07-01

    A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.

  9. An agent based model of genotype editing

    SciTech Connect

    Rocha, L. M.; Huang, C. F.

    2004-01-01

    This paper presents our investigation of an agent-based model of Genotype Editing. This model is based on several characteristics that are gleaned from the RNA editing system as observed in several organisms. The incorporation of editing mechanisms in an evolutionary agent-based model provides a means for evolving agents with heterogeneous post-transcriptional processes. The study of this agent-based genotype-editing model has shed some light on the evolutionary implications of RNA editing as well as established an advantageous evolutionary computation algorithm for machine learning. We expect that our proposed model may both facilitate determining the evolutionary role of RNA editing in biology and advance the current state of research in agent-based optimization.

  10. A Multi-Paradigm Modeling Framework to Simulate Dynamic Reciprocity in a Bioreactor

    PubMed Central

    Kaul, Himanshu; Cui, Zhanfeng; Ventikos, Yiannis

    2013-01-01

    Despite numerous technology advances, bioreactors are still mostly utilized as functional black-boxes where trial and error eventually leads to the desirable cellular outcome. Investigators have applied various computational approaches to understand the impact the internal dynamics of such devices has on overall cell growth, but such models cannot provide a comprehensive perspective regarding the system dynamics, due to limitations inherent to the underlying approaches. In this study, a novel multi-paradigm modeling platform capable of simulating the dynamic bidirectional relationship between cells and their microenvironment is presented. Designing the modeling platform entailed fully combining and coupling an agent-based modeling platform with a transport phenomena computational modeling framework. To demonstrate capability, the platform was used to study the impact of bioreactor parameters on the overall cell population behavior and vice versa. In order to achieve this, virtual bioreactors were constructed and seeded. The virtual cells, guided by a set of rules involving the simulated mass transport inside the bioreactor, as well as cell-related probabilistic parameters, were capable of displaying an array of behaviors such as proliferation, migration, chemotaxis and apoptosis. In this way the platform was shown to capture not only the impact of bioreactor transport processes on cellular behavior but also the influence that cellular activity wields on that very same local mass transport, thereby influencing overall cell growth. The platform was validated by simulating cellular chemotaxis in a virtual direct visualization chamber and comparing the simulation with its experimental analogue. The results presented in this paper are in agreement with published models of similar flavor. The modeling platform can be used as a concept selection tool to optimize bioreactor design specifications. PMID:23555740
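
    A toy illustration of the bidirectional coupling described above: a transport (diffusion) field influences whether cell agents divide, and the cells' consumption feeds back on that same field. The grid size, rates, and rules are invented and are not the authors' platform.

        import numpy as np

        rng = np.random.default_rng(0)
        N, D, dt = 30, 0.1, 1.0
        nutrient = np.ones((N, N))                 # normalized nutrient concentration
        cells = [(N // 2, N // 2)]                 # agent (cell) positions on the grid

        def diffuse(c):
            # Explicit finite-difference diffusion step with periodic edges (np.roll).
            lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
                   np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4 * c)
            return c + D * dt * lap

        for step in range(50):
            nutrient = diffuse(nutrient)                           # transport update
            new_cells = []
            for (i, j) in cells:
                nutrient[i, j] = max(0.0, nutrient[i, j] - 0.05)   # consumption feedback
                if nutrient[i, j] > 0.3 and rng.random() < 0.2:    # divide if well fed
                    di, dj = rng.integers(-1, 2, size=2)
                    new_cells.append(((i + di) % N, (j + dj) % N))
            cells.extend(new_cells)

        print(len(cells), round(float(nutrient.mean()), 3))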

  11. Effectiveness of dynamic rescheduling in agent-based flexible manufacturing systems

    NASA Astrophysics Data System (ADS)

    Saad, Ashraf; Biswas, Gautam; Kawamura, Kazuhiko; Johnson, Eric M.

    1997-12-01

    This work has been developed within the framework of agent-based decentralized scheduling for flexible manufacturing systems. In this framework, all workcells comprising the manufacturing system, and the products to be generated, are modeled via intelligent software agents. These agents interact dynamically using a bidding production reservation (BPRS) scheme, based on the Contract Net Protocol, to devise the production schedule for each product unit. Simulation studies of a job shop have demonstrated the gains in performance achieved by this approach over heuristic dispatching rules commonly used in industry. Manufacturing environments are also prone to operational uncertainties such as variations in processing times and machine breakdowns. In order to cope with these uncertainties, the BPRS algorithm has been extended for dynamic rescheduling to also occur in a fully decentralized manner. The resulting multi-agent rescheduling scheme results in decentralized control of flexible manufacturing systems that are capable of responding dynamically to such operational uncertainties, thereby enhancing the robustness and fault tolerance of the proposed scheduling approach. This paper also presents the effects of the proposed agent-based decentralized scheduling approach on the performance of the underlying flexible manufacturing system under a variety of production and scheduling scenarios, including forward and backward scheduling. Future directions for this work include applying the proposed scheduling approach to other advanced manufacturing areas such as agile and holonic manufacturing.
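
    A minimal announce/bid/award loop in the spirit of the Contract Net Protocol that underlies the BPRS scheme; the Workcell class, speeds, and jobs below are hypothetical stand-ins, not the paper's algorithm.

        class Workcell:
            def __init__(self, name, speed):
                self.name, self.speed, self.busy_until = name, speed, 0.0

            def bid(self, op_time, now):
                # Bid = earliest completion time this workcell can offer.
                return max(now, self.busy_until) + op_time / self.speed

        def contract_net(jobs, workcells, now=0.0):
            # Announce each operation, collect bids, and award it to the best bidder.
            schedule = []
            for job, op_time in jobs:
                winner = min(workcells, key=lambda w: w.bid(op_time, now))
                finish = winner.bid(op_time, now)
                winner.busy_until = finish                # winner reserves the slot
                schedule.append((job, winner.name, finish))
            return schedule

        cells = [Workcell("mill", 1.0), Workcell("lathe", 1.5)]
        for row in contract_net([("J1", 3.0), ("J2", 3.0), ("J3", 3.0)], cells):
            print(row)                                    # each job goes to the best bid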

  12. In-situ Data Analysis Framework for ACME Land Simulations

    NASA Astrophysics Data System (ADS)

    Wang, D.; Yao, C.; Jia, Y.; Steed, C.; Atchley, S.

    2015-12-01

    The realistic representation of key biogeophysical and biogeochemical functions is fundamental to process-based ecosystem models. Investigating the behavior of those ecosystem functions during a running model simulation can be very challenging due to the complexity of both the model and the software structure of an environmental model such as the Accelerated Climate Model for Energy (ACME) Land Model (ALM). In this research, the authors describe the urgent needs and challenges of in-situ data analysis for ALM simulations and lay out methods and strategies to meet these challenges. Specifically, an in-situ data analysis framework is designed to allow users to interactively observe biogeophysical and biogeochemical processes during an ALM simulation. The key components of this framework are an automatically instrumented ecosystem simulation, in-situ data communication, and a large-scale data exploration toolkit. This effort is developed by leveraging several active projects, including a scientific unit-testing platform, a common communication interface, and an extreme-scale data exploration toolkit. The authors believe that, based on advanced computing technologies such as compiler-based software system analysis, automatic code instrumentation, and in-memory data transport, this software system provides not only much-needed capability for real-time observation and in-situ data analytics of environmental model simulations, but also the potential for in-situ model behavior adjustment via simulation steering.

  13. An Exploration into the Uses of Agent-Based Modeling to Improve Quality of Healthcare

    NASA Astrophysics Data System (ADS)

    Kanagarajah, Ashok Kay; Lindsay, Peter; Miller, Anne; Parker, David

    Healthcare is a complex adaptive system. This paper discusses healthcare in the context of complex systems architecture and an agent-based modeling framework. The paper demonstrates the complications of healthcare system improvement and its impact on patient safety, economics, and workloads. Further, an application of the safety dynamics model proposed by Cook and Rasmussen is explored using a hypothetical simulation of an emergency department. By means of simulation, this paper demonstrates the nonlinear behaviors of a health service unit and its complexities, and how the safety dynamics model may be used to evaluate various aspects of healthcare. Further work is required to apply this concept in a 'real life environment' and to assess its consequences at the societal, organizational, and operational levels of healthcare.

  14. An Exploration into the Uses of Agent-Based Modeling to Improve Quality of Healthcare

    NASA Astrophysics Data System (ADS)

    Kanagarajah, Ashok Kay; Lindsay, Peter; Miller, Anne; Parker, David

    Healthcare is a complex adaptive system. This paper discusses healthcare in the context of complex systems architecture and an agent-based modeling framework. The paper demonstrates the complications of healthcare system improvement and its impact on patient safety, economics, and workloads. Further, an application of the safety dynamics model proposed by Cook and Rasmussen is explored using a hypothetical simulation of an emergency department. By means of simulation, this paper demonstrates the nonlinear behaviors of a health service unit and its complexities, and how the safety dynamics model may be used to evaluate various aspects of healthcare. Further work is required to apply this concept in a 'real life environment' and to assess its consequences at the societal, organizational, and operational levels of healthcare.

  15. Health care supply networks in tightly and loosely coupled structures: exploration using agent-based modelling

    NASA Astrophysics Data System (ADS)

    Kanagarajah, A.; Parker, D.; Xu, H.

    2010-03-01

    Health care supply networks are multi-faceted complex structures. This article discusses the architecture of complex systems and an agent-based modelling framework to study health care supply networks and their impact on patient safety, economics, and workloads. Here we demonstrate the application of the safety dynamics model proposed by Cook and Rasmussen (2005, '"Going Solid": A Model of System Dynamics and Consequences for Patient Safety', Quality & Safety in Health Care, 14, 67-84) to study a health care system, using a hypothetical simulation of an emergency department as a representative unit and its dynamic behaviour. By means of simulation, this article demonstrates the non-linear behaviours of a health service unit and its complexities, and how the safety dynamics model may be used to evaluate the various policy and design aspects of health care supply networks.

  16. Symphony: a framework for accurate and holistic WSN simulation.

    PubMed

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144

  17. Symphony: A Framework for Accurate and Holistic WSN Simulation

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2015-01-01

    Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144

  18. Etomica: an object-oriented framework for molecular simulation.

    PubMed

    Schultz, Andrew J; Kofke, David A

    2015-03-30

    We describe the design of an object-oriented library of software components that are suitable for constructing simulations of systems of interacting particles. The emphasis of the discussion is on the general design of the components and how they interact, and less on details of the programming interface or its implementation. Example code is provided as an aid to understanding object-oriented programming structures and to demonstrate how the framework is applied. PMID:25565378

  19. Velo: A Knowledge Management Framework for Modeling and Simulation

    SciTech Connect

    Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Lansing, Carina S.; Madison, Michael C.; Schuchardt, Karen L.; Liu, Yan

    2012-03-01

    Modern scientific enterprises are inherently knowledge-intensive. Scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data to create inputs for large-scale computational simulations. The results of these simulations are then analyzed, leading to refinements of inputs and models and additional simulations. The results of this process must be managed and archived to provide justifications for regulatory decisions and publications that are based on the models. In this paper we introduce our Velo framework that is designed as a reusable, domain-independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates and extends open source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. We describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate realizations of Velo, we describe examples from two deployed sites for carbon sequestration and climate modeling. These provide concrete examples of the inherent extensibility and utility of our approach.

  20. An Agent-Based Model of Farmer Decision Making in Jordan

    NASA Astrophysics Data System (ADS)

    Selby, Philip; Medellin-Azuara, Josue; Harou, Julien; Klassert, Christian; Yoon, Jim

    2016-04-01

    We describe an agent-based hydro-economic model of groundwater-irrigated agriculture in the Jordan Highlands. The model employs a Multi-Agent Simulation (MAS) framework and is designed to evaluate direct and indirect outcomes of climate change scenarios and policy interventions on farmer decision making, including annual land use, groundwater use for irrigation, and water sales to a water tanker market. Land-use and water-use decisions are simulated for groups of farms defined by location and by their behavioural and economic similarities. Decreasing groundwater levels, and the associated increase in pumping costs, are important drivers of change within Jordan's agricultural sector. We describe how this is accounted for by coupling the agricultural and groundwater models. The agricultural production model employs Positive Mathematical Programming (PMP), a method for calibrating agricultural production functions to observed planted areas. PMP has been used successfully with disaggregated models for policy analysis. We adapt the PMP approach to allow explicit evaluation of the impact of pumping costs, groundwater purchase fees, and a water tanker market. The work demonstrates the applicability of agent-based assessment of agricultural decision making in the Jordan Highlands and its integration with agricultural model calibration methods. The proposed approach is designed and implemented in software such that it could be used to evaluate a variety of physical and human influences on decision making in agricultural water management.

  1. Framework Application for Core Edge Transport Simulation (FACETS)

    SciTech Connect

    Krasheninnikov, Sergei; Pigarov, Alexander

    2011-10-15

    The FACETS (Framework Application for Core-Edge Transport Simulations) project of the Scientific Discovery through Advanced Computing (SciDAC) Program was aimed at providing high-fidelity whole-tokamak modeling for the U.S. magnetic fusion energy program and ITER by coupling separate components for the core region, edge region, and wall, with realistic plasma particle and power sources and turbulent transport simulation. The project also aimed at developing advanced numerical algorithms, efficient implicit coupling methods, and software tools utilizing the leadership-class computing facilities under Advanced Scientific Computing Research (ASCR). The FACETS project was conducted by a multi-disciplinary, multi-institutional team; the lead PI was J.R. Cary (Tech-X Corp.). In the FACETS project, the Applied Plasma Theory Group at the MAE Department of UCSD developed the Wall and Plasma-Surface Interaction (WALLPSI) module, performed its validation against experimental data, and integrated it into the developed framework. WALLPSI is a one-dimensional, coarse-grained, reaction/advection/diffusion code applied to each material boundary cell in the common modeling domain for a tokamak. It incorporates an advanced model for plasma particle transport and retention in the solid matter of plasma-facing components, simulation of plasma heat power load handling, calculation of erosion/deposition, and simulation of synergistic effects in strong plasma-wall coupling.

  2. A hybrid parallel framework for the cellular Potts model simulations

    SciTech Connect

    Jiang, Yi; He, Kejing; Dong, Shoubin

    2009-01-01

    The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximated, and cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for large-scale simulations (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
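
    As a purely illustrative, serial companion to this record (not the paper's parallel code), the sketch below implements the core CPM Monte Carlo lattice update on a small 2D grid: adhesion energy plus an area constraint, with Metropolis acceptance. All parameter values are arbitrary.

      # Minimal serial 2D Cellular Potts Model sketch (toy parameters, not the paper's code).
      import numpy as np

      rng   = np.random.default_rng(0)
      L     = 50        # lattice size
      J     = 2.0       # adhesion energy between unlike spins
      lam   = 1.0       # strength of the area constraint
      A_tgt = 40.0      # target cell area
      T     = 5.0       # simulation temperature
      moves = ((1, 0), (-1, 0), (0, 1), (0, -1))

      # Initialise a few square "cells" (non-zero spins) on a medium (spin 0).
      grid, cid = np.zeros((L, L), dtype=int), 1
      for i in range(5, 45, 10):
          for j in range(5, 45, 10):
              grid[i:i + 6, j:j + 6] = cid
              cid += 1

      def adhesion(i, j, sigma):
          """Adhesion energy of site (i, j) if it held spin sigma."""
          return sum(J for di, dj in moves if grid[(i + di) % L, (j + dj) % L] != sigma)

      def mcs():
          """One Monte Carlo step: L*L copy attempts with Metropolis acceptance."""
          for _ in range(L * L):
              i, j = rng.integers(L, size=2)
              di, dj = moves[rng.integers(4)]
              src, tgt = grid[(i + di) % L, (j + dj) % L], grid[i, j]
              if src == tgt:
                  continue
              dH = adhesion(i, j, src) - adhesion(i, j, tgt)
              for s, sign in ((src, +1), (tgt, -1)):       # area-constraint contribution
                  if s != 0:
                      a = np.count_nonzero(grid == s)      # (a real code would cache areas)
                      dH += lam * ((a + sign - A_tgt) ** 2 - (a - A_tgt) ** 2)
              if dH <= 0 or rng.random() < np.exp(-dH / T):
                  grid[i, j] = src

      for _ in range(20):
          mcs()
      n_cells = len(np.unique(grid)) - 1
      print(f"cells: {n_cells}, mean area: {np.count_nonzero(grid) / max(n_cells, 1):.1f}")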

  3. LIPID11: a modular framework for lipid simulations using amber.

    PubMed

    Skjevik, Åge A; Madej, Benjamin D; Walker, Ross C; Teigen, Knut

    2012-09-13

    Accurate simulation of complex lipid bilayers has long been a goal in condensed phase molecular dynamics (MD). Structure and function of membrane-bound proteins are highly dependent on the lipid bilayer environment and are challenging to study through experimental methods. Within Amber, there has been limited focus on lipid simulations, although some success has been seen with the use of the General Amber Force Field (GAFF). However, to date there are no dedicated Amber lipid force fields. In this paper we describe a new charge derivation strategy for lipids consistent with the Amber RESP approach and a new atom and residue naming and type convention. In the first instance, we have combined this approach with GAFF parameters. The result is LIPID11, a flexible, modular framework for the simulation of lipids that is fully compatible with the existing Amber force fields. The charge derivation procedure, capping strategy, and nomenclature for LIPID11, along with preliminary simulation results and a discussion of the planned long-term parameter development are presented here. Our findings suggest that LIPID11 is a modular framework feasible for phospholipids and a flexible starting point for the development of a comprehensive, Amber-compatible lipid force field. PMID:22916730

  4. A theoretical framework for simulation in nursing: answering Schiavenato's call.

    PubMed

    Harris, Kevin; Eccles, David W; Ward, Paul; Whyte, James

    2013-01-01

    The aim of this article was to provide a response that supports and extends Schiavenato's call for a theoretically guided approach to simulation use in nursing education. We propose that a theoretical framework for simulation in nursing must first include, as a basis, a theoretical understanding of human performance and how it is enhanced. This understanding will, in turn, allow theorists to provide a framework regarding the utility, application, and design of the training environment, including internal and external validity. The expert performance approach, a technique that has recently been termed Expert-Performance-based Training (ExPerT), is introduced as a guiding framework for addressing these training needs. We also describe how the theory of deliberate practice within the framework of ExPerT can be useful for developing effective training methods in health care domains and highlight examples of how deliberate practice has been successfully applied to the training of psychomotor and cognitive skills. PMID:23393661

  5. Seawater Pervaporation through Zeolitic Imidazolate Framework Membranes: Atomistic Simulation Study.

    PubMed

    Gupta, Krishna M; Qiao, Zhiwei; Zhang, Kang; Jiang, Jianwen

    2016-06-01

    An atomistic simulation study is reported for seawater pervaporation through five zeolitic imidazolate framework (ZIF) membranes including ZIF-8, -93, -95, -97, and -100. Salt rejection in the five ZIFs is predicted to be 100%. With the largest aperture, ZIF-100 possesses the highest water permeability of 5 × 10^-4 kg m/(m^2 h bar), which is substantially higher than that of commercial reverse osmosis membranes, as well as zeolite and graphene oxide pervaporation membranes. In ZIF-8, -93, -95, and -97, which have similar aperture sizes, water flux is governed by framework hydrophobicity/hydrophilicity; in hydrophobic ZIF-8 and -95, water flux is higher than in hydrophilic ZIF-93 and -97. Furthermore, water molecules in ZIF-93 move slowly and remain in the membrane for a long time but undergo to-and-fro motion in ZIF-100. The lifetime of hydrogen bonds in ZIF-93 is found to be longer than in ZIF-100. This simulation study quantitatively elucidates the dynamic and structural properties of water in ZIF membranes, identifies the key governing factors (aperture size and framework hydrophobicity/hydrophilicity), and suggests that ZIF-100 is an intriguing membrane for seawater pervaporation. PMID:27195441

  6. A framework for control simulations using the TRANSP code

    NASA Astrophysics Data System (ADS)

    Boyer, Mark D.; Andre, Rob; Gates, David; Gerhardt, Stefan; Goumiri, Imene; Menard, Jon

    2014-10-01

    The high-performance operational goals of present-day and future tokamaks will require development of advanced feedback control algorithms. Though reduced models are often used for initial designs, it is important to study the performance of control schemes with integrated models prior to experimental implementation. To this end, a flexible framework for closed loop simulations within the TRANSP code is being developed. The framework exploits many of the predictive capabilities of TRANSP and provides a means for performing control calculations based on user-supplied data (controller matrices, target waveforms, etc.). These calculations, along with the acquisition of ``real-time'' measurements and manipulation of TRANSP internal variables based on actuator requests, are implemented through a hook that allows custom run-specific code to be inserted into the standard TRANSP source code. As part of the framework, a module has been created to constrain the thermal stored energy in TRANSP using a confinement scaling expression. Progress towards feedback control of the current profile on NSTX-U will be presented to demonstrate the framework. Supported in part by an appointment to the U.S. Department of Energy Fusion Energy Postdoctoral Research Program administered by the Oak Ridge Institute for Science and Education.
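
    As a hedged illustration of the kind of calculation such a closed-loop framework enables (not the actual TRANSP hook or the NSTX-U controller), the sketch below closes a simple PI loop around a zero-dimensional stored-energy balance, with the confinement time supplied by a generic scaling expression; all constants and the scaling itself are invented.

      # Toy closed-loop stored-energy control against a confinement-scaling constraint.
      import numpy as np

      def tau_E(P_heat):
          """Placeholder confinement-time scaling in seconds (invented constants)."""
          return 0.15 * max(P_heat, 0.1) ** -0.5      # P_heat in MW

      dt, t_end    = 1e-3, 2.0      # s
      W_target     = 0.25           # MJ target stored energy
      Kp, Ki       = 30.0, 60.0     # PI gains
      P_min, P_max = 0.5, 8.0       # actuator limits, MW

      W, int_e = 0.05, 0.0
      for step in range(int(t_end / dt)):
          e = W_target - W                                        # "real-time" measurement error
          int_e += e * dt
          P = float(np.clip(Kp * e + Ki * int_e, P_min, P_max))   # actuator request
          W += dt * (P - W / tau_E(P))                            # 0-D balance dW/dt = P - W/tau_E
      print(f"final W = {W:.3f} MJ (target {W_target} MJ), final heating P = {P:.2f} MW")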

  7. A Fast Iterated Orthogonal Projection Framework for Smoke Simulation.

    PubMed

    Yang, Yang; Yang, Xubo; Yang, Shuangcai

    2016-05-01

    We present a fast iterated orthogonal projection (IOP) framework for smoke simulations. By modifying the IOP framework with a different means of convergence, our framework significantly reduces the number of iterations required to converge to the desired precision. Our new iteration framework adds a divergence redistributor component to IOP that improves the impeded convergence logic of IOP. We tested Jacobi, Gauss-Seidel (GS), and SOR as divergence redistributors and used the multigrid scheme to generate a highly efficient Poisson solver. It provides a rapid convergence rate and requires less computation time. In all of our experiments, our method requires only 2-3 iterations to satisfy a convergence condition of 1e-5 and 5-7 iterations for 1e-10. Compared with the commonly used Incomplete Cholesky Preconditioned Conjugate Gradient (ICPCG) solver, our Poisson solver accelerates the overall computation by approximately 7- to 30-fold for grids ranging from 128^3 to 256^3. Our solver accelerates more on larger grids because the iteration count required to satisfy the convergence condition is independent of the problem size. We use various experimental scenes and settings to demonstrate the efficiency of our method. In addition, we present a feasible method for both IOP and our fast IOP to support free surfaces. PMID:27045907
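
    As a toy stand-in for the kind of inner kernel discussed here (not the authors' solver, and using plain Jacobi rather than their IOP scheme), the sketch below iterates a Jacobi sweep on the pressure Poisson equation over a small periodic 2D grid until the residual falls below a tolerance.

      # Toy Jacobi solver for the pressure Poisson equation  laplacian(p) = d.
      import numpy as np

      n, h = 64, 1.0 / 64
      rng  = np.random.default_rng(1)
      d    = rng.standard_normal((n, n))
      d   -= d.mean()                       # zero-mean source (periodic solvability)
      p    = np.zeros((n, n))

      def jacobi_sweep(p, d):
          """One Jacobi update: each cell becomes the neighbour average minus the scaled source."""
          nb = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
                np.roll(p, 1, 1) + np.roll(p, -1, 1))
          return (nb - h * h * d) / 4.0

      def residual(p, d):
          lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
                 np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4 * p) / (h * h)
          return np.max(np.abs(lap - d))

      tol, it = 1e-5, 0
      while residual(p, d) > tol and it < 100000:
          p = jacobi_sweep(p, d)
          it += 1
      print(f"converged in {it} Jacobi sweeps, residual = {residual(p, d):.2e}")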

  8. A Run-Time Verification Framework for Smart Grid Applications Implemented on Simulation Frameworks

    SciTech Connect

    Ciraci, Selim; Sozer, Hasan; Tekinerdogan, Bedir

    2013-05-18

    Smart grid applications are implemented and tested with simulation frameworks, as the developers usually do not have access to large sensor networks to be used as a test bed. The developers are forced to map the implementation onto these frameworks, which results in a deviation between the architecture and the code. In turn, this deviation makes it hard to verify behavioral constraints that are described at the architectural level. We have developed the ConArch toolset to support the automated verification of architecture-level behavioral constraints. A key feature of ConArch is the programmable mapping from the architecture to the implementation. Here, developers implement queries to identify the points in the target program that correspond to architectural interactions. ConArch generates run-time observers that monitor the flow of execution between these points and verifies whether this flow conforms to the behavioral constraints. We illustrate how the programmable mappings can be exploited for verifying behavioral constraints of a smart grid application that is implemented with two simulation frameworks.

  9. A modeling and simulation framework for electrokinetic nanoparticle treatment

    NASA Astrophysics Data System (ADS)

    Phillips, James

    2011-12-01

    The focus of this research is to model and provide a simulation framework for the packing of differently sized spheres within a hard boundary. The novel contributions of this dissertation are the cylinders of influence (COI) method and the sectoring method implementations. The impetus for this research stems from modeling electrokinetic nanoparticle (EN) treatment, which packs concrete pores with differently sized nanoparticles. We show an improved simulation speed compared to previously published results of EN treatment simulation while obtaining similar porosity reduction results. We focused mainly on the readily and commercially available particle sizes of 2 nm and 20 nm, but have the capability to model other sizes. Our simulation has graphical capabilities and can provide additional data unobtainable from physical experimentation. The data collected has a median of 0.5750 and a mean of 0.5504. The standard error is 0.0054 at alpha = 0.05 for a 95% confidence interval of 0.5504 +/- 0.0054. The simulation has produced maximum packing densities of 65% and minimum packing densities of 34%. Simulation data are analyzed using linear regression via the R statistical language to obtain two equations: one that describes porosity reduction based on all cylinder and particle characteristics, and another that focuses on describing porosity reduction based on cylinder diameter for 2 and 20 nm particles into pores of 100 nm height. Simulation results are similar to most physical results obtained from MIP and WLR. Some MIP results do not fall within the simulation limits; however, this is expected as MIP has been documented to be an inaccurate measure of pore distribution and porosity of concrete. Despite the disagreement between WLR and MIP, there is a trend that porosity reduction is higher two inches from the rebar as compared to the rebar-concrete interface. The simulation also detects a higher porosity reduction further from the rebar. This may be due to particles

  10. The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation

    SciTech Connect

    Foley, Samantha S; Elwasif, Wael R; Bernholdt, David E

    2011-11-01

    High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort, therefore research in coupled multiphysics often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level ``driver'' component, or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.
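
    The IPS API itself is not reproduced here; the short sketch below only illustrates, in generic Python, the loosely coupled component pattern the abstract describes: components registered with a framework services object and sequenced by a high-level driver. All component and method names are invented.

      # Generic sketch of a loosely coupled component framework (invented names).
      class Services:
          """Shared state and component registry standing in for framework services."""
          def __init__(self):
              self.state = {"time": 0.0, "profiles": {"T_e": 1.0}}
              self.components = {}

          def register(self, name, comp):
              self.components[name] = comp

          def call(self, name, method, *args):
              return getattr(self.components[name], method)(self, *args)

      class HeatingComponent:
          def step(self, services, dt):
              services.state["profiles"]["T_e"] += 0.1 * dt     # toy physics

      class TransportComponent:
          def step(self, services, dt):
              services.state["profiles"]["T_e"] *= 1.0 - 0.02 * dt

      class Driver:
          """High-level driver that sequences the physics components."""
          def run(self, services, t_end, dt):
              while services.state["time"] < t_end:
                  services.call("heating", "step", dt)
                  services.call("transport", "step", dt)
                  services.state["time"] += dt

      services = Services()
      services.register("heating", HeatingComponent())
      services.register("transport", TransportComponent())
      Driver().run(services, t_end=1.0, dt=0.1)
      print(services.state)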

  11. Faunus: An object oriented framework for molecular simulation

    PubMed Central

    Lund, Mikael; Trulsson, Martin; Persson, Björn

    2008-01-01

    Background We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented by extensive use of code comments that – subsequently – are collected to automatically build a web-based manual. Results We show how an object oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high level coding abstraction. Conclusion C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained. PMID:18241331

  12. Agent-based models of financial markets

    NASA Astrophysics Data System (ADS)

    Samanidou, E.; Zschischang, E.; Stauffer, D.; Lux, T.

    2007-03-01

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in the economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations for) the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont
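
    To make the flavour of such models concrete, here is a deliberately minimal toy (not any of the named models): agents hold a +1/-1 position, occasionally imitate a randomly chosen peer, and the return follows the change in aggregate demand; the resulting return series can then be checked against stylized facts such as excess kurtosis. All parameters are arbitrary.

      # Minimal herding toy model of a financial market (illustrative only).
      import numpy as np

      rng     = np.random.default_rng(7)
      N, T    = 500, 5000
      p_imit  = 0.85                  # probability an updating agent imitates a peer
      impact  = 0.05                  # price impact of a change in aggregate demand
      opinion = rng.choice([-1, 1], size=N)

      returns = np.empty(T)
      for t in range(T):
          prev = opinion.mean()
          for _ in range(N // 10):                       # a few asynchronous updates per step
              i = rng.integers(N)
              if rng.random() < p_imit:
                  opinion[i] = opinion[rng.integers(N)]  # herd: copy another agent's position
              else:
                  opinion[i] = rng.choice([-1, 1])       # idiosyncratic switch
          returns[t] = impact * (opinion.mean() - prev)  # return follows the demand change

      excess_kurtosis = np.mean((returns - returns.mean()) ** 4) / np.var(returns) ** 2 - 3.0
      print(f"return std = {returns.std():.4f}, excess kurtosis = {excess_kurtosis:.2f}")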

  13. A wind turbine hybrid simulation framework considering aeroelastic effects

    NASA Astrophysics Data System (ADS)

    Song, Wei; Su, Weihua

    2015-04-01

    In performing an effective structural analysis for a wind turbine, the simulation of turbine aerodynamic loads is of great importance. The interaction between the wake flow and the blades may impact turbine blade loading conditions, energy yield and operational behavior. Direct experimental measurement of the wind flow field and wind profiles around wind turbines is very helpful to support wind turbine design. However, with the growth in the size of wind turbines for higher energy output, it is not convenient to obtain all the desired data in wind-tunnel and field tests. In this paper, the modeling of the dynamic responses of large-span wind turbine blades will first consider nonlinear aeroelastic effects. A strain-based geometrically nonlinear beam formulation will be used for the basic structural dynamic modeling, which will be coupled with unsteady aerodynamic equations and rigid-body rotations of the rotor. Full wind turbines can be modeled by using multi-connected beams. Then, a hybrid simulation experimental framework is proposed to potentially address this issue. The aerodynamics-dominated components, such as the turbine blades and rotor, are simulated as numerical components using the nonlinear aeroelastic model, while the turbine tower, where collapse or failure may occur under high levels of wind load, is simulated separately as the physical component. With the proposed framework, the dynamic behavior of NREL's 5 MW wind turbine blades will be studied and correlated with available numerical data. The current work will be the basis of the authors' further studies on flow control and hazard mitigation for wind turbine blades and towers.

  14. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation.

    PubMed

    Mangado, Nerea; Ceresa, Mario; Duchateau, Nicolas; Kjer, Hans Martin; Vera, Sergio; Dejea Velardo, Hector; Mistrik, Pavel; Paulsen, Rasmus R; Fagertun, Jens; Noailly, Jérôme; Piella, Gemma; González Ballester, Miguel Ángel

    2016-08-01

    Recent developments in computational modeling of cochlear implantation are promising for studying in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address this challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlear anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns constitutive parameters to all components of the finite element model. This model can then be used to study in silico the effects of the electrical stimulation of the cochlear implant. Results are shown for a total of 25 patient models. In all cases, a final mesh suitable for finite element simulations was obtained, in an average time of 94 s. The framework has proven to be fast and robust, and is promising for a detailed prognosis of the cochlear implantation surgery. PMID:26715210

  15. Interactive agent based modeling of public health decision-making.

    PubMed

    Parks, Amanda L; Walker, Brett; Pettey, Warren; Benuzillo, Jose; Gesteland, Per; Grant, Juliana; Koopman, James; Drews, Frank; Samore, Matthew

    2009-01-01

    Agent-based models have yielded important insights regarding the transmission dynamics of communicable diseases. To better understand how these models can be used to study the decision making of public health officials, we developed a computer program that linked an agent-based model of pertussis with an agent-based model of public health management. The program, which we call the Public Health Interactive Model & simulation (PHIMs), encompassed the reporting of cases to public health, case investigation, and public health response. The user directly interacted with the model in the role of the public health decision-maker. In this paper we describe the design of our model and present the results of a pilot study to assess its usability and potential for future development. Affinity for specific tools was demonstrated. Participants ranked the program high in usability and considered it useful for training. Our ultimate goal is to achieve better public health decisions and outcomes through the use of public health decision support tools. PMID:20351907

  16. HYPERS: A Unidimensional Asynchronous Framework for Multiscale Hybrid Simulations

    NASA Astrophysics Data System (ADS)

    Omelchenko, Y. A.; Karimabadi, H.; Vu, H. X.

    2011-12-01

    Kinetic ion-driven processes are crucial for understanding the complex dynamics of the closely coupled Earth magnetosphere-ionosphere system. Largely varying time and length scales impose severe numerical constraints on global simulations with hybrid (particle ions + fluid electrons) codes. To enable larger simulations, we developed a unique, uni-dimensional multiscale hybrid code, HYPERS. Instead of stepping all simulation variables uniformly in time, HYPERS tracks physically meaningful changes to individual particles and cell-based electromagnetic fields via asynchronous discrete events. HYPERS has been parallelized with the Preemptive Event Processing (PEP) technique. The parallel algorithm enables arbitrary domain decompositions and processor configurations on restarts. This is a critical prerequisite for implementing full load-balancing functionality in the future. We validate HYPERS by simulating the interaction of streaming plasmas with dipole magnetospheres and show that our approach results in superior numerical metrics (stability, accuracy and speed) compared to conventional techniques. We also discuss further extensions to the HYPERS framework that would enable seamless integration of ion fluid and kinetic schemes.
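
    The essential idea of asynchronous, event-driven time stepping can be illustrated independently of HYPERS itself: the toy sketch below advances a set of "cells" with widely different local time scales by always processing the one whose next update is earliest (a priority queue), rather than forcing a single global time step. All names and rates are invented.

      # Toy asynchronous (discrete-event) time stepping with a priority queue.
      import heapq
      import numpy as np

      rng     = np.random.default_rng(3)
      n_cells = 8
      value   = np.ones(n_cells)                      # a local field value per "cell"
      rate    = 10.0 ** rng.uniform(-1, 2, n_cells)   # local decay rates spanning three decades

      def local_dt(i):
          """Local step set by the cell's own stiffness (a CFL-like criterion)."""
          return 0.1 / rate[i]

      t_end  = 1.0
      events = [(local_dt(i), i) for i in range(n_cells)]   # (next update time, cell index)
      heapq.heapify(events)
      n_updates = 0

      while events:
          t, i = heapq.heappop(events)                # always process the earliest pending event
          if t > t_end:
              break                                   # every remaining event is later still
          value[i] *= np.exp(-rate[i] * local_dt(i))  # advance only this cell by its own step
          n_updates += 1
          heapq.heappush(events, (t + local_dt(i), i))

      uniform = n_cells * int(t_end / min(local_dt(i) for i in range(n_cells)))
      print(f"asynchronous updates: {n_updates}, equivalent uniform-step updates: {uniform}")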

  17. A Virtual Engineering Framework for Simulating Advanced Power System

    SciTech Connect

    Mike Bockelie; Dave Swensen; Martin Denison; Stanislav Borodai

    2008-06-18

    This report describes the work performed to provide NETL with VE-Suite-based virtual engineering software and enhanced equipment models to support NETL's Advanced Process Engineering Co-simulation (APECS) framework for advanced power generation systems. Enhancements to the software framework facilitated an important link between APECS and the virtual engineering capabilities provided by VE-Suite (e.g., equipment and process visualization, information assimilation). Model enhancements focused on improving predictions for the performance of entrained flow coal gasifiers and important auxiliary equipment (e.g., Air Separation Units) used in coal gasification systems. In addition, a Reduced Order Model generation tool and software to provide a coupling between APECS/AspenPlus and the GE GateCycle simulation system were developed. CAPE-Open model interfaces were employed where needed. The improved simulation capability is demonstrated on selected test problems. As part of the project, an Advisory Panel was formed to provide guidance on the issues on which to focus the work effort. The Advisory Panel included experts from industry and academia in gasification, CO2 capture issues, and process simulation, as well as representatives from technology developers and the electric utility industry. To optimize the benefit to NETL, REI coordinated its efforts with NETL and NETL-funded projects at Iowa State University, Carnegie Mellon University and ANSYS/Fluent, Inc. The improved simulation capabilities incorporated into APECS will enable researchers and engineers to better understand the interactions of different equipment components, identify weaknesses and processes needing improvement, and thereby allow more efficient, less expensive plants to be developed and brought on-line faster and in a more cost-effective manner. These enhancements to APECS represent an important step toward having a fully integrated environment for performing plant simulation and engineering

  18. Agent Based Modeling of Air Carrier Behavior for Evaluation of Technology Equipage and Adoption

    NASA Technical Reports Server (NTRS)

    Horio, Brant M.; DeCicco, Anthony H.; Stouffer, Virginia L.; Hasan, Shahab; Rosenbaum, Rebecca L.; Smith, Jeremy C.

    2014-01-01

    As part of ongoing research, the National Aeronautics and Space Administration (NASA) and LMI developed a research framework to assist policymakers in identifying impacts on the U.S. air transportation system (ATS) of potential policies and technology related to the implementation of the Next Generation Air Transportation System (NextGen). This framework, called the Air Transportation System Evolutionary Simulation (ATS-EVOS), integrates multiple models into a single process flow to best simulate responses by U.S. commercial airlines and other ATS stakeholders to NextGen-related policies, and in turn, how those responses impact the ATS. Development of this framework required NASA and LMI to create an agent-based model of airline and passenger behavior. This Airline Evolutionary Simulation (AIRLINE-EVOS) models airline decisions about tactical airfare and schedule adjustments, and strategic decisions related to fleet assignments, market prices, and equipage. AIRLINE-EVOS models its own heterogeneous population of passenger agents that interact with airlines; this interaction allows the model to simulate the cycle of action and reaction as airlines compete with each other and engage passengers. We validated a baseline configuration of AIRLINE-EVOS against Airline Origin and Destination Survey (DB1B) data and subject matter expert opinion, and we verified the ATS-EVOS framework and agent behavior logic through scenario-based experiments. These experiments demonstrated AIRLINE-EVOS's capabilities in responding to a fuel-price shock and to equipage challenges in a series of analyses based on potential incentive policies for best-equipped-best-served, optimal-wind routing, and traffic management initiative exemption concepts.

  20. Agent-Based Deterministic Modeling of the Bone Marrow Homeostasis.

    PubMed

    Kurhekar, Manish; Deshpande, Umesh

    2016-01-01

    Modeling of stem cells not only describes but also predicts how a stem cell's environment can control its fate. The first stem cell populations discovered were hematopoietic stem cells (HSCs). In this paper, we present a deterministic model of bone marrow (which hosts HSCs) that is consistent with several of the qualitative biological observations. This model incorporates stem cell death (apoptosis) after a certain number of cell divisions and also demonstrates that a single HSC can potentially populate the entire bone marrow. It also demonstrates that a sufficient number of differentiated cells (RBCs, WBCs, etc.) is produced. We prove that our model of bone marrow is biologically consistent and that it overcomes the biological feasibility limitations of previously reported models. The major contribution of our model is the flexibility it allows in choosing model parameters, which permits several different simulations to be carried out in silico without affecting the homeostatic properties of the model. We have also performed an agent-based simulation of the bone marrow model proposed in this paper, and we include parameter details and the results obtained from the simulation. The program for the agent-based simulation of the proposed model is made available on a publicly accessible website. PMID:27340402

  1. Multiscale agent-based consumer market modeling.

    SciTech Connect

    North, M. J.; Macal, C. M.; St. Aubin, J.; Thimmapuram, P.; Bragen, M.; Hahn, J.; Karr, J.; Brigham, N.; Lacy, M. E.; Hampton, D.; Decision and Information Sciences; Procter & Gamble Co.

    2010-05-01

    Consumer markets have been studied in great depth, and many techniques have been used to represent them. These have included regression-based models, logit models, and theoretical market-level models, such as the NBD-Dirichlet approach. Although many important contributions and insights have resulted from studies that relied on these models, there is still a need for a model that can more holistically represent the interdependencies of the decisions made by consumers, retailers, and manufacturers. This need is particularly critical when the model must be used repeatedly over time to support decisions in an industrial setting. Although some existing methods can, in principle, represent such complex interdependencies, their capabilities might be outstripped if they had to be used for industrial applications, because of the details this type of modeling requires. However, a complementary method - agent-based modeling - shows promise for addressing these issues. Agent-based models use business-driven rules for individuals (e.g., individual consumer rules for buying items, individual retailer rules for stocking items, or individual firm rules for advertising items) to determine holistic, system-level outcomes (e.g., to determine if brand X's market share is increasing). We applied agent-based modeling to develop a multi-scale consumer market model. We then conducted calibration, verification, and validation tests of this model. The model was successfully applied by Procter & Gamble to several challenging business problems. In these situations, it directly influenced managerial decision making and produced substantial cost savings.

  2. Assurance in Agent-Based Systems

    SciTech Connect

    Gilliom, Laura R.; Goldsmith, Steven Y.

    1999-05-10

    Our vision of the future of information systems is one that includes engineered collectives of software agents which are situated in an environment over years and which increasingly improve the performance of the overall system of which they are a part. At a minimum, the movement of agent and multi-agent technology into National Security applications, including its use in information assurance, is apparent today. The use of deliberative, autonomous agents in high-consequence/high-security applications will require a commensurate level of protection and confidence in the predictability of system-level behavior. At Sandia National Laboratories, we have defined and are addressing a research agenda that integrates surety (safety, security, and reliability) into agent-based systems at a deep level. Surety is addressed at multiple levels: The integrity of individual agents must be protected by addressing potential failure modes and vulnerabilities to malevolent threats. Providing for the surety of the collective requires attention to communications surety issues and mechanisms for identifying and working with trusted collaborators. At the highest level, using agent-based collectives within a large-scale distributed system requires the development of principled design methods to deliver the desired emergent performance or surety characteristics. This position paper will outline the research directions underway at Sandia, will discuss relevant work being performed elsewhere, and will report progress to date toward assurance in agent-based systems.

  3. A Hybrid Sensitivity Analysis Approach for Agent-based Disease Spread Models

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui

    2012-01-01

    Agent-based models (ABM) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. Of particular interest lately is the application of agent-based and hybrid models to epidemiology, specifically Agent-based Disease Spread Models (ABDSM). Validation (one aspect of the means to achieve dependability) of ABDSM simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. In this report, we describe our preliminary efforts in ABDSM validation by using hybrid model fusion technology.

  4. Agent based simulations in disease modeling: Comment on "Towards a unified approach in the modeling of fibrosis: A review with research perspectives" by Martine Ben Amar and Carlo Bianca

    NASA Astrophysics Data System (ADS)

    Pappalardo, Francesco; Pennisi, Marzio

    2016-07-01

    Fibrosis represents a process where an excessive tissue formation in an organ follows the failure of a physiological reparative or reactive process. Mathematical and computational techniques may be used to improve the understanding of the mechanisms that lead to the disease and to test potential new treatments that may directly or indirectly have positive effects against fibrosis [1]. In this scenario, Ben Amar and Bianca [2] give us a broad picture of the existing mathematical and computational tools that have been used to model fibrotic processes at the molecular, cellular, and tissue levels. Among such techniques, agent based models (ABM) can give a valuable contribution in the understanding and better management of fibrotic diseases.

  5. Agent-based model of macrophage action on endocrine pancreas.

    PubMed

    Martínez, Ignacio V; Gómez, Enrique J; Hernando, M Elena; Villares, Ricardo; Mellado, Mario

    2012-01-01

    This paper proposes an agent-based model of the action of macrophages on the beta cells of the endocrine pancreas. The aim of this model is to simulate the processes of beta cell proliferation and apoptosis and also the process of phagocytosis of cell debris by macrophages, all of which are related to the onset of the autoimmune response in type 1 diabetes. We have used data from the scientific literature to design the model. The results show that the model obtains good approximations to real processes and could be used to shed light on some open questions concerning such processes. PMID:23155767

  6. Agent-based modeling of urban land-use change

    NASA Astrophysics Data System (ADS)

    Li, Xinyan; Li, Deren

    2005-10-01

    ABM (Agent-Based Modeling) is a newly developed method of computer simulation. It has characteristics such as being active, dynamic, and operational. Urban land-use change has been a focal problem worldwide, especially in developing countries. We try to use ABM to model urban land-use changes. By studying the mechanism of urban land-use evolution, we put forward our modeling approach, and an urban land-use change model is built, based primarily on the RePast software and a GIS spatial database.

  7. An Agent Based Model for Social Class Emergence

    NASA Astrophysics Data System (ADS)

    Yang, Xiaoxiang; Rodriguez Segura, Daniel; Lin, Fei; Mazilu, Irina

    We present an open-system agent-based model to analyze the effects of education and society-specific wealth transactions on the emergence of social classes. Building on previous studies, we use realistic functions to model how years of education affect the income level. Numerical simulations show that the fraction of an individual's total transactions that is invested rather than consumed can cause wealth gaps between different income brackets in the long run. In an attempt to incorporate network effects, we also explore how making the probability of interaction among agents depend on the spread of their income brackets affects the wealth distribution.
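
    A hedged, minimal sketch of this class of model (not the authors' simulation): agents repeatedly transact, only the invested share of each transaction accumulates as the partner's wealth, and the long-run wealth distribution is summarised with a Gini coefficient. The education-income curve and all parameters are placeholders.

      # Toy wealth-exchange model with an invested fraction (illustrative parameters).
      import numpy as np

      rng = np.random.default_rng(11)
      N   = 1000
      years_edu = rng.integers(6, 21, N)            # years of education per agent
      income    = 5.0 * 1.08 ** years_edu           # placeholder education-income curve
      wealth    = income.copy()

      invest_frac = 0.3    # share of each transaction that is invested rather than consumed

      def gini(w):
          w = np.sort(w)
          n = len(w)
          return (2 * np.arange(1, n + 1) - n - 1) @ w / (n * w.sum())

      for t in range(500):
          wealth += income                           # everyone earns income each round
          partners = rng.integers(N, size=N)         # each agent transacts with a random peer
          amount   = 0.2 * wealth                    # spend a fixed share of wealth
          wealth  -= amount
          # Only the invested share of the transaction accumulates with the partner;
          # the consumed share leaves the system.
          np.add.at(wealth, partners, invest_frac * amount)

      print(f"Gini coefficient of final wealth: {gini(wealth):.3f}")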

  8. Framework Application for Core Edge Transport Simulation (FACETS)

    SciTech Connect

    Malony, Allen D; Shende, Sameer S; Huck, Kevin A; Morris, Alan; Spear, Wyatt

    2012-03-14

    The goal of the FACETS project (Framework Application for Core-Edge Transport Simulations) was to provide a multiphysics, parallel framework application (FACETS) that will enable whole-device modeling for the U.S. fusion program, to provide the modeling infrastructure needed for ITER, the next step fusion confinement device. Through use of modern computational methods, including component technology and object oriented design, FACETS is able to switch from one model to another for a given aspect of the physics in a flexible manner. This enables use of simplified models for rapid turnaround or high-fidelity models that can take advantage of the largest supercomputer hardware. FACETS does so in a heterogeneous parallel context, where different parts of the application execute in parallel by utilizing task farming, domain decomposition, and/or pipelining as needed and applicable. ParaTools, Inc. was tasked with supporting the performance analysis and tuning of the FACETS components and framework in order to achieve the parallel scaling goals of the project. The TAU Performance System® was used for instrumentation, measurement, archiving, and profile / tracing analysis. ParaTools, Inc. also assisted in FACETS performance engineering efforts. Through the use of the TAU Performance System, ParaTools provided instrumentation, measurement, analysis and archival support for the FACETS project. Performance optimization of key components has yielded significant performance speedups. TAU was integrated into the FACETS build for both the full coupled application and the UEDGE component. The performance database provided archival storage of the performance regression testing data generated by the project, and helped to track improvements in the software development.

  9. Particle beam dynamics simulations using the POOMA framework

    SciTech Connect

    Humphrey, W.; Ryne, R.; Cleland, T.; Cummings, J.; Habib, S.; Mark, G.; Ji Qiang

    1998-12-31

    A program for simulation of the dynamics of high intensity charged particle beams in linear particle accelerators has been developed in C++ using the POOMA Framework, for use on serial and parallel architectures. The code models the trajectories of charged particles through a sequence of different accelerator beamline elements such as drift chambers, quadrupole magnets, or RF cavities. An FFT-based particle-in-cell algorithm is used to solve the Poisson equation that models the Coulomb interactions of the particles. The code employs an object-oriented design with software abstractions for the particle beam, accelerator beamline, and beamline elements, using C++ templates to efficiently support both 2D and 3D capabilities in the same code base. The POOMA Framework, which encapsulates much of the effort required for parallel execution, provides particle and field classes, particle-field interaction capabilities, and parallel FFT algorithms. The performance of this application running serially and in parallel is compared to an existing HPF implementation, with the POOMA version seen to run four times faster than the HPF code.
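
    For orientation, here is a minimal 1D, NumPy-only sketch of the FFT-based particle-in-cell step described in this record (deposit charge, solve the Poisson equation in Fourier space, gather the field back to the particles). It is a generic textbook PIC fragment, not POOMA or the beam-dynamics code itself.

      # Minimal 1D FFT-based particle-in-cell Poisson solve (generic sketch).
      import numpy as np

      rng = np.random.default_rng(5)
      n_grid, length = 128, 1.0
      dx  = length / n_grid
      n_p = 10000
      x   = rng.uniform(0, length, n_p)              # particle positions in a periodic box
      q   = 1.0 / n_p                                # charge per macro-particle

      # 1) Cloud-in-cell charge deposition onto the grid.
      rho  = np.zeros(n_grid)
      cell = (x / dx).astype(int) % n_grid
      frac = x / dx - (x / dx).astype(int)
      np.add.at(rho, cell, q * (1 - frac) / dx)
      np.add.at(rho, (cell + 1) % n_grid, q * frac / dx)
      rho -= rho.mean()                              # neutralising background

      # 2) Solve  d^2 phi / dx^2 = -rho  in Fourier space: phi_k = rho_k / k^2.
      k = 2 * np.pi * np.fft.fftfreq(n_grid, d=dx)
      rho_k = np.fft.fft(rho)
      phi_k = np.zeros_like(rho_k)
      phi_k[1:] = rho_k[1:] / k[1:] ** 2             # k = 0 mode set to zero
      phi = np.real(np.fft.ifft(phi_k))
      E_grid = -np.gradient(phi, dx)                 # E = -d(phi)/dx

      # 3) Gather the field back to the particle positions (linear interpolation).
      E_part = (1 - frac) * E_grid[cell] + frac * E_grid[(cell + 1) % n_grid]
      print(f"max |E| on particles: {np.abs(E_part).max():.3e}")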

  10. A Hybrid Multiscale Framework for Subsurface Flow and Transport Simulations

    SciTech Connect

    Scheibe, Timothy D.; Yang, Xiaofan; Chen, Xingyuan; Hammond, Glenn E.

    2015-06-01

    -specific and sometimes ad-hoc approaches for model coupling. We are developing a generalized approach to hierarchical model coupling designed for high-performance computational systems, based on the Swift computing workflow framework. In this presentation we will describe the generalized approach and provide two use cases: 1) simulation of a mixing-controlled biogeochemical reaction coupling pore- and continuum-scale models, and 2) simulation of biogeochemical impacts of groundwater – river water interactions coupling fine- and coarse-grid model representations. This generalized framework can be customized for use with any pair of linked models (microscale and macroscale) with minimal intrusiveness to the at-scale simulators. It combines a set of python scripts with the Swift workflow environment to execute a complex multiscale simulation utilizing an approach similar to the well-known Heterogeneous Multiscale Method. User customization is facilitated through user-provided input and output file templates and processing function scripts, and execution within a high-performance computing environment is handled by Swift, such that minimal to no user modification of at-scale codes is required.

  12. The hydraulic system of trees: theoretical framework and numerical simulation

    PubMed

    Fruh; Kurth

    1999-12-21

    Empirical studies pose the problem of the physiological integration of the tree organism, which is also important on the scale of ecosystems. Recently, spatially distributed models have emerged which approach this problem by reflecting the close linkage between physiological processes and the structures of trees and tree stands. In the case of water flow, the tree organism can be regarded as a hydraulic system and the branched tree architecture as a hydraulic network. Previous models of the hydraulic system either did not take into account the network structure, or they had shortcomings regarding the translation of the underlying physiological assumptions by the discrete computation method. We have developed a theoretical framework which takes the form of a numerical simulation model of tree water flow. A discrete initial boundary value problem (IBVP) combines the phenomena of Darcy flow, water storage and conductivity losses in the hydraulic network. The software HYDRA computes the solution of the IBVP. The theoretical derivation and model tests corroborate the consistent translation of the physiological assumptions by the computational method. Simulation studies enabled us to formulate hypotheses on the following points: (1) differences in the hydraulic segmentation between Picea abies and Thuja occidentalis, (2) responses of the hydraulic system to rapid transpiration changes and to a scenario of drought stress, and (3) how these responses depend on architectural quantities of the trees. The simulation studies demonstrated our possibilities of deriving theoretically well-founded hypotheses about the functioning of the hydraulic system and its relation to system structure. The numerical simulation model is designed as a tool for structure-function studies, which is able to treat tree architecture as an independent variable. The model supports the integration of data on the tree level, and it can be used for computer experiments which quantify the dynamics of the hydraulic

  13. An agent-based multilayer architecture for bioinformatics grids.

    PubMed

    Bartocci, Ezio; Cacciagrano, Diletta; Cannata, Nicola; Corradini, Flavio; Merelli, Emanuela; Milanesi, Luciano; Romano, Paolo

    2007-06-01

    Due to the huge volume and complexity of biological data available today, a fundamental component of biomedical research is now in silico analysis. This includes modelling and simulation of biological systems and processes, as well as automated bioinformatics analysis of high-throughput data. The quest for bioinformatics resources (including databases, tools, and knowledge) becomes therefore of extreme importance. Bioinformatics itself is in rapid evolution and dedicated Grid cyberinfrastructures already offer easier access and sharing of resources. Furthermore, the concept of the Grid is progressively interleaving with those of Web Services, semantics, and software agents. Agent-based systems can play a key role in learning, planning, interaction, and coordination. Agents constitute also a natural paradigm to engineer simulations of complex systems like the molecular ones. We present here an agent-based, multilayer architecture for bioinformatics Grids. It is intended to support both the execution of complex in silico experiments and the simulation of biological systems. In the architecture a pivotal role is assigned to an "alive" semantic index of resources, which is also expected to facilitate users' awareness of the bioinformatics domain. PMID:17695749

  14. From Compartmentalized to Agent-based Models of Epidemics

    NASA Astrophysics Data System (ADS)

    Macal, Charles

    Supporting decisions in the throes of an impending epidemic poses distinct technical challenges arising from the uncertainties in modeling disease propagation processes and the need for producing timely answers to policy questions. Compartmental models, because of their relative simplicity, produce timely information, but often do not include the level of fidelity of the information needed to answer specific policy questions. Highly granular agent-based simulations produce an extensive amount of information on all aspects of a simulated epidemic, yet complex models often cannot produce this information in a timely manner. We propose a two-phased approach to addressing the tradeoff between model complexity and the speed at which models can be used to answer to questions about an impending outbreak. In the first phase, in advance of an epidemic, ensembles of highly granular agent-based simulations are run over the entire parameter space, characterizing the space of possible model outcomes and uncertainties. Meta-models are derived that characterize model outcomes as dependent on uncertainties in disease parameters, data, and structural relationships. In the second phase, envisioned as during an epidemic, the meta-model is run in combination with compartmental models, which can be run very quickly. Model outcomes are compared as a basis for establishing uncertainties in model forecasts. This work is supported by the U.S. Department of Energy under Contract number DE-AC02-06CH11357 and National Science Foundation (NSF) RAPID Award DEB-1516428.
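
    For contrast with the agent-based side, a compartmental model of the kind referred to here can be written and run in a few lines; the sketch below integrates a standard SIR system with arbitrary illustrative parameters.

      # Standard SIR compartmental model (illustrative parameters).
      import numpy as np
      from scipy.integrate import solve_ivp

      beta, gamma = 0.3, 0.1          # transmission and recovery rates (per day)
      N = 1_000_000
      y0 = [N - 10, 10, 0]            # initial S, I, R

      def sir(t, y):
          S, I, R = y
          return [-beta * S * I / N, beta * S * I / N - gamma * I, gamma * I]

      sol = solve_ivp(sir, (0, 200), y0, dense_output=True, max_step=1.0)
      t = np.linspace(0, 200, 201)
      S, I, R = sol.sol(t)
      print(f"peak infections: {I.max():.0f} on day {t[I.argmax()]:.0f}; "
            f"final attack rate: {R[-1] / N:.2%}")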

  15. Reducing complexity in an agent based reaction model-Benefits and limitations of simplifications in relation to run time and system level output.

    PubMed

    Rhodes, David M; Holcombe, Mike; Qwarnstrom, Eva E

    2016-09-01

    Agent-based modelling is a methodology for simulating a variety of systems across a broad spectrum of fields. However, due to the complexity of the systems, it is often impossible or impractical to model them at a one-to-one scale. In this paper we use a simple reaction rate model implemented using the FLAME framework to test the impact of common methods for reducing model complexity, such as reducing scale, increasing iteration duration and reducing message overheads. We demonstrate that such approaches can have a significant impact on simulation runtime, albeit with an increasing risk of aberrant system behaviour and errors as the complexity of the model is reduced. PMID:27297544

  16. Techniques and Issues in Agent-Based Modeling Validation

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui

    2012-01-01

    Validation of simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. Agent-based models (ABM) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. However, researchers have only recently started to consider the issues of validation. Compared to other simulation models, ABMs differ considerably in model development, usage and validation. An ABM is inherently easier to build than a classical simulation, but more difficult to describe formally, since its rules are closer to human cognition. Using multi-agent models to study complex systems has attracted criticisms because of the challenges involved in their validation [1]. In this report, we describe the challenge of ABM validation and present a novel approach we recently developed for an ABM system.

  17. Coupling Agent-Based and Groundwater Modeling to Explore Demand Management Strategies for Shared Resources

    NASA Astrophysics Data System (ADS)

    Al-Amin, S.

    2015-12-01

    Municipal water demands in growing population centers in the arid southwest US are typically met through increased groundwater withdrawals. Hydro-climatic uncertainties attributed to climate change and land use conversions may also alter demands and impact the replenishment of groundwater supply. Groundwater aquifers are not necessarily confined within municipal and management boundaries, and multiple diverse agencies may manage a shared resource in a decentralized approach, based on individual concerns and resources. The interactions among water managers, consumers, and the environment influence the performance of local management strategies and regional groundwater resources. This research couples an agent-based modeling (ABM) framework with a groundwater model to analyze the effects of different management approaches on shared groundwater resources. The ABM captures the dynamic interactions between household-level consumers and policy makers to simulate water demands under climate change and population growth uncertainties. The groundwater model is used to analyze the relative effects of management approaches on reducing demands and replenishing groundwater resources. The framework is applied to municipalities located in the Verde River Basin, Arizona that withdraw groundwater from the Verde Formation-Basin Fill-Carbonate aquifer system. Insights gained through this simulation study can be used to guide groundwater policy-making under changing hydro-climatic scenarios for a long-term planning horizon.
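
    A highly simplified sketch of the coupling pattern (not the Verde River implementation): household agents scale their demand with price, a single-cell "aquifer" bucket updates its head from total pumping and recharge, and pumping cost feeds back into the next year's price and demand. Every number is invented.

      # Toy coupling of a household-demand ABM with a single-cell groundwater "model".
      import numpy as np

      rng          = np.random.default_rng(2)
      n_households = 5000
      base_demand  = rng.normal(0.35, 0.05, n_households)   # acre-ft per household per year
      price_elast  = -0.3                                    # demand elasticity with respect to price

      head        = 100.0    # m of saturated thickness in the toy aquifer
      recharge    = 1200.0   # acre-ft per year
      storativity = 40.0     # acre-ft of storage per metre of head change
      lift_rate   = 2.0      # $ per acre-ft per metre of lift

      for year in range(2000, 2031):
          pumping_cost = lift_rate * (150.0 - head)          # pumping gets dearer as head drops
          price        = 50.0 + pumping_cost
          # Agent response: each household scales demand with price relative to a $50 reference.
          demand  = base_demand * (price / 50.0) ** price_elast
          pumping = demand.sum()
          # Groundwater update: change in storage = recharge - pumping.
          head += (recharge - pumping) / storativity
          if year % 10 == 0:
              print(f"{year}: price ${price:6.1f}/AF, pumping {pumping:6.0f} AF, head {head:6.1f} m")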

  18. An agent-based microsimulation of critical infrastructure systems

    SciTech Connect

    Barton, Dianne C.; Stamber, Kevin L.

    2000-03-29

    US infrastructures provide essential services that support economic prosperity and quality of life. Today, the latest threat to these infrastructures is the increasing complexity and interconnectedness of the system. On balance, added connectivity will improve economic efficiency; however, increased coupling could also result in situations where a disturbance in an isolated infrastructure unexpectedly cascades across diverse infrastructures. An understanding of the behavior of complex systems can be critical to understanding and predicting infrastructure responses to unexpected perturbation. Sandia National Laboratories has developed an agent-based model of critical US infrastructures using time-dependent Monte Carlo methods and a genetic algorithm learning classifier system to control decision making. The model is currently under development and contains agents that represent several areas within the interconnected infrastructures, including electric power and fuel supply. Previous work shows that agent-based simulation models have the potential to improve the accuracy of complex system forecasting and to provide new insights into the factors that are the primary drivers of emergent behaviors in interdependent systems. Simulation results can be examined both computationally and analytically, offering new ways of theorizing about the impact of perturbations to an infrastructure network.

  19. Reaction to Extreme Events in a Minimal Agent Based Model

    NASA Astrophysics Data System (ADS)

    Zaccaria, Andrea; Cristelli, Matthieu; Pietronero, Luciano

    We consider the issue of the overreaction of financial markets to a sudden price change. In particular, we focus on the price and population dynamics which follow a large fluctuation. In order to investigate these aspects from different perspectives, we discuss the known results for empirical data, the Lux-Marchesi model and a minimal agent-based model which we have recently proposed. We show that, in this framework, the presence of an overreaction is deeply linked to the population dynamics. In particular, the presence of a destabilizing strategy in the market is a necessary condition for an overshoot with respect to the exogenously induced price fluctuation. Finally, we analyze how the memory of the agents can quantitatively affect this behavior.

  20. Dynamic Gaussian wake meandering in a restricted nonlinear simulation framework

    NASA Astrophysics Data System (ADS)

    Bretheim, Joel; Porte-Agel, Fernando; Gayme, Dennice; Meneveau, Charles

    2015-11-01

    Wake meandering can significantly impact the performance of large-scale wind farms. Simplified wake expansion (e.g., Jensen/PARK) models, which are commonly used in industry, lead to accurate predictions of certain wind farm performance characteristics (e.g., time- and row-averaged total power output). However, they are unable to capture certain temporal phenomena such as wake meandering, which can have profound effects on both power output and turbine loading. We explore a dynamic wake modeling framework based on the approach proposed by Larsen et al. (Wind Energy 11, 2008) whereby turbine ``wake elements'' are treated as passive tracers and advected by an averaged streamwise flow. Our wake elements are treated as Gaussian velocity deficit profiles (Bastankhah and Porte-Agel, Renew. Energy 70, 2014). A restricted nonlinear (RNL) model is used to capture the turbulent velocity fluctuations that are critical to the wake meandering phenomenon. The RNL system, which has been used in prior wall-turbulence studies, provides a computationally affordable way to model atmospheric turbulence, making it more reasonable for use in engineering models than the more accurate but computationally intensive approaches like large-eddy simulation. This work is supported by NSF (IGERT 0801471, SEP-1230788, and IIA-1243482, the WINDINSPIRE project).
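
    As a hedged illustration (not the authors' RNL framework), the sketch below evaluates a Gaussian wake deficit of the Bastankhah and Porte-Agel type with one common wake-width prescription, and treats the wake centre as a passive tracer displaced by random transverse fluctuations as a crude stand-in for meandering. All numerical values are illustrative.

      # Gaussian wake-deficit sketch with a meandering wake centre (illustrative values).
      import numpy as np

      rng    = np.random.default_rng(4)
      U_inf  = 8.0      # m/s free-stream wind speed
      d0     = 126.0    # m rotor diameter
      C_T    = 0.8      # thrust coefficient
      k_star = 0.035    # wake growth rate

      beta = 0.5 * (1 + np.sqrt(1 - C_T)) / np.sqrt(1 - C_T)
      eps  = 0.2 * np.sqrt(beta)

      def deficit(x, r):
          """Normalised Gaussian velocity deficit at downstream distance x and radial offset r.
          Meant for distances downstream of the near wake (small x gives an unphysical argument)."""
          sigma = (k_star * x / d0 + eps) * d0
          c = 1 - np.sqrt(1 - C_T / (8 * (sigma / d0) ** 2))
          return c * np.exp(-r ** 2 / (2 * sigma ** 2))

      # Wake centre as a passive tracer nudged sideways by large-scale transverse fluctuations.
      x_stations = np.arange(2, 12) * d0
      y_centre   = np.cumsum(rng.normal(0.0, 0.1 * d0, x_stations.size))

      y_probe = 0.0   # fixed lateral position (e.g. a downstream rotor centre)
      for x, yc in zip(x_stations, y_centre):
          du = deficit(x, abs(y_probe - yc))
          print(f"x = {x / d0:4.1f} D: centre offset {yc / d0:+5.2f} D, "
                f"speed at probe {U_inf * (1 - du):5.2f} m/s")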

  1. Multiscale Simulation Framework for Coupled Fluid Flow and Mechanical Deformation

    SciTech Connect

    Tchelepi, Hamdi

    2014-11-14

    A multiscale linear-solver framework for the pressure equation associated with flow in highly heterogeneous porous formations was developed. The multiscale-based approach is cast in a general algebraic form, which facilitates integration of the new scalable linear solver in existing flow simulators. The Algebraic Multiscale Solver (AMS) is employed as a preconditioner within a multi-stage strategy. The formulations investigated include the standard MultiScale Finite-Element (MSFE) and MultiScale Finite-Volume (MSFV) methods. The local-stage solvers include incomplete factorization and the so-called Correction Functions (CF) associated with the MSFV approach. Extensive testing of AMS, as an iterative linear solver, indicates excellent convergence rates and computational scalability. AMS compares favorably with advanced Algebraic MultiGrid (AMG) solvers for highly detailed three-dimensional heterogeneous models. Moreover, AMS is expected to be especially beneficial in solving time-dependent problems of coupled multiphase flow and transport in large-scale subsurface formations.
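
    As a rough illustration of the multi-stage idea (not the AMS implementation itself), the sketch below applies a coarse-scale correction followed by a local damped-Jacobi smoother as a preconditioner inside a Richardson iteration on a small 1D Poisson test problem. The piecewise-constant coarse space and the smoother are simplifying assumptions; AMS uses MSFE/MSFV basis functions and incomplete factorizations for the local stage.

```python
# Two-stage preconditioner: coarse-scale correction + local smoothing (illustrative).
import numpy as np

n, nc = 64, 16                                        # fine cells, coarse aggregates (assumed)
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # 1D Poisson test matrix
b = np.ones(n)

P = np.kron(np.eye(nc), np.ones((n // nc, 1)))        # piecewise-constant prolongation
Ac = P.T @ A @ P                                      # Galerkin coarse-scale operator

def apply_preconditioner(r):
    x = P @ np.linalg.solve(Ac, P.T @ r)              # stage 1: coarse (multiscale) correction
    x += (2.0 / 3.0) * (r - A @ x) / np.diag(A)       # stage 2: local damped-Jacobi smoothing
    return x

x = np.zeros(n)
for _ in range(300):                                  # preconditioned Richardson iteration
    x += apply_preconditioner(b - A @ x)
print(np.linalg.norm(b - A @ x))                      # residual shrinks toward zero
```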

  2. An agent-based approach to financial stylized facts

    NASA Astrophysics Data System (ADS)

    Shimokawa, Tetsuya; Suzuki, Kyoko; Misawa, Tadanobu

    2007-06-01

    An important challenge for financial theory in recent years is to construct more sophisticated models that are consistent with as many as possible of the financial stylized facts that cannot be explained by traditional models. Recently, psychological studies on decision making under uncertainty, originating in Kahneman and Tversky's research, have attracted considerable interest as key factors for explaining financial stylized facts. These psychological results have been applied to the theory of investors' decision making and to financial equilibrium modeling. Following these behavioral finance studies, this paper proposes an agent-based equilibrium model with prospect-theoretic features of investors. Our goal is to point out the possibility that the loss-averse feature of investors explains a vast number of financial stylized facts and plays a crucial role in the price formation of financial markets. The price process endogenously generated by our model is consistent not only with the equity premium puzzle and the volatility puzzle, but also with excess kurtosis, asymmetry of the return distribution, autocorrelation of return volatility, and cross-correlation between return volatility and trading volume. Moreover, using agent-based simulations, the paper also provides an explanation of the size effect, whereby small-sized stocks enjoy excess returns compared to large-sized stocks, from the viewpoint of a lack of market liquidity.
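
    A central ingredient of such prospect-theoretic agents is a loss-averse value function. The sketch below uses the standard Kahneman-Tversky parameterisation as an assumed example; the paper's own calibration may differ.

```python
# Prospect-theory value function: gains and losses are valued relative to a
# reference point, with losses weighted more heavily (loss aversion).
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value of a gain/loss x relative to a reference point (Tversky-Kahneman 1992 parameters)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

# Losses loom larger than equal-sized gains, the core of loss aversion.
print(prospect_value(10.0), prospect_value(-10.0))
```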

  3. Agent based modeling of "crowdinforming" as a means of load balancing at emergency departments.

    PubMed

    Neighbour, Ryan; Oppenheimer, Luis; Mukhi, Shamir N; Friesen, Marcia R; McLeod, Robert D

    2010-01-01

    This work extends ongoing development of a framework for modeling the spread of contact-transmission infectious diseases. The framework is built upon Agent Based Modeling (ABM), with emphasis on urban-scale modelling integrated with institutional models of hospital emergency departments. The method presented here includes ABM modeling of an outbreak of influenza-like illness (ILI) with concomitant surges at hospital emergency departments, and illustrates preliminary modeling of 'crowdinforming' as an intervention. 'Crowdinforming', a component of 'crowdsourcing', is characterized as the dissemination of collected and processed information back to the 'crowd' via public access. The objective of the simulation is to allow for effective policy evaluation to better inform the public of expected wait times as part of their decision-making process in attending an emergency department or clinic. In effect, this is a means of providing additional decision support garnered from a simulation, prior to real-world implementation. The conjecture is that better service delivery can be achieved under balanced patient loads, compared to situations where some emergency departments are overextended while others are underutilized. Load-balancing optimization is a common notion in many operations, and the simulation illustrates that 'crowdinforming' is a potential tool when used as a process control parameter to balance the load at emergency departments, as well as an effective means to direct patients during an ILI outbreak with temporary clinics deployed. The information provided in the 'crowdinforming' model is readily available in a local context, although it requires thoughtful consideration in its interpretation. The extension to a wider dissemination of information via a web service is readily achievable and presents no technical obstacles, although political obstacles may be present. The 'crowdinforming' simulation is not limited to arrivals of patients at
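
    A minimal sketch of the 'crowdinforming' decision rule described above: an agent combines the publicly posted wait times with its own travel times and attends the facility with the smallest total. Facility names and numbers are assumptions.

```python
# Agent-level choice of emergency department using broadcast wait times (illustrative).
def choose_department(posted_waits, travel_times):
    """Pick the facility minimising expected wait + travel for this agent."""
    return min(posted_waits, key=lambda ed: posted_waits[ed] + travel_times[ed])

posted_waits = {"ED_A": 180.0, "ED_B": 45.0, "clinic_temp": 20.0}   # minutes, assumed
travel_times = {"ED_A": 10.0, "ED_B": 35.0, "clinic_temp": 50.0}    # minutes, assumed
print(choose_department(posted_waits, travel_times))                # -> "clinic_temp"
```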

  4. An agent-based hydroeconomic model to evaluate water policies in Jordan

    NASA Astrophysics Data System (ADS)

    Yoon, J.; Gorelick, S.

    2014-12-01

    Modern water systems can be characterized by a complex network of institutional and private actors that represent competing sectors and interests. Identifying solutions to enhance water security in such systems calls for analysis that can adequately account for this level of complexity and interaction. Our work focuses on the development of a hierarchical, multi-agent, hydroeconomic model that attempts to realistically represent complex interactions between hydrologic and multi-faceted human systems. The model is applied to Jordan, one of the most water-poor countries in the world. In recent years, the water crisis in Jordan has escalated due to an ongoing drought and influx of refugees from regional conflicts. We adopt a modular approach in which biophysical modules simulate natural and engineering phenomena, and human modules represent behavior at multiple scales of decision making. The human modules employ agent-based modeling, in which agents act as autonomous decision makers at the transboundary, state, organizational, and user levels. A systematic nomenclature and conceptual framework is used to characterize model agents and modules. Concepts from the Unified Modeling Language (UML) are adopted to promote clear conceptualization of model classes and process sequencing, establishing a foundation for full deployment of the integrated model in a scalable object-oriented programming environment. Although the framework is applied to the Jordanian water context, it is generalizable to other regional human-natural freshwater supply systems.

  5. Combining Bayesian Networks and Agent Based Modeling to develop a decision-support model in Vietnam

    NASA Astrophysics Data System (ADS)

    Nong, Bao Anh; Ertsen, Maurits; Schoups, Gerrit

    2016-04-01

    Complexity and uncertainty in natural resources management have been focal themes in recent years. Within these debates, with the aim of defining an approach feasible for water management practice, we are developing an integrated conceptual modeling framework for simulating the decision-making processes of citizens, in our case in the Day river area, Vietnam. The model combines Bayesian Networks (BNs) and Agent-Based Modeling (ABM). BNs are able to combine both qualitative data from consultants, experts, and stakeholders, and quantitative data from observations of different phenomena or outcomes from other models. Further strengths of BNs are that the relationships between variables in the system are presented in a graphical interface, and that components of uncertainty are explicitly related to their probabilistic dependencies. A disadvantage is that BNs cannot easily identify the feedback of agents in the system once changes appear. Hence, ABM was adopted to represent the reactions among stakeholders under changes. The modeling framework is developed as an attempt to gain a better understanding of citizens' behavior and the factors influencing their decisions, in order to reduce uncertainty in the implementation of water management policy.

  6. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
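
    The decoupled, multi-rate idea can be sketched as two loops running at different frequencies against shared state, as below. This is an illustration of the concept, not the published framework; the update rules are placeholders.

```python
# Decoupled simulation loops: a high-rate "haptic" thread and a low-rate "visual"
# thread sharing state through a lock (illustrative, with placeholder updates).
import threading
import time

state = {"force": 0.0, "frame": 0}
lock = threading.Lock()
stop = threading.Event()

def haptic_loop():                         # high-rate loop, roughly 1 kHz
    while not stop.is_set():
        with lock:
            state["force"] = 0.1 * state["frame"]    # placeholder force computation
        time.sleep(0.001)

def visual_loop():                         # low-rate loop, roughly 60 Hz
    while not stop.is_set():
        with lock:
            state["frame"] += 1                      # placeholder render/update step
        time.sleep(1.0 / 60.0)

threads = [threading.Thread(target=haptic_loop), threading.Thread(target=visual_loop)]
for t in threads:
    t.start()
time.sleep(0.5)
stop.set()
for t in threads:
    t.join()
print(state)
```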

  7. artG4: A Generic Framework for Geant4 Simulations

    SciTech Connect

    Arvanitis, Tasha; Lyon, Adam

    2014-01-01

    A small experiment must devote its limited computing expertise to writing physics code directly applicable to the experiment. A software 'framework' is essential for providing an infrastructure that makes writing the physics-relevant code easy. In this paper, we describe a highly modular and easy to use framework for writing Geant4 based simulations called 'artg4'. This framework is a layer on top of the art framework.

  8. An Agent-Based Model for Studying Child Maltreatment and Child Maltreatment Prevention

    NASA Astrophysics Data System (ADS)

    Hu, Xiaolin; Puddy, Richard W.

    This paper presents an agent-based model that simulates the dynamics of child maltreatment and child maltreatment prevention. The developed model follows the principles of complex systems science and explicitly models a community and its families with multi-level factors and interconnections across the social ecology. This makes it possible to experiment with how different factors and prevention strategies can affect the rate of child maltreatment. We present the background of this work, give an overview of the agent-based model, and show some simulation results.

  9. Validating agent based models through virtual worlds.

    SciTech Connect

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social environment where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior

  10. Convergence and optimization of agent-based coalition formation

    NASA Astrophysics Data System (ADS)

    Wang, Yuanshi; Wu, Hong

    2005-03-01

    In this paper, we analyze the model of agent-based coalition formation in markets. Our goal is to study the convergence of the coalition formation and optimize agents’ strategies. We show that the model has a unique steady state (equilibrium) and prove that all solutions converge to it in the case that the maximum size of coalitions is not larger than three. The stability of the steady state in other cases is not studied while numerical simulations are given to show the convergence. The steady state, which determines both the global system gain and the average gain per agent, is expressed by the agents’ strategies in the coalition formation. Through the steady state, we give the relationship between the gains and the agents’ strategies, and present a series of results for the optimization of agents’ strategies.

  11. High performance computing for three-dimensional agent-based molecular models.

    PubMed

    Pérez-Rodríguez, G; Pérez-Pérez, M; Fdez-Riverola, F; Lourenço, A

    2016-07-01

    Agent-based simulations are increasingly popular in exploring and understanding cellular systems, but the natural complexity of these systems and the desire to grasp different modelling levels demand cost-effective simulation strategies and tools. In this context, the present paper introduces novel sequential and distributed approaches for the three-dimensional agent-based simulation of individual molecules in cellular events. These approaches are able to describe the dimensions and position of the molecules with high accuracy and thus, study the critical effect of spatial distribution on cellular events. Moreover, two of the approaches allow multi-thread high performance simulations, distributing the three-dimensional model in a platform independent and computationally efficient way. Evaluation addressed the reproduction of molecular scenarios and different scalability aspects of agent creation and agent interaction. The three approaches simulate common biophysical and biochemical laws faithfully. The distributed approaches show improved performance when dealing with large agent populations while the sequential approach is better suited for small to medium size agent populations. Overall, the main new contribution of the approaches is the ability to simulate three-dimensional agent-based models at the molecular level with reduced implementation effort and moderate-level computational capacity. Since these approaches have a generic design, they have the major potential of being used in any event-driven agent-based tool. PMID:27372059

  12. Agent-based modeling in ecological economics.

    PubMed

    Heckbert, Scott; Baynes, Tim; Reeson, Andrew

    2010-01-01

    Interconnected social and environmental systems are the domain of ecological economics, and models can be used to explore feedbacks and adaptations inherent in these systems. Agent-based modeling (ABM) represents autonomous entities, each with dynamic behavior and heterogeneous characteristics. Agents interact with each other and their environment, resulting in emergent outcomes at the macroscale that can be used to quantitatively analyze complex systems. ABM is contributing to research questions in ecological economics in the areas of natural resource management and land-use change, urban systems modeling, market dynamics, changes in consumer attitudes, innovation, and diffusion of technology and management practices, commons dilemmas and self-governance, and psychological aspects to human decision making and behavior change. Frontiers for ABM research in ecological economics involve advancing the empirical calibration and validation of models through mixed methods, including surveys, interviews, participatory modeling, and, notably, experimental economics to test specific decision-making hypotheses. Linking ABM with other modeling techniques at the level of emergent properties will further advance efforts to understand dynamics of social-environmental systems. PMID:20146761

  13. Agent Based Model of Livestock Movements

    NASA Astrophysics Data System (ADS)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent-based model for the forecasting of livestock movements is presented. The model simulates livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors such as climate forecast, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors. These two factors are the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model, farms choose either to buy, sell or abstain from the market, thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard where livestock are auctioned using a second-price sealed-bid auction. The price time series output by the model exhibits properties similar to those found in real livestock markets.
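
    For reference, the saleyard mechanism mentioned above, a second-price sealed-bid (Vickrey) auction, can be sketched as follows; the farm agents and bid values are assumed.

```python
# Second-price sealed-bid auction: the highest bidder wins but pays the
# second-highest bid (illustrative agents and bids).
def second_price_auction(bids):
    """Return (winner, price paid) for a sealed-bid Vickrey auction."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"farm_1": 820.0, "farm_2": 760.0, "farm_3": 905.0}   # $/head, assumed
print(second_price_auction(bids))                            # -> ('farm_3', 820.0)
```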

  14. A forward-muscular inverse-skeletal dynamics framework for human musculoskeletal simulations.

    PubMed

    S Shourijeh, Mohammad; Smale, Kenneth B; Potvin, Brigitte M; Benoit, Daniel L

    2016-06-14

    This study provides a forward-muscular inverse-skeletal dynamics framework for musculoskeletal simulations. The framework solves the muscle redundancy problem forward in time, in parallel with tracking the torque between the musculotendon net torques and the joint moments from inverse dynamics. The proposed framework can be used by any musculoskeletal modeling software package; as an example, in this study it is wrapped around OpenSim and the optimization is done in MATLAB. The novel simulation framework was highly robust over repeated runs and produced relatively high correlations between predicted muscle excitations and experimental EMGs for level gait trials. This simulation framework represents an efficient and robust approach to predict muscle excitation and musculotendon unit force, and to estimate net joint torque. PMID:27106173
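
    Solving the muscle redundancy problem at a single time step is typically posed as minimising an effort criterion subject to reproducing the inverse-dynamics joint moment. The sketch below shows that idea with assumed moment arms, maximum forces and a target moment; it is not the paper's forward-in-time implementation.

```python
# Static optimization for muscle redundancy at one instant (illustrative).
import numpy as np
from scipy.optimize import minimize

moment_arms = np.array([0.04, 0.05, 0.03])    # m, assumed values
f_max = np.array([1500.0, 2000.0, 1000.0])    # maximum isometric forces (N), assumed
target_moment = 60.0                          # joint moment from inverse dynamics (N*m), assumed

def effort(a):
    return np.sum(a ** 2)                     # minimise summed squared activations

torque_match = {"type": "eq",
                "fun": lambda a: moment_arms @ (a * f_max) - target_moment}
result = minimize(effort, x0=np.full(3, 0.2), bounds=[(0.0, 1.0)] * 3,
                  constraints=[torque_match])
print(result.x, moment_arms @ (result.x * f_max))   # activations and reproduced moment
```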

  15. Simulation-optimization framework for multi-season hybrid stochastic models

    NASA Astrophysics Data System (ADS)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K. P.

    2011-07-01

    A novel simulation-optimization framework is proposed that enables the automation of the hybrid stochastic modeling process for the synthetic generation of multi-season streamflows. This framework aims to minimize the drudgery, judgment, and subjectivity involved in the selection of the most appropriate hybrid stochastic model. It consists of a multi-objective optimization model as the driver and the hybrid multi-season stochastic streamflow generation model, hybrid matched-block bootstrap (HMABB), as the simulation engine. For the estimation of the hybrid model parameters, the proposed framework employs objective functions that aim to minimize the overall errors in the preservation of storage capacities at various demand levels, unlike the traditional approaches that are simulation based. Moreover, this framework yields a number of competent hybrid stochastic models in a single run of the simulation-optimization framework. The efficacy of the proposed simulation-optimization framework is demonstrated through application to two monthly streamflow data sets from the USA, of varying sample sizes, that exhibit multi-modality and a complex dependence structure. The results show that the hybrid models obtained from the proposed framework are able to preserve the statistical characteristics as well as the storage characteristics better than the simulation-based HMABB model, while minimizing the manual effort and the subjectivity involved in the modeling process. The proposed framework can be easily extended to model multi-site multi-season streamflow data.

  16. Educational Validity of Business Gaming Simulation: A Research Methodology Framework

    ERIC Educational Resources Information Center

    Stainton, Andrew J.; Johnson, Johnnie E.; Borodzicz, Edward P.

    2010-01-01

    Many past educational validity studies of business gaming simulation, and more specifically total enterprise simulation, have been inconclusive. Studies have focused on the weaknesses of business gaming simulation; which is often regarded as an educational medium that has limitations regarding learning effectiveness. However, no attempts have been…

  17. A New Simulation Framework for Autonomy in Robotic Missions

    NASA Technical Reports Server (NTRS)

    Flueckiger, Lorenzo; Neukom, Christian

    2003-01-01

    Autonomy is a key factor in remote robotic exploration, and there is significant activity addressing the application of autonomy to remote robots. It has become increasingly important to have simulation tools available to test autonomy algorithms. While industrial robotics benefits from a variety of high-quality simulation tools, researchers developing autonomous software are still dependent primarily on block-world simulations. The Mission Simulation Facility (MSF) project addresses this shortcoming with a simulation toolkit that will enable developers of autonomous control systems to test their system's performance against a set of integrated, standardized simulations of NASA mission scenarios. MSF provides a distributed architecture that connects the autonomous system to a set of simulated components replacing the robot hardware and its environment.

  18. A Runtime Verification Framework for Control System Simulation

    SciTech Connect

    Ciraci, Selim; Fuller, Jason C.; Daily, Jeffrey A.; Makhmalbaf, Atefe; Callahan, Charles D.

    2014-08-02

    In a standard workflow for the validation of a control system, the control system is implemented as an extension to a simulator. Such simulators are complex software systems, and engineers may unknowingly violate constraints a simulator places on extensions. As such, errors may be introduced in the implementation of either the control system or the simulator, leading to invalid simulation results. This paper presents a novel runtime verification approach for verifying control system implementations within simulators. The major contribution of the approach is the two-tier specification process. In the first tier, engineers model constraints using a domain-specific language tailored to modeling a controller's response to changes in its input. The language is high-level and effectively hides the implementation details of the simulator, allowing engineers to specify design-level constraints independent of low-level simulator interfaces. In the second tier, simulator developers provide mapping rules for mapping design-level constraints to the implementation of the simulator. Using the rules, an automated tool transforms the design-level specifications into simulator-specific runtime verification specifications and generates monitoring code which is injected into the implementation of the simulator. During simulation, these monitors observe the input and output variables of the control system and report changes to the verifier. The verifier checks whether these changes follow the constraints of the control system. We describe the application of this approach to the verification of the constraints of an HVAC control system implemented with the power grid simulator GridLAB-D.
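
    A minimal sketch of the monitoring idea (not the paper's two-tier toolchain): a monitor observes a controller's input and output variables during simulation and records violations of a design-level constraint. The HVAC-style variable names and the trace are assumptions.

```python
# Runtime monitor for a design-level constraint such as "when the zone temperature
# exceeds the setpoint, the cooling command must be on" (illustrative).
class Monitor:
    def __init__(self, setpoint):
        self.setpoint = setpoint
        self.violations = []

    def observe(self, step, temperature, cooling_on):
        # Record any step where the constraint is violated.
        if temperature > self.setpoint and not cooling_on:
            self.violations.append((step, temperature))

monitor = Monitor(setpoint=24.0)
trace = [(0, 23.0, False), (1, 25.1, True), (2, 26.0, False)]   # assumed simulation trace
for step, temp, cooling in trace:
    monitor.observe(step, temp, cooling)
print(monitor.violations)    # -> [(2, 26.0)]
```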

  19. Agent based models for testing city evacuation strategies under a flood event as strategy to reduce flood risk

    NASA Astrophysics Data System (ADS)

    Medina, Neiler; Sanchez, Arlex; Nokolic, Igor; Vojinovic, Zoran

    2016-04-01

    This research explores the use of Agent Based Models (ABM) and their potential to test large-scale evacuation strategies in coastal cities at risk from flood events due to extreme hydro-meteorological events, with the final purpose of disaster risk reduction by decreasing humans' exposure to the hazard. The first part of the paper covers the theory used to build the models, such as Complex Adaptive Systems (CAS) and the principles and uses of ABM in this field, and outlines the pros and cons of using ABM to test city evacuation strategies at medium and large scale. The second part of the paper focuses on the central theory used to build the ABM, specifically the psychological and behavioral model as well as the framework used in this research, the PECS reference model; this part also covers the main attributes or characteristics of human beings used to describe the agents. The third part of the paper shows the methodology used to build and implement the ABM model using Repast Simphony as an open-source agent-based modelling and simulation platform. The preliminary results of the first implementation, in a region of the island of Sint-Maarten, a Dutch Caribbean island, are presented and discussed in the fourth section of the paper. The results obtained so far are promising for further development of the model and its implementation and testing in a full-scale city.

  20. NEVESIM: event-driven neural simulation framework with a Python interface.

    PubMed

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291
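
    The core of an event-driven simulator of this kind is a priority queue of pending spike events; the sketch below shows the idea with a three-neuron toy network. It is an illustration, not NEVESIM's implementation, and the weights, delays and thresholds are assumed.

```python
# Event-driven spiking simulation: spikes are events in a priority queue, delivered
# to target neurons which may in turn schedule new spikes (illustrative).
import heapq

delays = {0: {1: 2.0}, 1: {2: 1.5}, 2: {}}     # synaptic delays in ms, assumed
weight = 0.5                                   # single assumed synaptic weight
threshold = 1.0
potentials = {0: 0.0, 1: 0.6, 2: 0.6}          # membrane potentials, assumed
events = [(0.0, 0)]                            # (time, neuron): one externally triggered spike

while events:
    t, src = heapq.heappop(events)             # always process the earliest pending spike
    print(f"t={t:.1f} ms: neuron {src} spikes")
    for tgt, delay in delays[src].items():
        potentials[tgt] += weight
        if potentials[tgt] >= threshold:       # target crosses threshold: reset and schedule
            potentials[tgt] = 0.0
            heapq.heappush(events, (t + delay, tgt))
```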

  1. NEVESIM: event-driven neural simulation framework with a Python interface

    PubMed Central

    Pecevski, Dejan; Kappel, David; Jonke, Zeno

    2014-01-01

    NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291

  2. CrusDe: A plug-in based simulation framework for composable CRUStal DEformation simulations

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.

    2008-12-01

    Within geoscience, Green's method is an established mathematical tool to analyze the dynamics of the Earth's crust in response to the application of a mass force, e.g. a surface load. Different abstractions from the Earth's interior as well as the particular effects caused by such a force are expressed by means of a Green's function, G, which is a particular solution to an inhomogeneous differential equation with boundary conditions. Surface loads, L, are defined by real data or as analytical expressions. The response of the crust to a surface load is gained by a 2D convolution (**) of the Green's function with this load. The crustal response can be thought of as an instantaneous displacement which is followed by a gradual transition towards the final relaxed state of displacement. A relaxation function, R, describing such a transition depends on the rheological model for the ductile layer of the crust. The 1D convolution (*) of the relaxation function with a load history, H, allows the temporal evolution of the surface load to be included in a model. The product of the two convolution results expresses the displacement (rate) of the crust, U, at a certain time t: U_t = (R * H)_t · (G ** L). Rather than implementing a variety of specific models, approaching crustal deformation problems from this general formulation opens the opportunity to consider reuse of model building blocks within a more flexible simulation framework. Model elements (Green's function, load function, etc.), operators, pre- and postprocessing, and even input and output routines could be part of a framework that enables a user to freely compose software components to resemble this equation. The simulation framework CrusDe implements the equation in the proposed way. CrusDe's architecture defines interfaces for generic communication between the simulation core and the model elements. Thus, exchangeability of the particular model element implementations is possible. In the presented plug
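
    The displacement expression U_t = (R * H)_t · (G ** L) can be sketched numerically as a 1D temporal convolution scaling a 2D spatial convolution, as below. All functions and grids are toy assumptions, not CrusDe's plug-ins.

```python
# Toy evaluation of U_t = (R * H)_t * (G ** L): a temporal convolution of a
# relaxation function with a load history, scaling a spatial Green's-function
# convolution with a surface load.
import numpy as np
from scipy.signal import fftconvolve

coords = np.linspace(-3.0, 3.0, 64)
G = np.exp(-coords[None, :] ** 2 - coords[:, None] ** 2)    # toy Green's function
L = np.zeros((64, 64))
L[28:36, 28:36] = 1.0                                       # toy surface load
spatial = fftconvolve(G, L, mode="same")                    # G ** L (2D convolution)

tau, dt, nt = 5.0, 1.0, 50
R = np.exp(-np.arange(nt) * dt / tau)                       # toy relaxation function
H = np.ones(nt)                                             # load history: switched on at t=0
temporal = np.convolve(R, H)[:nt] * dt                      # (R * H)_t (1D convolution)

U_final = temporal[-1] * spatial                            # displacement field at the last step
print(U_final.shape, round(float(U_final.max()), 3))
```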

  3. Improving Agent Based Models and Validation through Data Fusion

    PubMed Central

    Laskowski, Marek; Demianyk, Bryan C.P.; Friesen, Marcia R.; McLeod, Robert D.; Mukhi, Shamir N.

    2011-01-01

    This work is contextualized in research in modeling and simulation of infection spread within a community or population, with the objective to provide a public health and policy tool in assessing the dynamics of infection spread and the qualitative impacts of public health interventions. This work uses the integration of real data sources into an Agent Based Model (ABM) to simulate respiratory infection spread within a small municipality. Novelty is derived in that the data sources are not necessarily obvious within ABM infection spread models. The ABM is a spatial-temporal model inclusive of behavioral and interaction patterns between individual agents on a real topography. The agent behaviours (movements and interactions) are fed by census / demographic data, integrated with real data from a telecommunication service provider (cellular records) and person-person contact data obtained via a custom 3G Smartphone application that logs Bluetooth connectivity between devices. Each source provides data of varying type and granularity, thereby enhancing the robustness of the model. The work demonstrates opportunities in data mining and fusion that can be used by policy and decision makers. The data become real-world inputs into individual SIR disease spread models and variants, thereby building credible and non-intrusive models to qualitatively simulate and assess public health interventions at the population level. PMID:23569606

  4. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    NASA Technical Reports Server (NTRS)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent-Based Models (ABM) to optimize large-scale network handling capabilities for large system inventories and to implement strategies for the purpose of reducing capital expenses. The models used in this paper employ computational algorithms and procedural implementations developed in Matlab to simulate agent-based models, using computing clusters as a high-performance resource to run the programs in parallel. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.

  5. Agent-Based Modeling of Cancer Stem Cell Driven Solid Tumor Growth.

    PubMed

    Poleszczuk, Jan; Macklin, Paul; Enderling, Heiko

    2016-01-01

    Computational modeling of tumor growth has become an invaluable tool to simulate complex cell-cell interactions and emerging population-level dynamics. Agent-based models are commonly used to describe the behavior and interaction of individual cells in different environments. Behavioral rules can be informed and calibrated by in vitro assays, and emerging population-level dynamics may be validated with both in vitro and in vivo experiments. Here, we describe the design and implementation of a lattice-based agent-based model of cancer stem cell driven tumor growth. PMID:27044046
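
    A minimal sketch of a lattice-based, stem-cell-driven growth rule of the kind described: stem cells divide symmetrically with a small probability and otherwise produce progenitors with a limited division budget, and daughters occupy free neighbouring sites. The probabilities and lattice size are assumptions, not the published calibration.

```python
# Lattice-based agent model of stem-cell-driven growth (illustrative parameters).
import random

random.seed(1)
N, p_symmetric, progenitor_budget = 50, 0.1, 5
lattice = {(N // 2, N // 2): None}          # (x, y) -> remaining divisions; None marks a stem cell

def free_neighbor(x, y):
    sites = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
    free = [s for s in sites if s not in lattice and 0 <= s[0] < N and 0 <= s[1] < N]
    return random.choice(free) if free else None

for _ in range(200):                        # simulation steps
    for (x, y), divs in list(lattice.items()):
        site = free_neighbor(x, y)
        if site is None or divs == 0:
            continue                        # no free space, or progenitor budget exhausted
        if divs is None:                    # stem cell: symmetric or asymmetric division
            lattice[site] = None if random.random() < p_symmetric else progenitor_budget
        else:                               # progenitor: both cells lose one division
            lattice[(x, y)] = divs - 1
            lattice[site] = divs - 1
print(len(lattice), "cells")
```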

  6. Demeter, persephone, and the search for emergence in agent-based models.

    SciTech Connect

    North, M. J.; Howe, T. R.; Collier, N. T.; Vos, J. R.; Decision and Information Sciences; Univ. of Chicago; PantaRei Corp.; Univ. of Illinois

    2006-01-01

    In Greek mythology, the earth goddess Demeter was unable to find her daughter Persephone after Persephone was abducted by Hades, the god of the underworld. Demeter is said to have embarked on a long and frustrating, but ultimately successful, search to find her daughter. Unfortunately, long and frustrating searches are not confined to Greek mythology. In modern times, agent-based modelers often face similar troubles when searching for agents that are to be connected to one another and when seeking appropriate target agents while defining agent behaviors. The result is a 'search for emergence' in that many emergent or potentially emergent behaviors in agent-based models of complex adaptive systems either implicitly or explicitly require search functions. This paper considers a new nested querying approach to simplifying such agent-based modeling and multi-agent simulation search problems.

  7. Developing and Implementing a Framework of Participatory Simulation for Mobile Learning Using Scaffolding

    ERIC Educational Resources Information Center

    Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen

    2013-01-01

    This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…

  8. Framework for bringing realistic virtual natural environments to distributed simulations

    NASA Astrophysics Data System (ADS)

    Whitney, David A.; Reynolds, Robert A.; Olson, Stephen H.; Sherer, Dana Z.; Driscoll, Mavis L.; Watman, K. L.

    1997-06-01

    One of the major new technical challenges for distributed simulations is the distribution and presentation of the natural atmosphere-ocean-space environment. The natural terrain environment has been a part of such simulations for a while, but the integration of atmosphere and ocean data and effects is quite new. The DARPA synthetic environments (SE) program has been developing and demonstrating advanced technologies for providing tactically significant atmosphere-ocean data and effects for a range of simulations. A general-purpose data collection, assimilation, management, and distribution system is being developed by the TAOS (Total Atmosphere-Ocean System) Project. This system is designed to support the new high level architecture (HLA)/run-time infrastructure (RTI) being developed by the Defense Modeling and Simulation Office (DMSO), as well as existing distributed interactive simulation (DIS) network protocols. This paper describes how synthetic natural environments are being integrated by TAOS to provide an increasingly rich, dynamic synthetic natural environment. Architectural designs and implementations to accommodate a range of simulation applications are discussed. A number of enabling technologies are employed, such as the development of standards for gridded data distribution, and the inclusion of derived products and local environmental features within 4-dimensional data grids. The application of TAOS for training, analysis, and engineering simulations for sensor analysis is discussed.

  9. Consentaneous Agent-Based and Stochastic Model of the Financial Markets

    PubMed Central

    Gontis, Vygintas; Kononovicius, Aleksejus

    2014-01-01

    We seek an agent-based treatment of the financial markets, considering the necessity to build bridges between microscopic, agent-based, and macroscopic, phenomenological modeling. The acknowledgment that the agent-based modeling framework, which may provide qualitative and quantitative understanding of the financial markets, is very ambiguous emphasizes the exceptional value of well-defined, analytically tractable agent systems. Herding, one of the behavioral peculiarities considered in behavioral finance, is the main property of the agent interactions we deal with in this contribution. Looking for a consentaneous agent-based and macroscopic approach, we combine two origins of noise: an exogenous one, related to the information flow, and an endogenous one, arising from the complex stochastic dynamics of agents. As a result we propose a three-state agent-based herding model of the financial markets. From this agent-based model we derive a set of stochastic differential equations, which describes the underlying macroscopic dynamics of the agent population and the log price in the financial markets. The obtained solution is then subjected to the exogenous noise, which shapes instantaneous return fluctuations. We test both Gaussian and q-Gaussian noise as a source of the short-term fluctuations. The resulting model of the return in the financial markets, with the same set of parameters, reproduces the empirical probability and spectral densities of absolute return observed in the New York, Warsaw and NASDAQ OMX Vilnius Stock Exchanges. Our result confirms the prevalent idea in behavioral finance that herding interactions may be dominant over agent rationality and contribute towards bubble formation. PMID:25029364
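
    As an illustration of herding-driven population dynamics (here a two-state, Kirman-style variant rather than the paper's three-state model), the sketch below lets agents switch strategy at a rate combining an idiosyncratic term and a herding term proportional to the size of the other group. All parameter values are assumptions.

```python
# Kirman-style herding between two trader groups (illustrative parameters).
import random

random.seed(0)
N, n_chartists = 100, 50                      # agents and initial chartist count, assumed
epsilon, h, dt = 0.001, 0.01, 0.1             # idiosyncratic rate, herding rate, time step, assumed
fractions = []
for _ in range(50000):
    p_to_chartist = (epsilon + h * n_chartists) * dt
    p_to_fundamentalist = (epsilon + h * (N - n_chartists)) * dt
    if random.random() < n_chartists / N:     # a randomly met agent is a chartist
        if random.random() < p_to_fundamentalist:
            n_chartists -= 1
    elif random.random() < p_to_chartist:     # the agent is a fundamentalist
        n_chartists += 1
    fractions.append(n_chartists / N)
print(min(fractions), max(fractions))         # the population drifts between herding regimes
```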

  10. Simulation framework for space environment ground test fidelity

    NASA Astrophysics Data System (ADS)

    Cline, Jason A.; Quenneville, Jason; Taylor, Ramona S.; Deschenes, Timothy; Braunstein, Matthew; Legner, Hartmut; Green, B. D.

    2013-09-01

    We present initial work to develop an extensible model for spacecraft environmental interactions. The starting point for model development is a rarefied gas dynamics model for hyperthermal atomic oxygen. The space environment produces a number of challenging stimuli, including atomic oxygen, but also charged particles, magnetic fields, spacecraft charging, ultraviolet radiation, micrometeoroids, and cryogenic temperatures. Moreover, the responses of spacecraft to combinations or sequences of these stimuli are different from their responses to single stimuli. New multi-stimulus test facilities such as the Space Threat Assessment Testbed at the USAF Arnold Engineering Development Complex make understanding the similarities and differences between terrestrial test and on-orbit conditions increasingly relevant. The extensible model framework under development is intended to host the variety of models needed to describe the multiphysics environment, allowing them to interact to produce a consistent unified picture. The model framework will host modules that can be validated individually or in combination.

  11. Framework Application for Core-Edge Transport Simulations

    2007-06-13

    FACETS is a whole-device model for magnetic-fusion experiments (including ITER) combining physics effects from sources & sinks, wall effects, edge effects, and core effects in an advanced parallel framework which manages allocation of parallel resources, performs runtime performance analysis, and provides tools for interactive steering and visualization. FACETS will be used by fusion researchers to design experimental campaigns, predict and model fusion experimental phenomena, and design and optimize future machines.

  12. Agent-Based Mapping of Credit Risk for Sustainable Microfinance

    PubMed Central

    Lee, Joung-Hun; Jusup, Marko; Podobnik, Boris; Iwasa, Yoh

    2015-01-01

    By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk---a major obstacle to the sustainability of lenders outreaching to the poor. Specifically, using the elements of network theory, we constructed an agent-based model that obeys the stylized rules of microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital. PMID:25945790

  13. Agent-based mapping of credit risk for sustainable microfinance.

    PubMed

    Lee, Joung-Hun; Jusup, Marko; Podobnik, Boris; Iwasa, Yoh

    2015-01-01

    By drawing analogies with independent research areas, we propose an unorthodox framework for mapping microfinance credit risk--a major obstacle to the sustainability of lenders outreaching to the poor. Specifically, using the elements of network theory, we constructed an agent-based model that obeys the stylized rules of microfinance industry. We found that in a deteriorating economic environment confounded with adverse selection, a form of latent moral hazard may cause a regime shift from a high to a low loan payment probability. An after-the-fact recovery, when possible, required the economic environment to improve beyond that which led to the shift in the first place. These findings suggest a small set of measurable quantities for mapping microfinance credit risk and, consequently, for balancing the requirements to reasonably price loans and to operate on a fully self-financed basis. We illustrate how the proposed mapping works using a 10-year monthly data set from one of the best-known microfinance representatives, Grameen Bank in Bangladesh. Finally, we discuss an entirely new perspective for managing microfinance credit risk based on enticing spontaneous cooperation by building social capital. PMID:25945790

  14. Synthesized Population Databases: A US Geospatial Database for Agent-Based Models

    PubMed Central

    Wheaton, William D.; Cajka, James C.; Chasteen, Bernadette M.; Wagener, Diane K.; Cooley, Philip C.; Ganapathi, Laxminarayana; Roberts, Douglas J.; Allpress, Justine L.

    2010-01-01

    Agent-based models simulate large-scale social systems. They assign behaviors and activities to “agents” (individuals) within the population being modeled and then allow the agents to interact with the environment and each other in complex simulations. Agent-based models are frequently used to simulate infectious disease outbreaks, among other uses. RTI used and extended an iterative proportional fitting method to generate a synthesized, geospatially explicit, human agent database that represents the US population in the 50 states and the District of Columbia in the year 2000. Each agent is assigned to a household; other agents make up the household occupants. For this database, RTI developed the methods for generating synthesized households and persons; assigning agents to schools and workplaces so that complex interactions among agents as they go about their daily activities can be taken into account; and generating synthesized human agents who occupy group quarters (military bases, college dormitories, prisons, nursing homes). In this report, we describe both the methods used to generate the synthesized population database and the final data structure and data content of the database. This information will provide researchers with the information they need to use the database in developing agent-based models. Portions of the synthesized agent database are available to any user upon request. RTI will extract a portion (a county, region, or state) of the database for users who wish to use this database in their own agent-based models. PMID:20505787

  15. E-laboratories : agent-based modeling of electricity markets.

    SciTech Connect

    North, M.; Conzelmann, G.; Koritarov, V.; Macal, C.; Thimmapuram, P.; Veselka, T.

    2002-05-03

    Electricity markets are complex adaptive systems that operate under a wide range of rules that span a variety of time scales. These rules are imposed both from above by society and below by physics. Many electricity markets are undergoing or are about to undergo a transition from centrally regulated systems to decentralized markets. Furthermore, several electricity markets have recently undergone this transition with extremely unsatisfactory results, most notably in California. These high stakes transitions require the introduction of largely untested regulatory structures. Suitable laboratories that can be used to test regulatory structures before they are applied to real systems are needed. Agent-based models can provide such electronic laboratories or ''e-laboratories.'' To better understand the requirements of an electricity market e-laboratory, a live electricity market simulation was created. This experience helped to shape the development of the Electricity Market Complex Adaptive Systems (EMCAS) model. To explore EMCAS' potential as an e-laboratory, several variations of the live simulation were created. These variations probed the possible effects of changing power plant outages and price setting rules on electricity market prices.

  16. An agent-based mathematical model about carp aggregation

    NASA Astrophysics Data System (ADS)

    Liang, Yu; Wu, Chao

    2005-05-01

    This work presents an agent-based mathematical model to simulate the aggregation of carp, a harmful fish in North America. The mathematical model is derived from the following assumptions: (1) rather than arising from consensus among the carp involved, aggregation is a completely random and spontaneous physical behavior of numerous independent carp; (2) carp aggregation is a collective effect of inter-carp and carp-environment interaction; (3) the inter-carp interaction can be derived from statistical analysis of large-scale observed data. The proposed mathematical model is mainly based on an empirical inter-carp force field, which features repulsion, parallel orientation, attraction, an out-of-perception zone, and a blind zone. Based on the above mathematical model, the aggregation behavior of carp is formulated, and preliminary simulation results for the aggregation of a small number of carp within a simple environment are provided. Further experiment-based validation of the mathematical model will be carried out in future work.
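
    A minimal sketch of a zonal inter-carp force field of the kind described, with repulsion, parallel orientation, attraction, and an out-of-perception zone; the zone radii and gains are assumptions rather than the paper's empirically derived field.

```python
# Zone-based inter-agent influence: repel at short range, align at intermediate
# range, attract at longer range, ignore neighbours beyond perception (illustrative).
import numpy as np

R_REPULSE, R_ALIGN, R_ATTRACT = 1.0, 3.0, 8.0    # zone radii (m), assumed

def neighbor_influence(dx, v_self, v_other):
    """Velocity change induced on one carp by a neighbour at positive offset dx."""
    d = np.linalg.norm(dx)
    if d < R_REPULSE:
        return -0.5 * dx / d                      # move away from the neighbour
    if d < R_ALIGN:
        return 0.3 * (v_other - v_self)           # align with the neighbour's heading
    if d < R_ATTRACT:
        return 0.1 * dx / d                       # move toward the neighbour
    return np.zeros(2)                            # outside the perception zone

print(neighbor_influence(np.array([0.5, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])))
```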

  17. Abdominal surgery process modeling framework for simulation using spreadsheets.

    PubMed

    Boshkoska, Biljana Mileva; Damij, Talib; Jelenc, Franc; Damij, Nadja

    2015-08-01

    We provide a continuation of the existing Activity Table Modeling methodology with a modular spreadsheet simulation. The simulation model developed comprises 28 modeling elements for the abdominal surgery cycle process. The simulation of a two-week patient flow in an abdominal clinic with 75 beds demonstrates the applicability of the methodology. The simulation does not include macros, so programming experience is not essential for replicating or upgrading the model. Unlike existing methods, the proposed solution employs a modular approach for modeling the activities that ensures better readability, the possibility of easily upgrading the model with other activities, and easy extension and connectivity with other similar models. We propose a first-in-first-served approach for simulating the servicing of multiple patients. The uncertain time duration of the activities is modeled using the function "rand()". The patients' movements from one activity to the next are tracked with nested "if()" functions, thus allowing easy re-creation of the process without the need for complex programming. PMID:26004999
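
    For illustration, the same first-in-first-served logic that the nested if()/rand() spreadsheet formulas encode can be written as a short script; the activity names and duration ranges below are assumptions.

```python
# First-in-first-served patient flow through a sequence of activities with
# uniformly random durations (illustrative activities and ranges).
import random

random.seed(2)
activities = [("admission", 10, 30), ("surgery", 60, 180), ("recovery", 120, 360)]  # minutes
free_at = {name: 0.0 for name, _, _ in activities}      # earliest free time per resource

def simulate_patient(arrival):
    t = arrival
    for name, lo, hi in activities:
        start = max(t, free_at[name])                    # wait until the resource is free
        t = start + random.uniform(lo, hi)               # rand()-style activity duration
        free_at[name] = t                                # first-in-first-served queue
    return t

discharges = [simulate_patient(arrival=60.0 * i) for i in range(5)]
print([round(d, 1) for d in discharges])
```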

  18. Hierarchical Petascale Simulation Framework for Stress Corrosion Cracking

    SciTech Connect

    Vashishta, Priya

    2014-12-01

    Reaction Dynamics in Energetic Materials: Detonation is a prototype of mechanochemistry, in which mechanically and thermally induced chemical reactions far from equilibrium exhibit vastly different behaviors. It is also one of the hardest multiscale physics problems, in which diverse length and time scales play important roles. The CACS group has performed multimillion-atom reactive MD simulations to reveal a novel two-stage reaction mechanism during the detonation of cyclotrimethylenetrinitramine (RDX) crystal. Rapid production of N2 and H2O within ~10 ps is followed by delayed production of CO molecules within ~ 1 ns. They found that further decomposition towards the final products is inhibited by the formation of large metastable C- and O-rich clusters with fractal geometry. The CACS group has also simulated the oxidation dynamics of close-packed aggregates of aluminum nanoparticles passivated by oxide shells. Their simulation results suggest an unexpectedly active role of the oxide shell as a nanoreactor.

  19. Agent-based modeling of complex infrastructures

    SciTech Connect

    North, M. J.

    2001-06-01

    Complex Adaptive Systems (CAS) can be applied to investigate complex infrastructures and infrastructure interdependencies. The CAS model agents within the Spot Market Agent Research Tool (SMART) and Flexible Agent Simulation Toolkit (FAST) allow investigation of the electric power infrastructure, the natural gas infrastructure and their interdependencies.

  20. NPTool: a simulation and analysis framework for low-energy nuclear physics experiments

    NASA Astrophysics Data System (ADS)

    Matta, A.; Morfouace, P.; de Séréville, N.; Flavigny, F.; Labiche, M.; Shearman, R.

    2016-08-01

    The Nuclear Physics Tool (NPTool) is an open source data analysis and Monte Carlo simulation framework that has been developed for low-energy nuclear physics experiments with an emphasis on radioactive beam experiments. The NPTool offers a unified framework for designing, preparing and analyzing complex experiments employing multiple detectors, each of which may comprise some hundreds of channels. The framework has been successfully used for the analysis and simulation of experiments at facilities including GANIL, RIKEN, ALTO and TRIUMF, using both stable and radioactive beams. This paper details the NPTool philosophy together with an overview of the workflow. The framework has been benchmarked through the comparison of simulated and experimental data for a variety of detectors used in charged particle and gamma-ray spectroscopy.

  1. Maestro: an orchestration framework for large-scale WSN simulations.

    PubMed

    Riliskis, Laurynas; Osipov, Evgeny

    2014-01-01

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation. PMID:24647123

  2. Maestro: An Orchestration Framework for Large-Scale WSN Simulations

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2014-01-01

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation. PMID:24647123

  3. NASA Earth Observing System Simulator Suite (NEOS3): A Forward Simulation Framework for Observing System Simulation Experiments

    NASA Astrophysics Data System (ADS)

    Niamsuwan, N.; Tanelli, S.; Johnson, M. P.; Jacob, J. C.; Jaruwatanadilok, S.; Oveisgharan, S.; Dao, D.; Simard, M.; Turk, F. J.; Tsang, L.; Liao, T. H.; Chau, Q.

    2014-12-01

    Future Earth observation missions will produce a large volume of interrelated data sets that will help us to cross-calibrate and validate spaceborne sensor measurements. A forward simulator is a crucial tool for examining the quality of individual products as well as resolving discrepancies among related data sets. NASA Earth Observing System Simulator Suite (NEOS3) is a highly customizable forward simulation tool for Earth remote sensing instruments. Its three-stage simulation process converts the 3D geophysical description of the scene being observed to corresponding electromagnetic emission and scattering signatures, and finally to observable parameters as reported by a (passive or active) remote sensing instrument. User-configurable options include selection of models for describing geophysical properties of atmospheric particles and their effects on the signal of interest, selection of wave scattering and propagation models, and activation of simplifying assumptions (trading between computation time and solution accuracy). The next generation of NEOS3, to be released in 2015, will feature additional state-of-the-art electromagnetic scattering models for various types of the Earth's surfaces and ground covers (e.g. layered snowpack, forest, vegetated soil, and sea ice) tailored specifically for missions like GPM and SMAP. Also to be included in 2015 are dedicated functionalities and an interface that facilitate integrating NEOS3 into Observing System Simulation Experiment (OSSE) environments. This new generation of NEOS3 can also utilize high-performance computing resources (parallel processing and cloud computing) and can be scaled to handle large or computation-intensive problems. This presentation will highlight some notable features of NEOS3. Demonstration of its applications for evaluating new mission concepts, especially in the context of OSSE frameworks, will also be presented.

  4. Designing a Virtual Olympic Games Framework by Using Simulation in Web 2.0 Technologies

    ERIC Educational Resources Information Center

    Stoilescu, Dorian

    2013-01-01

    Instructional simulation had major difficulties in the past for offering limited possibilities in practice and learning. This article proposes a link between instructional simulation and Web 2.0 technologies. More exactly, I present the design of the Virtual Olympic Games Framework (VOGF), as a significant demonstration of how interactivity in…

  5. NISAC Agent Based Laboratory for Economics

    SciTech Connect

    Downes, Paula; Davis, Chris; Eidson, Eric; Ehlen, Mark; Gieseler, Charles; Harris, Richard

    2006-10-11

    The software provides large-scale microeconomic simulation of complex economic and social systems (such as supply chain and market dynamics of businesses in the US economy) and their dependence on physical infrastructure systems. The system is based on Agent simulation, where each entity of interest in the system to be modeled (for example, a Bank, individual firms, Consumer households, etc.) is specified in a data-driven sense to be individually represented by an Agent. The Agents interact using rules of interaction appropriate to their roles, and through those interactions complex economic and social dynamics emerge. The software is implemented in three tiers: a Java-based visualization client, a C++ control mid-tier, and a C++ computational tier.

  6. NISAC Agent Based Laboratory for Economics

    2006-10-11

    The software provides large-scale microeconomic simulation of complex economic and social systems (such as supply chain and market dynamics of businesses in the US economy) and their dependence on physical infrastructure systems. The system is based on Agent simulation, where each entity of interest in the system to be modeled (for example, a Bank, individual firms, Consumer households, etc.) is specified in a data-driven sense to be individually represented by an Agent. The Agents interact using rules of interaction appropriate to their roles, and through those interactions complex economic and social dynamics emerge. The software is implemented in three tiers: a Java-based visualization client, a C++ control mid-tier, and a C++ computational tier.

  7. A framework for service enterprise workflow simulation with multi-agents cooperation

    NASA Astrophysics Data System (ADS)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analyzing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses that uses multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework; adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  8. Adsorption of selected gases on metal-organic frameworks and covalent organic frameworks: A comparative grand canonical Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Wang, Lili; Wang, Lu; Zhao, Jijun; Yan, Tianying

    2012-06-01

    The adsorption properties of H2, CO, NO, and NO2 in several typical nanoporous materials (covalent organic framework (COF)-105, COF-108, metal-organic framework (MOF)-5, and MOF-177) at 298 K were investigated by grand canonical Monte Carlo simulations. Good agreement between simulated results and experimental data has been achieved for H2 adsorption on MOF-5 and MOF-177, indicating the reliability of the theoretical approach. The simulated adsorption isotherms for these four gases show an analogous trend, i.e., increasing nearly linearly with pressure. Among the four host materials, COF-108 exhibits the highest hydrogen uptake (˜0.89 wt. % at 100 bars) owing to its low density and high surface area. The adsorption amounts of NO2 in these materials are higher than those of the other three gases because of the stronger gas-sorbent interaction. In particular, the NO2 adsorption amount in MOF-177 can reach as high as 10.7 mmol/g at 298 K and 10 bars. The interaction between the four gases (H2, CO, NO, and NO2) and the COF/MOF adsorbents is further discussed in terms of the isosteric heat.
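
    For readers unfamiliar with the method behind these isotherms, the following is a minimal grand canonical Monte Carlo sketch in Python for a non-interacting gas in a box, showing the standard insertion/deletion acceptance rules. The reduced units and parameter values are illustrative only; real COF/MOF simulations add framework-guest and guest-guest interaction energies to the acceptance tests.

```python
# Minimal grand canonical Monte Carlo sketch for a non-interacting gas in a box.
# With no interactions the average particle number should approach
# <N> = exp(beta*mu) * V / Lambda^3; adsorption simulations add energy changes dU.
import math, random

beta = 1.0       # 1/kT (reduced units)
mu = -2.0        # chemical potential (illustrative)
V = 1000.0       # box volume
Lambda3 = 1.0    # thermal de Broglie wavelength cubed

N = 0
samples = []
for step in range(200_000):
    dU = 0.0  # ideal gas: no interaction energy change
    if random.random() < 0.5:   # attempt insertion
        acc = (V / (Lambda3 * (N + 1))) * math.exp(beta * (mu - dU))
        if random.random() < min(1.0, acc):
            N += 1
    elif N > 0:                 # attempt deletion (rejected automatically if N == 0)
        acc = (Lambda3 * N / V) * math.exp(-beta * (mu + dU))
        if random.random() < min(1.0, acc):
            N -= 1
    if step > 50_000:           # discard equilibration steps
        samples.append(N)

print("simulated <N> =", sum(samples) / len(samples))
print("ideal-gas  <N> =", math.exp(beta * mu) * V / Lambda3)
```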

  9. Atomistic Simulation of Protein Encapsulation in Metal-Organic Frameworks.

    PubMed

    Zhang, Haiyang; Lv, Yongqin; Tan, Tianwei; van der Spoel, David

    2016-01-28

    Fabrication of metal-organic frameworks (MOFs) with large apertures triggers a brand-new research area for selective encapsulation of biomolecules within MOF nanopores. The underlying inclusion mechanism is yet to be clarified however. Here we report a molecular dynamics study on the mechanism of protein encapsulation in MOFs. Evaluation for the binding of amino acid side chain analogues reveals that van der Waals interaction is the main driving force for the binding and that guest size acts as a key factor predicting protein binding with MOFs. Analysis on the conformation and thermodynamic stability of the miniprotein Trp-cage encapsulated in a series of MOFs with varying pore apertures and surface chemistries indicates that protein encapsulation can be achieved via maintaining a polar/nonpolar balance in the MOF surface through tunable modification of organic linkers and Mg-O chelating moieties. Such modifications endow MOFs with a more biocompatible confinement. This work provides guidelines for selective inclusion of biomolecules within MOFs and facilitates MOF functions as a new class of host materials and molecular chaperones. PMID:26730607

  10. Measure of Landscape Heterogeneity by Agent-Based Methodology

    NASA Astrophysics Data System (ADS)

    Wirth, E.; Szabó, Gy.; Czinkóczky, A.

    2016-06-01

    With the rapid increase of the world's population, efficient food production is one of the key factors in human survival. Since biodiversity and heterogeneity are the basis of sustainable agriculture, the authors tried to measure the heterogeneity of a chosen landscape. The EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity; nevertheless, exact measurements and calculations, apart from statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. This is achieved with so-called agent-based modelling, where randomly dispatched dynamic scouts record the observed land-cover parameters and sum up the features of each new type of land. During the simulation the agents collect a Monte Carlo integral as a diversity landscape potential, which can be considered the unit of the 'greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.
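
    As an illustration of the scout idea (not the authors' code), the sketch below dispatches random-walking agents over a synthetic land-cover grid and converts the sampled class frequencies into a Shannon-entropy heterogeneity score, standing in for the Monte Carlo diversity potential described above; the grid, class weights, and walk length are assumptions.

```python
# Illustrative sketch: random "scout" agents sample a land-cover grid and the
# sampled class frequencies are turned into a Shannon diversity score.
import math, random

random.seed(1)
classes = ["arable", "grassland", "forest", "water"]
size = 100
# synthetic land-cover map: mostly arable with patches of the other classes
landcover = [[random.choices(classes, weights=[70, 15, 10, 5])[0]
              for _ in range(size)] for _ in range(size)]

def scout_diversity(landcover, n_scouts=500, steps=50):
    counts = {}
    for _ in range(n_scouts):
        x, y = random.randrange(size), random.randrange(size)
        for _ in range(steps):                       # random walk of one scout
            counts[landcover[y][x]] = counts.get(landcover[y][x], 0) + 1
            x = (x + random.choice((-1, 0, 1))) % size
            y = (y + random.choice((-1, 0, 1))) % size
    total = sum(counts.values())
    # Shannon entropy of the sampled cover classes as a heterogeneity measure
    return -sum((c / total) * math.log(c / total) for c in counts.values())

print("sampled landscape heterogeneity:", round(scout_diversity(landcover), 3))
```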

  11. Agent-based Transaction management for Mobile Multidatabase

    SciTech Connect

    Ongtang, Machigar; Hurson, Ali R.; Jiao, Yu; Potok, Thomas E

    2007-01-01

    The requirements to access and manipulate data across multiple heterogeneous existing databases and the proliferation of mobile technologies have propelled the development of mobile multidatabase systems (MDBSs). In that environment, transaction management is not a trivial task due to the technological constraints. Agent technology is an evolving research area, which has been applied to several application domains. This paper proposes an Agent-based Transaction Management for Mobile Multidatabase (AT3M) system. AT3M applies static and mobile agents to manage transaction processing in a mobile multidatabase system. It enables fully distributed transaction management, accommodates the mobility of mobile clients, and allows global subtransactions to be processed in parallel. The proposed algorithm utilizes the hierarchical metadata structure of the Summary Schema Model (SSM), which captures semantic information about data objects in the underlying local databases at different levels of abstraction. It is shown by simulation that AT3M is well suited to the mobile multidatabase environment and outperforms the existing V-Locking algorithm designed for the same environment in many aspects.

  12. Evaluating Water Demand Using Agent-Based Modeling

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.

    2004-12-01

    The supply and demand of water resources are functions of complex, inter-related systems including hydrology, climate, demographics, economics, and policy. To assess the safety and sustainability of water resources, planners often rely on complex numerical models that relate some or all of these systems using mathematical abstractions. The accuracy of these models relies on how well the abstractions capture the true nature of the systems interactions. Typically, these abstractions are based on analyses of observations and/or experiments that account only for the statistical mean behavior of each system. This limits the approach in two important ways: 1) It cannot capture cross-system disruptive events, such as major drought, significant policy change, or terrorist attack, and 2) it cannot resolve sub-system level responses. To overcome these limitations, we are developing an agent-based water resources model that includes the systems of hydrology, climate, demographics, economics, and policy, to examine water demand during normal and extraordinary conditions. Agent-based modeling (ABM) develops functional relationships between systems by modeling the interaction between individuals (agents), who behave according to a probabilistic set of rules. ABM is a "bottom-up" modeling approach in that it defines macro-system behavior by modeling the micro-behavior of individual agents. While each agent's behavior is often simple and predictable, the aggregate behavior of all agents in each system can be complex, unpredictable, and different than behaviors observed in mean-behavior models. Furthermore, the ABM approach creates a virtual laboratory where the effects of policy changes and/or extraordinary events can be simulated. Our model, which is based on the demographics and hydrology of the Middle Rio Grande Basin in the state of New Mexico, includes agent groups of residential, agricultural, and industrial users. Each agent within each group determines its water usage
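
    A minimal bottom-up sketch of the approach is shown below: individual household agents follow simple probabilistic usage rules and their demands are aggregated, so that scenario changes (here, a drought flag) propagate from micro-behavior to macro-demand. The agent rules and numbers are hypothetical and are not those of the Middle Rio Grande model.

```python
# Minimal bottom-up sketch of agent-based water demand (illustrative only).
import random

random.seed(0)

class ResidentialAgent:
    def __init__(self):
        self.base_use = random.uniform(200, 400)   # liters/day
        self.conservation = random.random() < 0.3  # some agents habitually conserve

    def daily_demand(self, drought=False):
        use = self.base_use * (0.8 if self.conservation else 1.0)
        if drought and random.random() < 0.6:      # probabilistic drought response
            use *= 0.7                             # curtail use during drought
        return use

agents = [ResidentialAgent() for _ in range(10_000)]
normal = sum(a.daily_demand(drought=False) for a in agents)
drought = sum(a.daily_demand(drought=True) for a in agents)
print(f"aggregate demand: normal {normal/1e6:.2f} ML/day, drought {drought/1e6:.2f} ML/day")
```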

  13. Agent Based Intelligence in a Tetrahedral Rover

    NASA Technical Reports Server (NTRS)

    Phelps, Peter; Truszkowski, Walt

    2007-01-01

    A tetrahedron is a 4-node 6-strut pyramid structure which is being used by the NASA - Goddard Space Flight Center as the basic building block for a new approach to robotic motion. The struts are extendable; it is by the sequence of activities: strut-extension, changing the center of gravity and falling that the tetrahedron "moves". Currently, strut-extension is handled by human remote control. There is an effort underway to make the movement of the tetrahedron autonomous, driven by an attempt to achieve a goal. The approach being taken is to associate an intelligent agent with each node. Thus, the autonomous tetrahedron is realized as a constrained multi-agent system, where the constraints arise from the fact that between any two agents there is an extendible strut. The hypothesis of this work is that, by proper composition of such automated tetrahedra, robotic structures of various levels of complexity can be developed which will support more complex dynamic motions. This is the basis of the new approach to robotic motion which is under investigation. A Java-based simulator for the single tetrahedron, realized as a constrained multi-agent system, has been developed and evaluated. This paper reports on this project and presents a discussion of the structure and dynamics of the simulator.

  14. Consistent and conservative framework for incompressible multiphase flow simulations

    NASA Astrophysics Data System (ADS)

    Owkes, Mark; Desjardins, Olivier

    2015-11-01

    We present a computational methodology for convection that handles discontinuities with second order accuracy and maintains conservation to machine precision. We use this method in the context of an incompressible gas-liquid flow to transport the phase interface, momentum, and scalars. Using the same methodology for all the variables ensures discretely consistent transport, which is necessary for robust and accurate simulations of turbulent atomizing flows with high-density ratios. The method achieves conservative transport by computing consistent fluxes on a refined mesh, which ensures all conserved quantities are fluxed with the same discretization. Additionally, the method seamlessly couples semi-Lagrangian fluxes used near the interface with finite difference fluxes used away from the interface. The semi-Lagrangian fluxes are three-dimensional, un-split, and conservatively handle discontinuities. Careful construction of the fluxes ensures they are divergence-free and no gaps or overlaps form between neighbors. We have tested and used the scheme for many cases and demonstrate a simulation of an atomizing liquid jet.

  15. Turbulent Simulations of Divertor Detachment Based On BOUT++ Framework

    NASA Astrophysics Data System (ADS)

    Chen, Bin; Xu, Xueqiao; Xia, Tianyang; Ye, Minyou

    2015-11-01

    The China Fusion Engineering Testing Reactor is under conceptual design, acting as a bridge between ITER and DEMO. Detached divertor operation offers great promise for reducing the heat flux onto divertor target plates to achieve acceptable erosion. Therefore, a density scan is performed via an increase of D2 gas puffing rates in the range of 0.0-5.0 x 10^23 s^-1 using the B2-Eirene/SOLPS 5.0 code package to study heat flux control and the impurity screening property. As the density increases, the divertor operation status changes gradually from the low-recycling regime to the high-recycling regime and finally to detachment. Significant radiation loss inside the confined plasma in the divertor region during detachment leads to strong parallel density and temperature gradients. Based on the SOLPS simulations, BOUT++ simulations will be presented to investigate the stability and turbulent transport under divertor plasma detachment, particularly the strong parallel-gradient-driven instabilities and enhanced plasma turbulence that spread heat flux over larger surface areas. The correlation between outer mid-plane and divertor turbulence and the related transport will be analyzed. Prepared by LLNL under Contract DE-AC52-07NA27344. LLNL-ABS-675075.

  16. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351

  17. Agent-Based Modeling of Growth Processes

    ERIC Educational Resources Information Center

    Abraham, Ralph

    2014-01-01

    Growth processes abound in nature, and are frequently the target of modeling exercises in the sciences. In this article we illustrate an agent-based approach to modeling, in the case of a single example from the social sciences: bullying.

  18. A configurable distributed high-performance computing framework for satellite's TDI-CCD imaging simulation

    NASA Astrophysics Data System (ADS)

    Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang

    2010-11-01

    This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to accommodate multiple algorithms, helping to decrease simulation time at low expense. Imaging simulation for a TDI-CCD mounted on a satellite contains four processes: 1) degradation caused by the atmosphere, 2) degradation caused by the optical system, 3) degradation and re-sampling in the TDI-CCD electronics, and 4) data integration. Processes 1) to 3) use diverse data-intensive algorithms such as FFT, convolution, and Lagrange interpolation, which require powerful CPUs. Even with an Intel Xeon X5550 processor, the conventional serial method takes more than 30 hours for a simulation whose result image size is 1500 x 1462. A literature study found no mature distributed HPC framework in this field. We therefore developed a distributed computing framework for TDI-CCD imaging simulation that is based on WCF [1], uses a client/server (C/S) architecture, and invokes free CPU resources in the LAN. The server pushes the tasks of processes 1) to 3) to that free computing capacity, providing HPC at low cost. In a computing experiment with 4 symmetric nodes and 1 server, this framework reduced simulation time by about 74%, and adding more asymmetric nodes to the computing network decreased the time further. In conclusion, this framework can provide essentially unlimited computation capacity provided that the network and the task-management server are affordable, offering a new HPC solution for TDI-CCD imaging simulation and similar applications.
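
    The framework itself is WCF/C#-based; as a language-neutral stand-in, the sketch below farms independent per-tile degradation tasks out to idle local cores with a Python process pool, illustrating the push-tasks-to-free-capacity idea. The tile decomposition and dummy workload are assumptions.

```python
# Hedged stand-in (not the WCF-based framework itself): farm independent
# image-degradation tasks out to free CPU cores with a process pool.
from concurrent.futures import ProcessPoolExecutor

def degrade_tile(tile_id):
    # placeholder for the atmosphere/optics/TDI-CCD degradation chain on one tile
    acc = 0.0
    for k in range(1, 200_000):
        acc += 1.0 / (k * k)          # dummy FFT/convolution-sized workload
    return tile_id, acc

if __name__ == "__main__":
    tiles = range(16)
    with ProcessPoolExecutor() as pool:    # uses all idle local cores
        for tile_id, _ in pool.map(degrade_tile, tiles):
            print("finished tile", tile_id)
```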

  19. Creating a Software Framework for Simulating Satellite Geolocation

    SciTech Connect

    Koch, Daniel B

    2011-01-01

    It is hard to imagine life these days without having some sort of electronic indication of one's current location. Whether the purpose is for business, personal, or emergency use, utilizing smart cell phones, in-vehicle navigation systems, or location beacons, dependence on the Global Positioning System (GPS) is pervasive. Yet the availability of the GPS should not be taken for granted. Both environmental (e.g., terrain, weather) and intentional interference (i.e., jamming) can reduce or deny satellite access. In order to investigate these and other issues, as well as to explore possible alternative satellite constellations, an application called the Satellite Simulation Toolkit (SatSim) was created. This paper presents a high-level overview of SatSim and an example of how it may be used to study geolocation.

  20. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model’s system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  1. Using a scalable modeling and simulation framework to evaluate the benefits of intelligent transportation systems.

    SciTech Connect

    Ewing, T.; Tentner, A.

    2000-03-21

    A scalable, distributed modeling and simulation framework has been developed at Argonne National Laboratory to study Intelligent Transportation Systems. The framework can run on a single-processor workstation, or run distributed on a multiprocessor computer or network of workstations. The framework is modular and supports plug-in models, hardware, and live data sources. The initial set of models currently includes road network and traffic flow, probe and smart vehicles, traffic management centers, communications between vehicles and centers, in-vehicle navigation systems, and roadway traffic advisories. The modeling and simulation capability has been used to examine proposed ITS concepts. Results are presented from modeling scenarios from the Advanced Driver and Vehicle Advisory Navigation Concept (ADVANCE) experimental program to demonstrate how the framework can be used to evaluate the benefits of ITS and to plan future ITS operational tests and deployment initiatives.

  2. A Collective Case Study of Secondary Students' Model-Based Inquiry on Natural Selection through Programming in an Agent-Based Modeling Environment

    ERIC Educational Resources Information Center

    Xiang, Lin

    2011-01-01

    This is a collective case study seeking to develop detailed descriptions of how programming an agent-based simulation influences a group of 8th grade students' model-based inquiry (MBI) by examining students' agent-based programmable modeling (ABPM) processes and the learning outcomes. The context of the present study was a biology unit on…

  3. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for optimization of terminating discrete-event simulation models. The methodology combines a chance constraint approach for problem formulation, together with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle, through a simulation model.
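
    The chance-constraint idea can be illustrated with a toy example: run replications of a stochastic simulation at each candidate resource level and accept the smallest level whose estimated probability of missing a performance target is below a threshold alpha. The delay model, target, and alpha below are hypothetical, not the launch-vehicle model from the paper.

```python
# Hedged sketch of chance-constrained simulation optimization: find the smallest
# resource level whose estimated probability of missing a turnaround target is
# below alpha. The "simulation" is a crude toy delay model.
import random

def simulate_turnaround(servers, n_jobs=200):
    """One replication: crude delay model returning average turnaround time."""
    busy_until = [0.0] * servers
    t, total = 0.0, 0.0
    for _ in range(n_jobs):
        t += random.expovariate(1.0)               # job arrivals
        k = min(range(servers), key=lambda i: busy_until[i])
        start = max(t, busy_until[k])
        service = random.expovariate(1 / 3.0)      # mean service time of 3
        busy_until[k] = start + service
        total += (start + service) - t
    return total / n_jobs

def violation_probability(servers, target=6.0, reps=200):
    misses = sum(simulate_turnaround(servers) > target for _ in range(reps))
    return misses / reps

alpha = 0.05
for servers in range(1, 10):
    p = violation_probability(servers)
    if p <= alpha:                                  # chance constraint satisfied
        print(f"minimum feasible resource level: {servers} (P(violation) ~ {p:.2f})")
        break
```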

  4. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2010-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures required to enable higher-mass robotic and human missions to Mars. The appendices to the original report are contained in this document.

  5. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 1

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2010-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures required to enable higher-mass robotic and human missions to Mars. The findings of the assessment are contained in this report.

  6. Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis, Phase 2 Results

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2011-01-01

    The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL-Systems Analysis (SA) team that is conducting studies of the technologies and architectures that are required to enable human and higher mass robotic missions to Mars. The findings, observations, and recommendations from the NESC are provided in this report.

  7. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  8. Large-eddy simulation in an anelastic framework with closed water and entropy balances

    NASA Astrophysics Data System (ADS)

    Pressel, Kyle G.; Kaul, Colleen M.; Schneider, Tapio; Tan, Zhihong; Mishra, Siddhartha

    2015-09-01

    A large-eddy simulation (LES) framework is developed for simulating the dynamics of clouds and boundary layers with closed water and entropy balances. The framework is based on the anelastic equations in a formulation that remains accurate for deep convection. As prognostic variables, it uses total water and entropy, which are conserved in adiabatic and reversible processes, including reversible phase changes of water. This has numerical advantages for modeling clouds, in which reversible phase changes of water occur frequently. The equations of motion are discretized using higher-order weighted essentially nonoscillatory (WENO) discretization schemes with strong stability preserving time stepping. Numerical tests demonstrate that the WENO schemes yield simulations superior to centered schemes, even when the WENO schemes are used at coarser resolution. The framework is implemented in a new LES code written in Python and Cython, which makes the code transparent and easy to use for a wide user group.
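
    As a pointer to the kind of scheme referred to above, the following is a generic fifth-order WENO reconstruction of a left-biased interface value using the standard Jiang-Shu smoothness indicators; it is a textbook sketch, not the LES code's implementation.

```python
# Generic fifth-order WENO reconstruction of the interface value f_{i+1/2} from
# the five cell values f_{i-2..i+2} (Jiang-Shu weights and smoothness indicators).
def weno5_left(fm2, fm1, f0, fp1, fp2, eps=1e-6):
    # candidate third-order reconstructions on the three sub-stencils
    p0 = (2*fm2 - 7*fm1 + 11*f0) / 6.0
    p1 = (-fm1 + 5*f0 + 2*fp1) / 6.0
    p2 = (2*f0 + 5*fp1 - fp2) / 6.0
    # smoothness indicators
    b0 = 13/12*(fm2 - 2*fm1 + f0)**2 + 0.25*(fm2 - 4*fm1 + 3*f0)**2
    b1 = 13/12*(fm1 - 2*f0 + fp1)**2 + 0.25*(fm1 - fp1)**2
    b2 = 13/12*(f0 - 2*fp1 + fp2)**2 + 0.25*(3*f0 - 4*fp1 + fp2)**2
    # nonlinear weights built from the linear weights 1/10, 6/10, 3/10
    a0, a1, a2 = 0.1/(eps + b0)**2, 0.6/(eps + b1)**2, 0.3/(eps + b2)**2
    s = a0 + a1 + a2
    return (a0*p0 + a1*p1 + a2*p2) / s

# smooth data reproduces the high-order interface value; near a jump the weights
# automatically bias toward the smoothest sub-stencil
print(weno5_left(1.0, 1.1, 1.2, 1.3, 1.4))   # ~1.25 (midpoint value)
print(weno5_left(0.0, 0.0, 0.0, 1.0, 1.0))   # avoids the oscillatory stencils
```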

  9. SIMPEG: An open source framework for simulation and gradient based parameter estimation in geophysical applications

    NASA Astrophysics Data System (ADS)

    Cockett, Rowan; Kang, Seogi; Heagy, Lindsey J.; Pidlisecky, Adam; Oldenburg, Douglas W.

    2015-12-01

    Inverse modeling is a powerful tool for extracting information about the subsurface from geophysical data. Geophysical inverse problems are inherently multidisciplinary, requiring elements from the relevant physics, numerical simulation, and optimization, as well as knowledge of the geologic setting, and a comprehension of the interplay between all of these elements. The development and advancement of inversion methodologies can be enabled by a framework that supports experimentation, is flexible and extensible, and allows the knowledge generated to be captured and shared. The goal of this paper is to propose a framework that supports many different types of geophysical forward simulations and deterministic inverse problems. Additionally, we provide an open source implementation of this framework in Python called SIMPEG (Simulation and Parameter Estimation in Geophysics,

  10. An Implicit Solution Framework for Reactor Fuel Performance Simulation

    SciTech Connect

    Glen Hansen; Chris Newman; Derek Gaston; Cody Permann

    2009-08-01

    The simulation of nuclear reactor fuel performance involves complex thermomechanical processes between fuel pellets, made of fissile material, and the protective cladding that surrounds the pellets. An important design goal for a fuel is to maximize the life of the cladding, thereby allowing the fuel to remain in the reactor for a longer period of time to achieve higher degrees of burnup. This presentation describes an initial approach for modeling the thermomechanical response of reactor fuel, and details the solution method employed within INL's fuel performance code, BISON. The code employs advanced methods for solving coupled partial differential equation systems that describe multidimensional fuel thermomechanics, heat generation, and oxygen transport within the fuel. This discussion explores the effectiveness of a JFNK-based solution of a problem involving three-dimensional, fully coupled, nonlinear transient heat conduction that includes pellet displacement and oxygen diffusion effects. These equations are closed using empirical data that are functions of temperature, density, and oxygen hyperstoichiometry. The method appears quite effective for the fuel pellet/cladding configurations examined, with excellent nonlinear convergence properties exhibited on the combined system. In closing, fully coupled solutions of three-dimensional thermomechanics coupled with oxygen diffusion appear quite attractive using the JFNK approach described here, at least for configurations similar to those examined in this report.
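
    The JFNK strategy mentioned above can be sketched compactly: the Jacobian is never formed, a Jacobian-vector product is approximated by a finite difference of the residual, and each Newton update is obtained with a Krylov solver. The toy two-equation residual below is purely illustrative and is unrelated to BISON's physics.

```python
# Minimal Jacobian-free Newton-Krylov sketch on a toy nonlinear system.
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def residual(u):
    # toy coupled nonlinear residual F(u) = 0 (stand-in for the PDE residual)
    return np.array([u[0]**2 + u[1] - 3.0,
                     u[0] + u[1]**3 - 5.0])

def jfnk_solve(u, tol=1e-10, max_newton=20):
    for _ in range(max_newton):
        F = residual(u)
        if np.linalg.norm(F) < tol:
            break
        eps = 1e-7
        # Jacobian-vector product by finite difference: J v ~ (F(u + eps v) - F(u)) / eps
        Jv = lambda v: (residual(u + eps * v) - F) / eps
        J = LinearOperator((u.size, u.size), matvec=Jv)
        du, info = gmres(J, -F)          # Krylov solve for the Newton update
        u = u + du
    return u

u = jfnk_solve(np.array([1.0, 1.0]))
print("solution:", u, "residual norm:", np.linalg.norm(residual(u)))
```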

  11. Accounting for Diffusion in Agent Based Models of Reaction-Diffusion Systems with Application to Cytoskeletal Diffusion

    PubMed Central

    Azimi, Mohammad; Jamali, Yousef; Mofrad, Mohammad R. K.

    2011-01-01

    Diffusion plays a key role in many biochemical reaction systems seen in nature. Scenarios where diffusion behavior is critical can be seen in the cell and subcellular compartments where molecular crowding limits the interaction between particles. We investigate the application of a computational method for modeling the diffusion of molecules and macromolecules in three-dimensional solutions using agent based modeling. This method allows for realistic modeling of a system of particles with different properties such as size, diffusion coefficients, and affinity as well as the environment properties such as viscosity and geometry. Simulations using these movement probabilities yield behavior that mimics natural diffusion. Using this modeling framework, we simulate the effects of molecular crowding on effective diffusion and have validated the results of our model using Langevin dynamics simulations and note that they are in good agreement with previous experimental data. Furthermore, we investigate an extension of this framework where single discrete cells can contain multiple particles of varying size in an effort to highlight errors that can arise from discretization that lead to the unnatural behavior of particles undergoing diffusion. Subsequently, we explore various algorithms that differ in how they handle the movement of multiple particles per cell and suggest an algorithm that properly accommodates multiple particles of various sizes per cell that can replicate the natural behavior of these particles diffusing. Finally, we use the present modeling framework to investigate the effect of structural geometry on the directionality of diffusion in the cell cytoskeleton with the observation that parallel orientation in the structural geometry of actin filaments of filopodia and the branched structure of lamellipodia can give directionality to diffusion at the filopodia-lamellipodia interface. PMID:21966493
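
    A minimal lattice version of this kind of agent-based diffusion is sketched below: particles attempt moves to neighboring sites with a size-dependent probability and are blocked when the target site is occupied, which mimics molecular crowding. The lattice size, occupancy, and move probability are assumptions, not the parameters used in the paper.

```python
# Illustrative lattice sketch of agent-based diffusion with excluded volume.
import random

random.seed(2)
L = 40                                     # periodic lattice size
moves = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
p_move = 0.8                               # size-dependent movement probability

occupied = set()
particles = []                             # each particle tracks an unwrapped position
while len(particles) < 3000:               # ~5% occupancy acts as crowding
    site = (random.randrange(L), random.randrange(L), random.randrange(L))
    if site not in occupied:
        occupied.add(site)
        particles.append({"pos": list(site), "start": site})

def wrap(pos):
    return tuple(c % L for c in pos)

for _ in range(200):                       # 200 Monte Carlo sweeps
    for p in particles:
        if random.random() > p_move:
            continue
        d = random.choice(moves)
        new_pos = [p["pos"][i] + d[i] for i in range(3)]
        if wrap(new_pos) not in occupied:  # excluded volume: blocked if occupied
            occupied.discard(wrap(p["pos"]))
            occupied.add(wrap(new_pos))
            p["pos"] = new_pos

msd = sum(sum((p["pos"][i] - p["start"][i]) ** 2 for i in range(3))
          for p in particles) / len(particles)
print("mean squared displacement after 200 sweeps:", round(msd, 1))
```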

  12. Understanding Group/Party Affiliation Using Social Networks and Agent-Based Modeling

    NASA Technical Reports Server (NTRS)

    Campbell, Kenyth

    2012-01-01

    The dynamics of group affiliation and group dispersion is a concept that is most often studied in order for political candidates to better understand the most efficient way to conduct their campaigns. While political campaigning in the United States is a very hot topic that most politicians analyze and study, the concept of group/party affiliation presents its own area of study that produces very interesting results. One tool for examining party affiliation on a large scale is agent-based modeling (ABM), a paradigm in the modeling and simulation (M&S) field perfectly suited for aggregating individual behaviors to observe large swaths of a population. For this study, agent-based modeling was used to look at a community of agents and determine what factors can affect the group/party affiliation patterns that are present. In the agent-based model used for this experiment, many factors were present, but two main factors were used to determine the results. The results of this study show that it is possible to use agent-based modeling to explore group/party affiliation and construct a model that can mimic real-world events. More importantly, the model in the study allows for the results found in a smaller community to be translated into larger experiments to determine if the results will remain present on a much larger scale.

  13. Ensuring Congruency in Multiscale Modeling: Towards Linking Agent Based and Continuum Biomechanical Models of Arterial Adaptation

    PubMed Central

    Hayenga, Heather N.; Thorne, Bryan C.; Peirce, Shayn M.; Humphrey, Jay D.

    2011-01-01

    There is a need to develop multiscale models of vascular adaptations to understand tissue level manifestations of cellular level mechanisms. Continuum based biomechanical models are well suited for relating blood pressures and flows to stress-mediated changes in geometry and properties, but less so for describing underlying mechanobiological processes. Discrete stochastic agent based models are well suited for representing biological processes at a cellular level, but not for describing tissue level mechanical changes. We present here a conceptually new approach to facilitate the coupling of continuum and agent based models. Because of ubiquitous limitations in both the tissue- and cell-level data from which one derives constitutive relations for continuum models and rule-sets for agent based models, we suggest that model verification should enforce congruency across scales. That is, multiscale model parameters initially determined from data sets representing different scales should be refined, when possible, to ensure that common outputs are consistent. Potential advantages of this approach are illustrated by comparing simulated aortic responses to a sustained increase in blood pressure predicted by continuum and agent based models both before and after instituting a genetic algorithm to refine 16 objectively bounded model parameters. We show that congruency-based parameter refinement not only yielded increased consistency across scales, it also yielded predictions that are closer to in vivo observations. PMID:21809144

  14. Agent-Based vs. Equation-based Epidemiological Models:A Model Selection Case Study

    SciTech Connect

    Sukumar, Sreenivas R; Nutaro, James J

    2012-01-01

    This paper is motivated by the need to design model validation strategies for epidemiological disease-spread models. We consider both agent-based and equation-based models of pandemic disease spread and study the nuances and complexities one has to consider from the perspective of model validation. For this purpose, we instantiate an equation-based model and an agent-based model of the 1918 Spanish flu and we leverage data published in the literature for our case study. We present our observations from the perspective of each implementation and discuss the application of model-selection criteria to compare the risk in choosing one modeling paradigm over another. We conclude with a discussion of our experience and document future ideas for a model validation framework.
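
    To make the comparison concrete, the toy sketch below runs an equation-based SIR model (forward-Euler ODE integration) next to a simple stochastic agent-based version with the same nominal rates; the parameters are illustrative and do not reproduce the paper's 1918-flu calibration.

```python
# Toy side-by-side comparison of an equation-based SIR model and a stochastic
# agent-based version with matched nominal parameters (illustrative only).
import random

N, I0 = 10_000, 10
beta, gamma, days = 0.4, 0.2, 120          # transmission and recovery rates

# --- equation-based: forward-Euler integration of the SIR ODEs --------------
S, I, R = N - I0, I0, 0
for _ in range(days):
    new_inf = beta * S * I / N
    new_rec = gamma * I
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
print(f"ODE model:   final recovered = {R:.0f}")

# --- agent-based: each agent is infected and recovers probabilistically -----
random.seed(3)
state = ["I"] * I0 + ["S"] * (N - I0)
for _ in range(days):
    n_inf = state.count("I")
    p_inf = beta * n_inf / N               # per-susceptible daily infection probability
    for i, s in enumerate(state):
        if s == "S" and random.random() < p_inf:
            state[i] = "I"
        elif s == "I" and random.random() < gamma:
            state[i] = "R"
print(f"agent model: final recovered = {state.count('R')}")
```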

  15. A Unified Simulation Framework for Megathrust Rupture Dynamics and Tsunamis

    NASA Astrophysics Data System (ADS)

    Dunham, E. M.; Lotto, G. C.; Kozdon, J. E.

    2014-12-01

    Many earthquakes, including megathrust events in subduction zones, occur offshore. In addition to seismic waves, such earthquakes also generate tsunamis. We present a methodology for simultaneously investigating earthquake rupture dynamics and tsunamigenesis, based on solution of the elastic and acoustic wave equations, in the solid and fluid portions of the domain, respectively. Surface gravity waves or tsunamis emerge naturally in such a description when gravitational restoring forces are properly taken into account. In our approach, we adopt an Eulerian description of the ocean and within it solve for particle velocities and the perturbation in pressure, Δp, about an initial hydrostatic state. The key step is enforcing the traction-free boundary condition on the moving ocean surface. We linearize this boundary condition, in order to apply it on the initial surface, and express it as Δp-ρgη=0, where -ρg is the initial hydrostatic gradient in pressure and η is the sea surface uplift (obtained, to first order, by integrating vertical particle velocity on the initial ocean surface). We show that this is the only place one needs to account for gravity. Additional terms in the momentum balance and linearized equation of state describing advection of pressure and density gradients can be included to study internal gravity waves within the ocean, but these can be safely neglected for problems of interest to us. We present a range of simulations employing this new methodology. These include test problems used to verify the accuracy of the method for modeling seismic, ocean acoustic, and tsunami waves, as well as more detailed models of megathrust ruptures. Our present work is focused on tsunami generation in models with variable bathymetry, where previous studies have raised questions regarding how horizontal displacement of a sloping seafloor excites tsunamis. Our approach rigorously accounts for time-dependent seafloor motion, horizontal momentum transfer, and

  16. Fluctuation complexity of agent-based financial time series model by stochastic Potts system

    NASA Astrophysics Data System (ADS)

    Hong, Weijia; Wang, Jun

    2015-03-01

    The financial market is a complex evolved dynamic system with high volatility and noise, and the modeling and analysis of financial time series are regarded as rather challenging tasks in financial research. In this work, by applying the Potts dynamic system, a random agent-based financial time series model is developed in an attempt to uncover the empirical laws in finance, where the Potts model is introduced to imitate the trading interactions among the investing agents. Based on computer simulation in conjunction with statistical and nonlinear analysis, we present numerical research to investigate the fluctuation behaviors of the proposed time series model. Furthermore, in order to reach a robust conclusion, we consider the daily returns of the Shanghai Composite Index and the Shenzhen Component Index, and a comparative analysis of return behaviors between the simulation data and the actual data is exhibited.
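
    A minimal sketch of a Potts-driven return series is shown below: agents are q-state Potts spins on a periodic lattice updated by Metropolis dynamics, and the log-return is taken proportional to the change in the fraction of agents in a reference "buy" state. The lattice size, temperature, and return scaling are assumptions; this is not the paper's exact construction.

```python
# Minimal sketch of a Potts-spin agent model driving a synthetic return series.
import math, random

random.seed(4)
q, L, T = 3, 32, 1.2                        # states, lattice size, temperature
spins = [[random.randrange(q) for _ in range(L)] for _ in range(L)]

def neighbors(i, j):
    return [((i+1) % L, j), ((i-1) % L, j), (i, (j+1) % L), (i, (j-1) % L)]

def sweep():
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        new = random.randrange(q)
        same_old = sum(spins[a][b] == spins[i][j] for a, b in neighbors(i, j))
        same_new = sum(spins[a][b] == new for a, b in neighbors(i, j))
        dE = -(same_new - same_old)          # ferromagnetic Potts energy change (J = 1)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            spins[i][j] = new

def buy_fraction():
    return sum(row.count(0) for row in spins) / (L * L)

returns, prev = [], buy_fraction()
for t in range(500):
    sweep()
    cur = buy_fraction()
    returns.append(10.0 * (cur - prev))      # return proportional to net demand change
    prev = cur

print("mean return %.4f, volatility %.4f" % (
    sum(returns) / len(returns),
    (sum(r * r for r in returns) / len(returns)) ** 0.5))
```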

  17. Lipid-converter, a framework for lipid manipulations in molecular dynamics simulations

    PubMed Central

    Larsson, Per; Kasson, Peter M.

    2014-01-01

    Construction of lipid membrane and membrane protein systems for molecular dynamics simulations can be a challenging process. In addition, there are few available tools to extend existing studies by repeating simulations using other force fields and lipid compositions. To facilitate this, we introduce lipidconverter, a modular Python framework for exchanging force fields and lipid composition in coordinate files obtained from simulations. Force fields and lipids are specified by simple text files, making it easy to introduce support for additional force fields and lipids. The converter produces simulation input files that can be used for structural relaxation of the new membranes. PMID:25081234

  18. EFFIS: an End-to-end Framework for Fusion Integrated Simulation

    SciTech Connect

    Cummings, Julian; Schwan, Karsten; Sim, Alexander S; Shoshani, Arie; Docan, Ciprian; Parashar, Manish; Klasky, Scott A; Podhorszki, Norbert

    2010-01-01

    The purpose of the Fusion Simulation Project is to develop a predictive capability for integrated modeling of magnetically confined burning plasmas. In support of this mission, the Center for Plasma Edge Simulation has developed an End-to-end Framework for Fusion Integrated Simulation (EFFIS) that combines critical computer science technologies in an effective manner to support leadership class computing and the coupling of complex plasma physics models. We describe here the main components of EFFIS and how they are being utilized to address our goal of integrated predictive plasma edge simulation.

  19. FERN – a Java framework for stochastic simulation and evaluation of reaction networks

    PubMed Central

    Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf

    2008-01-01

    Background Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications or c) do not allow to monitor and intervene during the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches more flexible tools are necessary. Results In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation and visualization of the simulation results each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. Conclusion FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand
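
    FERN itself is a Java framework; as an illustration of the class of exact stochastic simulation algorithms such frameworks provide, the Python sketch below runs the Gillespie direct method on a toy birth-death reaction network (the rate constants are arbitrary).

```python
# Sketch of the Gillespie direct method on a toy birth-death reaction network.
import random

random.seed(5)
k_prod, k_deg = 10.0, 0.1      # production and degradation rate constants
x, t, t_end = 0, 0.0, 100.0
trajectory = [(t, x)]

while t < t_end:
    a1 = k_prod                # propensity of production:  0 -> X
    a2 = k_deg * x             # propensity of degradation: X -> 0
    a0 = a1 + a2
    t += random.expovariate(a0)              # exponentially distributed waiting time
    if random.random() * a0 < a1:            # choose which reaction fires
        x += 1
    else:
        x -= 1
    trajectory.append((t, x))

print(len(trajectory) - 1, "reaction events; final copy number:", x,
      "(steady-state mean k_prod/k_deg =", k_prod / k_deg, ")")
```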

  20. FNCS: A Framework for Power System and Communication Networks Co-Simulation

    SciTech Connect

    Ciraci, Selim; Daily, Jeffrey A.; Fuller, Jason C.; Fisher, Andrew R.; Marinovici, Laurentiu D.; Agarwal, Khushbu

    2014-04-13

    This paper describes the Fenix framework, which uses a federated approach for integrating power grid and communication network simulators. Compared to existing approaches, Fenix allows co-simulation of both transmission- and distribution-level power grid simulators with the communication network simulator. To reduce the performance overhead of time synchronization, Fenix utilizes optimistic synchronization strategies that make speculative decisions about when the simulators are going to exchange messages. GridLAB-D (a distribution simulator), PowerFlow (a transmission simulator), and ns-3 (a telecommunication simulator) are integrated with the framework and are used to illustrate the enhanced performance provided by speculative multi-threading on a smart grid application. Our speculative multi-threading approach achieved on average a 20% improvement over the existing synchronization methods.

  1. Agent-based model for rural-urban migration: A dynamic consideration

    NASA Astrophysics Data System (ADS)

    Cai, Ning; Ma, Hai-Ying; Khan, M. Junaid

    2015-10-01

    This paper develops a dynamic agent-based model for rural-urban migration, based on previous relevant work. The model conforms to the typical dynamic linear multi-agent system models studied extensively in systems science, in which the communication network is formulated as a digraph. Simulations reveal that consensus of a certain variable could be harmful to the overall stability and should be avoided.
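
    The kind of dynamic linear multi-agent model referred to above can be illustrated by a discrete-time consensus update over a directed communication graph; in the sketch below the row-stochastic weight matrix, the five-agent digraph, and the "migration intention" reading of the state are all hypothetical.

```python
# Illustrative discrete-time consensus update over a directed communication graph.
import numpy as np

# row-stochastic weight matrix of a 5-agent digraph (entry ij: weight agent i
# places on the state of agent j; nonzero only where an edge j -> i exists)
W = np.array([
    [0.6, 0.4, 0.0, 0.0, 0.0],
    [0.0, 0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.7, 0.3, 0.0],
    [0.2, 0.0, 0.0, 0.8, 0.0],
    [0.0, 0.0, 0.0, 0.5, 0.5],
])

x = np.array([0.1, 0.9, 0.4, 0.7, 0.2])    # initial states (e.g., migration intention)
for k in range(100):
    x = W @ x                               # x_i(k+1) = sum_j w_ij * x_j(k)
print("states after 100 steps:", np.round(x, 3))  # agents converge to a common value
```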

  2. Agent Based Study of Surprise Attacks: Roles of Surveillance, Prompt Reaction and Intelligence

    NASA Astrophysics Data System (ADS)

    Shanahan, Linda; Sen, Surajit

    Defending a confined territory from a surprise attack is seldom possible. We use molecular dynamics and statistical-physics-inspired agent-based simulations to explore the evolution and outcome of such attacks. The study suggests robust emergent behavior that emphasizes the importance of accurate surveillance, automated and powerful attack response, and building layout, and it sheds light on the role of communication restrictions in defending such territories.

  3. A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition

    NASA Technical Reports Server (NTRS)

    Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.

    2012-01-01

    A simulation framework based on the Memory-Mapped-Files (MMF) technique was created to operate multiple numerical processes in locked time-steps and send I/O data synchronously to one another to simulate system dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow dynamics, variable-geometry actuation mechanisms, and flow controls in the transition from supersonic to hypersonic conditions and vice versa. A study of Mode-Transition Control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate the scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls, or other types of complex systems.
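
    The core exchange mechanism can be illustrated with Python's mmap module: two views of the same file act as a shared buffer that one model writes and another reads each time step. The record layout and loop below are assumptions for illustration; the framework's actual process management and I/O layout are not reproduced.

```python
# Minimal sketch of memory-mapped-file data exchange between two coupled models.
import mmap, os, struct, tempfile

record = struct.Struct("d d")              # (time step, signal value)
path = os.path.join(tempfile.mkdtemp(), "shared_io.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * record.size)         # pre-size the shared file

writer_file = open(path, "r+b")
reader_file = open(path, "r+b")
writer = mmap.mmap(writer_file.fileno(), record.size)   # "plant model" view
reader = mmap.mmap(reader_file.fileno(), record.size)   # "controller" view

for step in range(5):
    value = 0.1 * step                                   # e.g., a simulated output
    writer[:record.size] = record.pack(step, value)      # one model writes its outputs
    t, v = record.unpack(reader[:record.size])           # the other reads them back
    print(f"step {t:.0f}: exchanged value {v:.2f}")

writer.close(); reader.close(); writer_file.close(); reader_file.close()
```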

  4. Simulating the Household Plug-in Hybrid Electric Vehicle Distribution and its Electric Distribution Network Impacts

    SciTech Connect

    Cui, Xiaohui; Kim, Hoe Kyoung; Liu, Cheng; Kao, Shih-Chieh; Bhaduri, Budhendra L

    2012-01-01

    This paper presents a multi agent-based simulation framework for modeling spatial distribution of plug-in hybrid electric vehicle ownership at local residential level, discovering plug-in hybrid electric vehicle hot zones where ownership may quickly increase in the near future, and estimating the impacts of the increasing plug-in hybrid electric vehicle ownership on the local electric distribution network with different charging strategies. We use Knox County, Tennessee as a case study to highlight the simulation results of the agent-based simulation framework.

  5. A Metascalable Computing Framework for Large Spatiotemporal-Scale Atomistic Simulations

    SciTech Connect

    Nomura, K; Seymour, R; Wang, W; Kalia, R; Nakano, A; Vashishta, P; Shimojo, F; Yang, L H

    2009-02-17

    A metascalable (or 'design once, scale on new architectures') parallel computing framework has been developed for large spatiotemporal-scale atomistic simulations of materials based on spatiotemporal data locality principles, which is expected to scale on emerging multipetaflops architectures. The framework consists of: (1) an embedded divide-and-conquer (EDC) algorithmic framework based on spatial locality to design linear-scaling algorithms for high complexity problems; (2) a space-time-ensemble parallel (STEP) approach based on temporal locality to predict long-time dynamics, while introducing multiple parallelization axes; and (3) a tunable hierarchical cellular decomposition (HCD) parallelization framework to map these O(N) algorithms onto a multicore cluster based on hybrid implementation combining message passing and critical section-free multithreading. The EDC-STEP-HCD framework exposes maximal concurrency and data locality, thereby achieving: (1) inter-node parallel efficiency well over 0.95 for 218 billion-atom molecular-dynamics and 1.68 trillion electronic-degrees-of-freedom quantum-mechanical simulations on 212,992 IBM BlueGene/L processors (superscalability); (2) high intra-node, multithreading parallel efficiency (nanoscalability); and (3) nearly perfect time/ensemble parallel efficiency (eon-scalability). The spatiotemporal scale covered by MD simulation on a sustained petaflops computer per day (i.e., petaflops·day of computing) is estimated as NT = 2.14 (e.g. N = 2.14 million atoms for T = 1 microsecond).

  6. Framework of passive millimeter-wave scene simulation based on material classification

    NASA Astrophysics Data System (ADS)

    Park, Hyuk; Kim, Sung-Hyun; Lee, Ho-Jin; Kim, Yong-Hoon; Ki, Jae-Sug; Yoon, In-Bok; Lee, Jung-Min; Park, Soon-Jun

    2006-05-01

    Over the past few decades, passive millimeter-wave (PMMW) sensors have emerged as useful tools in transportation and military applications such as autonomous flight-landing systems, smart weapons, and night- and all-weather vision systems. An efficient way to predict the performance of a PMMW sensor and apply it to a system is to test it in a SoftWare-In-the-Loop (SWIL) setting. PMMW scene simulation is a key component for implementing such a simulator; however, nothing commercial off-the-shelf is available to construct the PMMW scene simulation, and there have been only a few studies of this technology. We have studied the PMMW scene simulation method to develop a PMMW sensor SWIL simulator. This paper describes the framework of the PMMW scene simulation and tentative results. The purpose of the PMMW scene simulation is to generate sensor outputs (or images) from a visible image and environmental conditions. We organize it into four parts: material classification mapping, PMMW environmental setting, PMMW scene forming, and millimeter-wave (MMW) sensorworks. The background and the objects in the scene are classified based on properties related to MMW radiation and reflectivity. The environmental setting part calculates the following PMMW phenomenology: atmospheric propagation and emission, including sky temperature, weather conditions, and physical temperature. Then, PMMW raw images are formed with surface geometry. Finally, PMMW sensor outputs are generated from the PMMW raw images by applying sensor characteristics such as aperture size and noise level. Through the simulation process, PMMW phenomenology and sensor characteristics are simulated on the output scene. We have finished the design of the framework of the simulator and are working on the implementation in detail. As a tentative result, a flight observation was simulated under specific conditions. After completing the implementation details, we plan to increase the reliability of the simulation by data collecting

  7. Task parallel sensitivity analysis and parameter estimation of groundwater simulations through the SALSSA framework

    SciTech Connect

    Schuchardt, Karen L.; Agarwal, Khushbu; Chase, Jared M.; Rockhold, Mark L.; Freedman, Vicky L.; Elsethagen, Todd O.; Scheibe, Timothy D.; Chin, George; Sivaramakrishnan, Chandrika

    2010-07-15

    The Support Architecture for Large-Scale Subsurface Analysis (SALSSA) provides an extensible framework, sophisticated graphical user interface, and underlying data management system that simplifies the process of running subsurface models, tracking provenance information, and analyzing the model results. Initially, SALSSA supported two styles of job control: user directed execution and monitoring of individual jobs, and load balancing of jobs across multiple machines taking advantage of many available workstations. Recent efforts in subsurface modelling have been directed at advancing simulators to take advantage of leadership class supercomputers. We describe two approaches, current progress, and plans toward enabling efficient application of the subsurface simulator codes via the SALSSA framework: automating sensitivity analysis problems through task parallelism, and task parallel parameter estimation using the PEST framework.

  8. A Semantic Web Service and Simulation Framework to Intelligent Distributed Manufacturing

    SciTech Connect

    Son, Young Jun; Kulvatunyou, Boonserm; Cho, Hyunbo; Feng, Shaw

    2005-11-01

    To cope with today's fluctuating markets, a virtual enterprise (VE) concept can be employed to achieve the cooperation among independently operating enterprises. The success of VE depends on reliable interoperation among trading partners. This paper proposes a framework based on semantic web of manufacturing and simulation services to enable business and engineering collaborations between VE partners, particularly a design house and manufacturing suppliers.

  9. A Framework for End to End Simulations of the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Gibson, R. R.; Ahmad, Z.; Bankert, J.; Bard, D.; Connolly, A. J.; Chang, C.; Gilmore, K.; Grace, E.; Hannel, M.; Jernigan, J. G.; Jones, L.; Kahn, S. M.; Krughoff, K. S.; Lorenz, S.; Marshall, S.; Nagarajan, S.; Peterson, J. R.; Pizagno, J.; Rasmussen, A. P.; Shmakova, M.; Silvestri, N.; Todd, N.; Young, M.

    2011-07-01

    As observatories get bigger and more complicated to operate, risk mitigation techniques become increasingly important. Additionally, the size and complexity of data coming from the next generation of surveys will present enormous challenges in how we process, store, and analyze these data. End-to-end simulations of telescopes with the scope of LSST are essential to correct problems and verify science capabilities as early as possible. A simulator can also determine how defects and trade-offs in individual subsystems impact the overall design requirements. Here, we present the architecture, implementation, and results of the source simulation framework for the Large Synoptic Survey Telescope (LSST). The framework creates time-based realizations of astronomical objects and formats the output for use in many different survey contexts (i.e., image simulation, reference catalogs, calibration catalogs, and simulated science outputs). The simulations include Milky Way, cosmological, and solar system models as well as transient and variable objects. All model objects can be sampled with the LSST cadence from any operations simulator run. The result is a representative, full-sky simulation of LSST data that can be used to determine telescope performance, the feasibility of science goals, and strategies for processing LSST-scale data volumes.

  10. Numerical simulation of the fracture process in ceramic FPD frameworks caused by oblique loading.

    PubMed

    Kou, Wen; Qiao, Jiyan; Chen, Li; Ding, Yansheng; Sjögren, Göran

    2015-10-01

    Using a newly developed three-dimensional (3D) numerical modeling code, an analysis was performed of the fracture behavior in a three-unit ceramic-based fixed partial denture (FPD) framework subjected to oblique loading. All the materials in the study were treated heterogeneously; Weibull's distribution law was applied to the description of the heterogeneity. The Mohr-Coulomb failure criterion with tensile strength cut-off was utilized in judging whether the material was in an elastic or failed state. The simulated loading area was placed either on the buccal or the lingual cusp of a premolar-shaped pontic with the loading direction at 30°, 45°, 60°, 75° or 90° angles to the occlusal surface. The stress distribution, fracture initiation and propagation in the framework during the loading and fracture process were analyzed. This numerical simulation allowed the cause of the framework fracture to be identified as tensile stress failure. The decisive fracture was initiated in the gingival embrasure of the pontic, regardless of whether the buccal or lingual cusp of the pontic was loaded. The stress distribution and fracture propagation process of the framework could be followed step by step from beginning to end. The bearing capacity and the rigidity of the framework vary with the loading position and direction. The framework loaded at 90° to the occlusal surface has the highest bearing capacity and the greatest rigidity. The framework loaded at 30° to the occlusal surface has the least rigidity, indicating that oblique loading has a major impact on the fracture of ceramic frameworks. PMID:26143353
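
    A core ingredient of the reported model is the combination of Weibull-distributed element strengths with a Mohr-Coulomb criterion truncated by a tensile cut-off. The Python fragment below is a hedged illustration of such an element-level failure check; the function names, cohesion, friction angle, Weibull modulus and stress values are invented for the example and do not come from the paper.

        import math
        import numpy as np

        def weibull_strengths(mean_strength, modulus, n, rng):
            """Sample heterogeneous element tensile strengths from a Weibull distribution."""
            scale = mean_strength / math.gamma(1.0 + 1.0 / modulus)   # so the sample mean ~ mean_strength
            return scale * rng.weibull(modulus, size=n)

        def fails(sigma_1, sigma_3, tensile_strength, cohesion, friction_deg):
            """Mohr-Coulomb failure with tensile cut-off (compression positive, sigma_1 >= sigma_3)."""
            if sigma_3 <= -tensile_strength:              # tensile cut-off on the minor principal stress
                return True
            phi = math.radians(friction_deg)
            shear_demand = (sigma_1 - sigma_3) / 2.0
            shear_capacity = cohesion * math.cos(phi) + (sigma_1 + sigma_3) / 2.0 * math.sin(phi)
            return shear_demand >= shear_capacity

        rng = np.random.default_rng(1)
        strengths = weibull_strengths(mean_strength=30.0, modulus=5.0, n=10_000, rng=rng)   # MPa, assumed
        sigma_1, sigma_3 = 60.0, -20.0               # an arbitrary oblique stress state (MPa)
        n_failed = sum(fails(sigma_1, sigma_3, t, cohesion=40.0, friction_deg=30.0) for t in strengths)
        print(f"{n_failed / len(strengths):.1%} of heterogeneous elements fail at this stress state")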

  11. Pain expressiveness and altruistic behavior: an exploration using agent-based modeling.

    PubMed

    de C Williams, Amanda C; Gallagher, Elizabeth; Fidalgo, Antonio R; Bentley, Peter J

    2016-03-01

    Predictions which invoke evolutionary mechanisms are hard to test. Agent-based modeling in artificial life offers a way to simulate behaviors and interactions in specific physical or social environments over many generations. The outcomes have implications for understanding the adaptive value of behaviors in context. Pain-related behavior in animals is communicated to other animals that might protect or help, or might exploit or predate. An agent-based model simulated the effects of displaying or not displaying pain (expresser/nonexpresser strategies) when injured and of helping, ignoring, or exploiting another in pain (altruistic/nonaltruistic/selfish strategies). Agents modeled in MATLAB interacted at random while foraging (gaining energy); random injury interrupted foraging for a fixed time unless help from an altruistic agent, who paid an energy cost, speeded recovery. Environmental and social conditions also varied, and each model ran for 10,000 iterations. Findings were meaningful in that, in general, contingencies that are evident from experimental work with a variety of mammals, over a few interactions, were replicated in the agent-based model after selection pressure over many generations. More energy-demanding expression of pain reduced its frequency in successive generations, and increasing injury frequency resulted in fewer expressers and altruists. Allowing exploitation of injured agents decreased expression of pain to near zero, but altruists remained. Decreasing costs or increasing benefits of helping hardly changed its frequency, whereas increasing interaction rate between injured agents and helpers diminished the benefits to both. Agent-based modeling allows simulation of complex behaviors and environmental pressures over evolutionary time. PMID:26655734
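
    The core loop of such a model is straightforward to sketch: agents forage for energy, occasionally get injured, optionally express pain, and altruists pay an energy cost to shorten an expresser's recovery. The Python below is a loose, much-simplified re-imagining of that setup (the original was in MATLAB, and generational selection is omitted here); all parameter values and the fitness rule are assumptions, not the published model.

        import random

        random.seed(42)

        class Agent:
            def __init__(self, expresser, altruist):
                self.expresser = expresser      # displays pain when injured
                self.altruist = altruist        # helps expressers at an energy cost
                self.energy = 0.0
                self.recovery = 0               # iterations left until foraging resumes

        P_INJURY, RECOVERY_TIME = 0.02, 20
        EXPRESS_COST, HELP_COST, HELP_BENEFIT = 0.2, 1.0, 10   # assumed payoffs

        agents = [Agent(random.random() < 0.5, random.random() < 0.5) for _ in range(200)]
        for step in range(10_000):
            random.shuffle(agents)
            for a, b in zip(agents[0::2], agents[1::2]):        # random pairwise encounters
                for me, other in ((a, b), (b, a)):
                    if me.recovery > 0:
                        me.recovery -= 1
                        if me.expresser:
                            me.energy -= EXPRESS_COST           # expressing pain is costly
                            if other.altruist and other.recovery == 0:
                                other.energy -= HELP_COST       # altruist pays to help
                                me.recovery = max(0, me.recovery - HELP_BENEFIT)
                    else:
                        me.energy += 1.0                        # foraging gain
                        if random.random() < P_INJURY:
                            me.recovery = RECOVERY_TIME

        survivors = sorted(agents, key=lambda ag: ag.energy, reverse=True)[:100]
        print("expressers among the fittest half:", sum(ag.expresser for ag in survivors))
        print("altruists among the fittest half:", sum(ag.altruist for ag in survivors))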

  12. Pain expressiveness and altruistic behavior: an exploration using agent-based modeling

    PubMed Central

    de C Williams, Amanda C.; Gallagher, Elizabeth; Fidalgo, Antonio R.; Bentley, Peter J.

    2015-01-01

    Predictions which invoke evolutionary mechanisms are hard to test. Agent-based modeling in artificial life offers a way to simulate behaviors and interactions in specific physical or social environments over many generations. The outcomes have implications for understanding the adaptive value of behaviors in context. Pain-related behavior in animals is communicated to other animals that might protect or help, or might exploit or predate. An agent-based model simulated the effects of displaying or not displaying pain (expresser/nonexpresser strategies) when injured and of helping, ignoring, or exploiting another in pain (altruistic/nonaltruistic/selfish strategies). Agents modeled in MATLAB interacted at random while foraging (gaining energy); random injury interrupted foraging for a fixed time unless help from an altruistic agent, who paid an energy cost, speeded recovery. Environmental and social conditions also varied, and each model ran for 10,000 iterations. Findings were meaningful in that, in general, contingencies that are evident from experimental work with a variety of mammals, over a few interactions, were replicated in the agent-based model after selection pressure over many generations. More energy-demanding expression of pain reduced its frequency in successive generations, and increasing injury frequency resulted in fewer expressers and altruists. Allowing exploitation of injured agents decreased expression of pain to near zero, but altruists remained. Decreasing costs or increasing benefits of helping hardly changed its frequency, whereas increasing interaction rate between injured agents and helpers diminished the benefits to both. Agent-based modeling allows simulation of complex behaviors and environmental pressures over evolutionary time. PMID:26655734

  13. Integration of a Multigrid ODE solver into an open medical simulation framework.

    PubMed

    Wu, Xunlei; Yao, Jianhua; Enquobahrie, Andinet; Lee, Huai-Ping; Audette, Michel A

    2012-01-01

    In this paper, we present the implementation of a Multigrid ODE solver in the SOFA framework. By combining the stability advantage of coarse meshes and the transient detail-preserving virtue of fine meshes, the Multigrid ODE solver computes more efficiently than classic ODE solvers based on a single-level discretization. With the ever wider adoption of the SOFA framework in many surgical simulation projects, introducing this Multigrid ODE solver into SOFA's pool of ODE solvers shall benefit the entire community. This contribution potentially has broad ramifications in the surgical simulation research community, given that in a single-resolution system, a constitutively realistic interactive tissue response, which presupposes large elements, is in direct conflict with the need to represent clinically relevant critical tissues in the simulation, which are typically comprised of small elements. PMID:23366578
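
    The efficiency argument for a multigrid scheme rests on combining cheap smoothing on a fine mesh with a correction computed on a coarser one. The standalone Python sketch below shows a two-grid correction cycle for a 1D Poisson problem; it only illustrates the general idea under these simplified assumptions and is unrelated to the SOFA implementation.

        import numpy as np

        def jacobi(u, f, h, sweeps, omega=2/3):
            """Weighted-Jacobi smoothing for -u'' = f on a uniform 1D grid (Dirichlet ends)."""
            for _ in range(sweeps):
                u[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
            return u

        def residual(u, f, h):
            r = np.zeros_like(u)
            r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / (h * h)
            return r

        def two_grid(u, f, h):
            """One two-grid cycle: smooth, restrict residual, solve coarse error, prolong, smooth."""
            u = jacobi(u, f, h, sweeps=3)
            r = residual(u, f, h)
            rc = r[::2].copy()                          # injection to the coarse grid
            n_c, hc = rc.size, 2 * h
            # Direct coarse solve of -e'' = r (small tridiagonal system).
            A = (np.diag(np.full(n_c - 2, 2.0)) + np.diag(np.full(n_c - 3, -1.0), 1)
                 + np.diag(np.full(n_c - 3, -1.0), -1)) / (hc * hc)
            ec = np.zeros(n_c)
            ec[1:-1] = np.linalg.solve(A, rc[1:-1])
            e = np.interp(np.arange(u.size), np.arange(0, u.size, 2), ec)   # linear prolongation
            return jacobi(u + e, f, h, sweeps=3)

        n = 129                                         # fine grid points (coarse grid: 65)
        h = 1.0 / (n - 1)
        x = np.linspace(0.0, 1.0, n)
        f = np.pi**2 * np.sin(np.pi * x)                # exact solution: sin(pi x)
        u = np.zeros(n)
        for cycle in range(10):
            u = two_grid(u, f, h)
        print("max error:", np.abs(u - np.sin(np.pi * x)).max())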

  14. Comparing large-scale computational approaches to epidemic modeling: Agent-based versus structured metapopulation models

    PubMed Central

    2010-01-01

    Background In recent years large-scale computational models for the realistic simulation of epidemic outbreaks have been used with increased frequency. Methodologies adapt to the scale of interest and range from very detailed agent-based models to spatially-structured metapopulation models. One major issue thus concerns to what extent the geotemporal spreading pattern found by different modeling approaches may differ and depend on the different approximations and assumptions used. Methods We provide for the first time a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the progression of a baseline pandemic event in Italy, a large and geographically heterogeneous European country. The agent-based model is based on the explicit representation of the Italian population through highly detailed data on the socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM) model, based on high-resolution census data worldwide, and integrating airline travel flow data with short-range human mobility patterns at the global scale. The model also considers age structure data for Italy. GLEaM and the agent-based models are synchronized in their initial conditions by using the same disease parameterization, and by defining the same importation of infected cases from international travels. Results The results obtained show that both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing on the order of a few days. The relative difference of the epidemic size depends on the basic reproductive ratio, R0, and on the fact that the metapopulation model consistently yields a larger incidence than the agent-based model, as expected due to the differences in the structure in the intra-population contact pattern of the approaches. The age breakdown analysis shows
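
    The kind of side-by-side comparison described above can be pictured on a toy scale by running a stochastic individual-based SIR model against its deterministic, population-level counterpart with identical parameters. The Python sketch below does this for a single homogeneous population; it is a didactic illustration with assumed parameters, not either of the models compared in the paper.

        import numpy as np

        BETA, GAMMA, N, I0, DAYS = 0.3, 0.1, 10_000, 10, 200   # R0 = 3, assumed parameters

        def deterministic_sir():
            s, i, r, peak = N - I0, I0, 0, (0, 0.0)
            for day in range(DAYS):
                new_inf = BETA * s * i / N
                new_rec = GAMMA * i
                s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
                peak = max(peak, (day, i), key=lambda p: p[1])
            return r, peak[0]

        def agent_based_sir(rng):
            state = np.zeros(N, dtype=np.int8)                  # 0=S, 1=I, 2=R per individual
            state[rng.choice(N, I0, replace=False)] = 1
            peak = (0, 0)
            for day in range(DAYS):
                n_inf = int((state == 1).sum())
                p_inf = 1.0 - np.exp(-BETA * n_inf / N)         # per-susceptible infection probability
                sus = np.flatnonzero(state == 0)
                state[sus[rng.random(sus.size) < p_inf]] = 1
                inf = np.flatnonzero(state == 1)
                state[inf[rng.random(inf.size) < GAMMA]] = 2    # recoveries
                peak = max(peak, (day, n_inf), key=lambda p: p[1])
            return int((state == 2).sum()), peak[0]

        rng = np.random.default_rng(3)
        print("deterministic: final size, peak day =", deterministic_sir())
        print("agent-based  : final size, peak day =", agent_based_sir(rng))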

  15. Deterministic Agent-Based Path Optimization by Mimicking the Spreading of Ripples.

    PubMed

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Di Paolo, Ezequiel A; Liu, Hao

    2016-01-01

    Inspirations from nature have contributed fundamentally to the development of evolutionary computation. Learning from the natural ripple-spreading phenomenon, this article proposes a novel ripple-spreading algorithm (RSA) for the path optimization problem (POP). In nature, a ripple spreads at a constant speed in all directions, and the node closest to the source is the first to be reached. This very simple principle forms the foundation of the proposed RSA. In contrast to most deterministic top-down centralized path optimization methods, such as Dijkstra's algorithm, the RSA is a bottom-up decentralized agent-based simulation model. Moreover, it is distinguished from other agent-based algorithms, such as genetic algorithms and ant colony optimization, by being a deterministic method that can always guarantee the global optimal solution with very good scalability. Here, the RSA is specifically applied to four different POPs. The comparative simulation results illustrate the advantages of the RSA in terms of effectiveness and efficiency. Thanks to the agent-based and deterministic features, the RSA opens new opportunities to attack some problems, such as calculating the exact complete Pareto front in multiobjective optimization and determining the kth shortest project time in project management, which are very difficult, if not impossible, for existing methods to resolve. The ripple-spreading optimization principle and the new distinguishing features and capacities of the RSA enrich the theoretical foundations of evolutionary computation. PMID:26066805
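
    The ripple-spreading principle can be reproduced in a few lines: every node reached by a ripple starts its own ripple, all ripples grow at the same constant speed, and the first ripple to reach the destination traces out the shortest path. The discrete-time Python sketch below runs on a small weighted graph; the graph, speed and time step are arbitrary assumptions, and the code is an illustration of the principle rather than the published RSA.

        # Toy weighted graph: edge lengths are the distances the ripples must cover.
        GRAPH = {
            "A": {"B": 4.0, "C": 2.0},
            "B": {"A": 4.0, "C": 1.0, "D": 5.0},
            "C": {"A": 2.0, "B": 1.0, "D": 8.0, "E": 10.0},
            "D": {"B": 5.0, "C": 8.0, "E": 2.0},
            "E": {"C": 10.0, "D": 2.0},
        }

        def ripple_spread(graph, source, target, dt=0.01, speed=1.0):
            """Deterministic path optimization by mimicking spreading ripples.

            Each activated node emits a ripple whose radius grows by speed*dt per tick;
            when a ripple covers an outgoing edge, it activates the neighbour. The first
            activation of the target yields the shortest path.
            """
            radius = {source: 0.0}             # active ripples: node -> current radius
            parent = {source: None}
            t = 0.0
            while target not in parent:
                t += dt
                for node in list(radius):
                    radius[node] += speed * dt
                    for nbr, length in graph[node].items():
                        if nbr not in parent and radius[node] >= length:
                            parent[nbr] = node                    # ripple from `node` reaches `nbr` first
                            radius[nbr] = radius[node] - length   # new ripple, already grown a bit
                if t > 1e6:
                    raise RuntimeError("target unreachable")
            path, n = [], target
            while n is not None:
                path.append(n)
                n = parent[n]
            return list(reversed(path)), t

        print(ripple_spread(GRAPH, "A", "E"))   # expected: (['A', 'C', 'B', 'D', 'E'], ~10.0 time units)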

  16. Comparing large-scale computational approaches to epidemic modeling: agent based versus structured metapopulation models

    NASA Astrophysics Data System (ADS)

    Gonçalves, Bruno; Ajelli, Marco; Balcan, Duygu; Colizza, Vittoria; Hu, Hao; Ramasco, José; Merler, Stefano; Vespignani, Alessandro

    2010-03-01

    We provide for the first time a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the evolution of a baseline pandemic event in Italy. The agent-based model is based on the explicit representation of the Italian population through highly detailed data on the socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM) model, based on high-resolution census data worldwide, and integrating airline travel flow data with short-range human mobility patterns at the global scale. Both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing of the order of a few days. The age breakdown analysis shows that similar attack rates are obtained for the younger age classes.

  17. Architectural considerations for agent-based national scale policy models : LDRD final report.

    SciTech Connect

    Backus, George A.; Strip, David R.

    2007-09-01

    The need to anticipate the consequences of policy decisions becomes ever more important as the magnitude of the potential consequences grows. The multiplicity of connections between the components of society and the economy makes intuitive assessments extremely unreliable. Agent-based modeling has the potential to be a powerful tool in modeling policy impacts. The direct mapping between agents and elements of society and the economy simplifies the mapping of real-world functions into the world of computational assessment. Our modeling initiative is motivated by the desire to facilitate informed public debate on alternative policies for how we, as a nation, provide healthcare to our population. We explore the implications of this motivation for the design and implementation of a model. We discuss the choice of an agent-based modeling approach and contrast it to micro-simulation and systems dynamics approaches.

  18. A hierarchical Bayesian framework for force field selection in molecular dynamics simulations.

    PubMed

    Wu, S; Angelikopoulos, P; Papadimitriou, C; Moser, R; Koumoutsakos, P

    2016-02-13

    We present a hierarchical Bayesian framework for the selection of force fields in molecular dynamics (MD) simulations. The framework associates the variability of the optimal parameters of the MD potentials under different environmental conditions with the corresponding variability in experimental data. The high computational cost associated with the hierarchical Bayesian framework is reduced by orders of magnitude through a parallelized Transitional Markov Chain Monte Carlo method combined with the Laplace Asymptotic Approximation. The suitability of the hierarchical approach is demonstrated by performing MD simulations with prescribed parameters to obtain data for transport coefficients under different conditions, which are then used to infer and evaluate the parameters of the MD model. We demonstrate the selection of MD models based on experimental data and verify that the hierarchical model can accurately quantify the uncertainty across experiments, improve the estimation of the posterior probability density function of the parameters (and thus improve predictions for future experiments), and identify the most plausible force field to describe the underlying structure of a given dataset. The framework and associated software are applicable to a wide range of nanoscale simulations associated with experimental data with a hierarchical structure. PMID:26712642

  19. Users' Perception of Medical Simulation Training: A Framework for Adopting Simulator Technology

    ERIC Educational Resources Information Center

    Green, Leili Hayati

    2014-01-01

    Users play a key role in many training strategies, yet some organizations often fail to understand the users' perception after a simulation training implementation, their attitude about acceptance or rejection of and integration of emerging simulation technology in medical training (Gaba, 2007, and Topol, 2012). Several factors are considered to…

  20. Agent-based services for B2B electronic commerce

    NASA Astrophysics Data System (ADS)

    Fong, Elizabeth; Ivezic, Nenad; Rhodes, Tom; Peng, Yun

    2000-12-01

    The potential of agent-based systems has not been realized yet, in part, because of the lack of understanding of how the agent technology supports industrial needs and emerging standards. The area of business-to-business electronic commerce (b2b e-commerce) is one of the most rapidly developing sectors of industry with huge impact on manufacturing practices. In this paper, we investigate the current state of agent technology and the feasibility of applying agent-based computing to b2b e-commerce in the circuit board manufacturing sector. We identify critical tasks and opportunities in the b2b e-commerce area where agent-based services can best be deployed. We describe an implemented agent-based prototype system to facilitate the bidding process for printed circuit board manufacturing and assembly. These activities are taking place within the Internet Commerce for Manufacturing (ICM) project, the NIST- sponsored project working with industry to create an environment where small manufacturers of mechanical and electronic components may participate competitively in virtual enterprises that manufacture printed circuit assemblies.

  1. Modeling civil violence: An agent-based computational approach

    PubMed Central

    Epstein, Joshua M.

    2002-01-01

    This article presents an agent-based computational model of civil violence. Two variants of the civil violence model are presented. In the first a central authority seeks to suppress decentralized rebellion. In the second a central authority seeks to suppress communal violence between two warring ethnic groups. PMID:11997450
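
    For readers unfamiliar with the model, its first variant can be condensed to a few rules: each citizen has a grievance G = H(1 - L), estimates an arrest probability from the ratio of cops to active rebels, and rebels when grievance minus perceived risk exceeds a threshold. The Python below is a non-spatial, heavily simplified sketch of those rules with invented parameter values; it is not Epstein's published implementation, which is spatial and includes jail terms.

        import numpy as np

        rng = np.random.default_rng(7)

        N_CITIZENS, N_COPS = 1000, 40
        LEGITIMACY, THRESHOLD, K = 0.82, 0.1, 2.3          # assumed global parameters

        hardship = rng.uniform(0, 1, N_CITIZENS)           # H ~ U(0,1), heterogeneous
        risk_aversion = rng.uniform(0, 1, N_CITIZENS)      # R ~ U(0,1)
        grievance = hardship * (1.0 - LEGITIMACY)          # G = H(1 - L)
        active = np.zeros(N_CITIZENS, dtype=bool)

        for step in range(200):
            # Estimated arrest probability rises with the cop-to-rebel ratio (global, not local, here).
            ratio = N_COPS / max(1, active.sum())
            p_arrest = 1.0 - np.exp(-K * ratio)
            net_risk = risk_aversion * p_arrest
            active = (grievance - net_risk) > THRESHOLD     # rebellion rule: G - N > T
            rebels = np.flatnonzero(active)
            if rebels.size:
                arrested = rng.choice(rebels, size=min(N_COPS, rebels.size), replace=False)
                active[arrested] = False                    # arrested rebels become quiescent this step

        print("rebels at the end of the run:", int(active.sum()))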

  2. An Agent-Based Interface to Terrestrial Ecological Forecasting

    NASA Technical Reports Server (NTRS)

    Golden, Keith; Nemani, Ramakrishna; Pang, Wan-Lin; Votava, Petr; Etzioni, Oren

    2004-01-01

    This paper describes a flexible agent-based ecological forecasting system that combines multiple distributed data sources and models to provide near-real-time answers to questions about the state of the Earth system. We build on novel techniques in automated constraint-based planning and natural language interfaces to automatically generate data products based on descriptions of the desired data products.

  3. Flexible simulation framework to couple processes in complex 3D models for subsurface utilization assessment

    NASA Astrophysics Data System (ADS)

    Kempka, Thomas; Nakaten, Benjamin; De Lucia, Marco; Nakaten, Natalie; Otto, Christopher; Pohl, Maik; Tillner, Elena; Kühn, Michael

    2016-04-01

    Utilization of the geological subsurface for production and storage of hydrocarbons, chemical energy and heat as well as for waste disposal requires the quantification and mitigation of environmental impacts as well as the improvement of georesources utilization in terms of efficiency and sustainability. The development of tools for coupled process simulations is essential to tackle these challenges, since reliable assessments are only feasible by integrative numerical computations. Coupled processes at reservoir to regional scale determine the behaviour of reservoirs, faults and caprocks, generally demanding that complex 3D geological models be considered alongside available monitoring and experimental data in coupled numerical simulations. We have been developing a flexible numerical simulation framework that provides efficient workflows for integrating the required data and software packages to carry out coupled process simulations considering, e.g., multiphase fluid flow, geomechanics, geochemistry and heat. Simulation results are stored in structured data formats to allow for an integrated 3D visualization and result interpretation as well as data archiving and its provision to collaborators. The main benefits of using the flexible simulation framework are the integration of geological and grid data from any third-party software package as well as data export to generic 3D visualization tools and archiving formats. The coupling of the required process simulators in time and space is feasible, while different spatial dimensions in the coupled simulations can be integrated, e.g., 0D batch with 3D dynamic simulations. User interaction is established via high-level programming languages, while computational efficiency is achieved by using low-level programming languages. We present three case studies on the assessment of geological subsurface utilization based on different process coupling approaches and numerical simulations.
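
    At its core, this kind of framework orchestrates separate simulators by exchanging state at agreed coupling intervals (sequential or iterative operator splitting). The Python fragment below sketches one loose-coupling loop with placeholder stub simulators; the function names, exchanged variables, toy physics and time stepping are assumptions made for illustration, not the interfaces of the framework described above.

        from dataclasses import dataclass

        @dataclass
        class ReservoirState:
            pressure: float = 20.0       # MPa, assumed initial reservoir pressure
            porosity: float = 0.15
            injected: float = 0.0        # cumulative injected volume (arbitrary units)

        def flow_step(state, dt):
            """Stub multiphase-flow simulator: injection raises pressure (toy model)."""
            state.injected += 1.0 * dt
            state.pressure += 0.05 * dt / state.porosity
            return state

        def geomech_step(state, dt):
            """Stub geomechanics simulator: overpressure slightly dilates the pore space."""
            overpressure = max(0.0, state.pressure - 20.0)
            state.porosity = min(0.30, 0.15 + 1e-3 * overpressure)
            return state

        def coupled_run(t_end, dt_couple):
            """Sequential (non-iterative) coupling: flow first, then geomechanics, per interval."""
            state, t, history = ReservoirState(), 0.0, []
            while t < t_end:
                state = flow_step(state, dt_couple)       # dynamic flow update
                state = geomech_step(state, dt_couple)    # feedback on porosity for the next interval
                t += dt_couple
                history.append((t, state.pressure, state.porosity))
            return history

        for t, p, phi in coupled_run(t_end=10.0, dt_couple=1.0):
            print(f"t={t:4.1f}  p={p:6.2f} MPa  porosity={phi:.4f}")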

  4. Toward an Agent-Based Model of Socially Optimal Water Rights Markets

    NASA Astrophysics Data System (ADS)

    Ehlen, M. A.

    2004-12-01

    There has been considerable interest lately in using public markets for buying and selling the rights to local water usage. Such water rights markets, if designed correctly, should be socially optimal, that is, should sell rights at prices that reflect the true value of water in the region, taking into account that water rights buyers and sellers represent a disparate group of private industry, public authorities, and private users, each having different water needs and different priority to local government. Good market design, however, is hard. As was experienced in California short-run electric power markets, a market design that on paper looks reasonable but in practice is mal-constructed can have devastating effects: firms can learn to manipulate prices by `playing' both sides of the market, and sellers can under-provide so as to create exorbitant prices which buyers have no choice but to pay. Economic theory provides several frameworks for developing a good water rights market design; for example, the structure-conduct-performance paradigm (SCPP) suggests that, among other things, the number and types of buyers and sellers (structure), and transaction clearing rules and government policies (conduct) affect in very particular ways the prices and quantities (performance) in the market. In slow-moving or static markets, SCPP has been a useful predictor of market performance; in faster markets the market dynamics that endogenously develop over time are often too complex to predict with SCPP or other existing modeling techniques. New, more sophisticated combinations of modeling and simulation are needed. Toward developing a good (i.e., socially optimal) water rights market design that can take into account the dynamics inherent in the water sector, we are developing an agent-based model of water rights markets. The model serves two purposes: first, it provides an SCPP-based framework of water rights markets that takes into account the particular structure of

  5. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N; Tischer, I

    2016-01-01

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed following a methodology based on design patterns that allows an improved experience when developing new algorithms. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms that are able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined. PMID:27323045

  6. Integrated Modeling, Mapping, and Simulation (IMMS) Framework for Exercise and Response Planning

    NASA Technical Reports Server (NTRS)

    Mapar, Jalal; Hoette, Trisha; Mahrous, Karim; Pancerella, Carmen M.; Plantenga, Todd; Yang, Christine; Yang, Lynn; Hopmeier, Michael

    2011-01-01

    Emergency management personnel at federal, state, and local levels can benefit from the increased situational awareness and operational efficiency afforded by simulation and modeling for emergency preparedness, including planning, training and exercises. To support this goal, the Department of Homeland Security's Science & Technology Directorate is funding the Integrated Modeling, Mapping, and Simulation (IMMS) program to create an integrating framework that brings together diverse models for use by the emergency response community. SUMMIT, one piece of the IMMS program, is the initial software framework that connects users such as emergency planners and exercise developers with modeling resources, bridging the gap in expertise and technical skills between these two communities. SUMMIT was recently deployed to support exercise planning for National Level Exercise 2010. Threat, casualty, infrastructure, and medical surge models were combined within SUMMIT to estimate health care resource requirements for the exercise ground truth.

  7. A framework for the design of a novel haptic-based medical training simulator.

    PubMed

    Tahmasebi, Amir M; Hashtrudi-Zaad, Keyvan; Thompson, David; Abolmaesumi, Purang

    2008-09-01

    This paper presents a framework for the design of a haptic-based medical ultrasound training simulator. The proposed simulator is composed of a PHANToM haptic device and a modular software package that allows for visual feedback and kinesthetic interactions between an operator and multimodality image databases. The system provides real-time ultrasound images in the same fashion as a typical ultrasound machine, enhanced with corresponding augmented computerized tomographic (CT) and/or MRI images. The proposed training system allows trainees to develop radiology techniques and knowledge of the patient's anatomy with minimum practice on live patients, or in places or at times when radiology devices or patients with rare cases may not be available. Low-level details of the software structure that can be migrated to other similar medical simulators are described. A preliminary human factors study, conducted on the prototype of the developed simulator, demonstrates the potential usage of the system for clinical training. PMID:18779081

  8. Microworlds, Simulators, and Simulation: Framework for a Benchmark of Human Reliability Data Sources

    SciTech Connect

    Ronald Boring; Dana Kelly; Carol Smidts; Ali Mosleh; Brian Dyre

    2012-06-01

    In this paper, we propose a method to improve the data basis of human reliability analysis (HRA) by extending the data sources used to inform HRA methods. Currently, most HRA methods are based on limited empirical data, and efforts to enhance the empirical basis behind HRA methods have not yet yielded significant new data. Part of the reason behind this shortage of quality data is attributable to the data sources used. Data have been derived from unrelated industries, from infrequent risk-significant events, or from costly control room simulator studies. We propose a benchmark of four data sources: a simplified microworld simulator using unskilled student operators, a full-scope control room simulator using skilled student operators, a full-scope control room simulator using licensed commercial operators, and a human performance modeling and simulation system using virtual operators. The goal of this research is to compare findings across the data sources to determine to what extent data may be used and generalized from cost effective sources.

  9. Simulation-based optimization framework for reuse of agricultural drainage water in irrigation.

    PubMed

    Allam, A; Tawfik, A; Yoshimura, C; Fleifle, A

    2016-05-01

    A simulation-based optimization framework for agricultural drainage water (ADW) reuse has been developed through the integration of a water quality model (QUAL2Kw) and a genetic algorithm. This framework was applied to the Gharbia drain in the Nile Delta, Egypt, in summer and winter 2012. First, the water quantity and quality of the drain was simulated using the QUAL2Kw model. Second, uncertainty analysis and sensitivity analysis based on Monte Carlo simulation were performed to assess QUAL2Kw's performance and to identify the most critical variables for determination of water quality, respectively. Finally, a genetic algorithm was applied to maximize the total reuse quantity from seven reuse locations with the condition not to violate the standards for using mixed water in irrigation. The water quality simulations showed that organic matter concentrations are critical management variables in the Gharbia drain. The uncertainty analysis showed the reliability of QUAL2Kw to simulate water quality and quantity along the drain. Furthermore, the sensitivity analysis showed that the 5-day biochemical oxygen demand, chemical oxygen demand, total dissolved solids, total nitrogen and total phosphorous are highly sensitive to point source flow and quality. Additionally, the optimization results revealed that the reuse quantities of ADW can reach 36.3% and 40.4% of the available ADW in the drain during summer and winter, respectively. These quantities meet 30.8% and 29.1% of the drainage basin requirements for fresh irrigation water in the respective seasons. PMID:26921569
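
    The optimization step can be pictured as a genetic algorithm searching over reuse quantities at the candidate locations, with a penalty whenever the simulated downstream quality would violate the irrigation standard. The self-contained Python sketch below replaces QUAL2Kw with a trivial mixing expression, so every number, bound and the constraint itself are assumptions made purely to show the GA mechanics, not results from the study.

        import numpy as np

        rng = np.random.default_rng(5)

        N_SITES = 7
        MAX_REUSE = np.array([4.0, 3.0, 5.0, 2.0, 6.0, 3.5, 4.5])   # per-site upper bounds (assumed units)
        BOD_LIMIT = 30.0                                            # assumed irrigation-quality limit (mg/L)

        def bod_after_reuse(q):
            """Stand-in for the water-quality model: more withdrawal concentrates the remaining BOD."""
            return 18.0 + 2.5 * q.sum() / MAX_REUSE.sum() * 7.0

        def fitness(q):
            penalty = max(0.0, bod_after_reuse(q) - BOD_LIMIT) * 100.0
            return q.sum() - penalty                                # maximize reuse, penalize violations

        pop = rng.uniform(0.0, MAX_REUSE, size=(60, N_SITES))       # initial population
        for gen in range(200):
            scores = np.array([fitness(ind) for ind in pop])
            parents = pop[np.argsort(scores)[-30:]]                 # truncation selection
            kids = []
            while len(kids) < len(pop):
                a, b = parents[rng.integers(30, size=2)]
                mask = rng.random(N_SITES) < 0.5                    # uniform crossover
                child = np.where(mask, a, b) + rng.normal(0.0, 0.1, N_SITES)   # Gaussian mutation
                kids.append(np.clip(child, 0.0, MAX_REUSE))
            pop = np.array(kids)

        best = pop[np.argmax([fitness(ind) for ind in pop])]
        print("total reuse:", round(best.sum(), 2), " BOD:", round(bod_after_reuse(best), 2))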

  10. An Agent-Based Model of Signal Transduction in Bacterial Chemotaxis

    PubMed Central

    Miller, Jameson; Parker, Miles; Bourret, Robert B.; Giddings, Morgan C.

    2010-01-01

    We report the application of agent-based modeling to examine the signal transduction network and receptor arrays for chemotaxis in Escherichia coli, which are responsible for regulating swimming behavior in response to environmental stimuli. Agent-based modeling is a stochastic and bottom-up approach, where individual components of the modeled system are explicitly represented, and bulk properties emerge from their movement and interactions. We present the Chemoscape model: a collection of agents representing both fixed membrane-embedded and mobile cytoplasmic proteins, each governed by a set of rules representing knowledge or hypotheses about their function. When the agents were placed in a simulated cellular space and then allowed to move and interact stochastically, the model exhibited many properties similar to the biological system including adaptation, high signal gain, and wide dynamic range. We found the agent based modeling approach to be both powerful and intuitive for testing hypotheses about biological properties such as self-assembly, the non-linear dynamics that occur through cooperative protein interactions, and non-uniform distributions of proteins in the cell. We applied the model to explore the role of receptor type, geometry and cooperativity in the signal gain and dynamic range of the chemotactic response to environmental stimuli. The model provided substantial qualitative evidence that the dynamic range of chemotactic response can be traced to both the heterogeneity of receptor types present, and the modulation of their cooperativity by their methylation state. PMID:20485527

  11. Using Uncertainty and Sensitivity Analyses in Socioecological Agent-Based Models to Improve Their Analytical Performance and Policy Relevance

    PubMed Central

    Ligmann-Zielinska, Arika; Kramer, Daniel B.; Spence Cheruvelil, Kendra; Soranno, Patricia A.

    2014-01-01

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system. PMID:25340764
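
    The variance decomposition step can be sketched with a Saltelli-style estimator of first-order Sobol indices: run the model on two independent input samples and on hybrid matrices that swap one column at a time, then attribute output variance to each input. The Python below applies this to a stand-in analytic "model" rather than an ABM, and uses plain Monte Carlo instead of quasi-random sampling, purely to illustrate the bookkeeping; the input names are hypothetical.

        import numpy as np

        rng = np.random.default_rng(2024)

        def model(x):
            """Stand-in for the ABM's aggregate output (e.g. conserved farmland after N steps)."""
            return 4.0 * x[:, 0] + x[:, 1] ** 2 + 0.3 * x[:, 2] * x[:, 0]

        def first_order_sobol(model, n_inputs, n_samples=20_000):
            A = rng.random((n_samples, n_inputs))
            B = rng.random((n_samples, n_inputs))
            yA, yB = model(A), model(B)
            var_y = np.var(np.concatenate([yA, yB]))
            indices = []
            for i in range(n_inputs):
                ABi = A.copy()
                ABi[:, i] = B[:, i]                      # swap the i-th column only
                yABi = model(ABi)
                s_i = np.mean(yB * (yABi - yA)) / var_y  # Saltelli-style first-order estimator
                indices.append(s_i)
            return indices

        for name, s in zip(["attitude", "price", "neighbours"], first_order_sobol(model, 3)):
            print(f"S1[{name}] = {s:.2f}")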

  12. Using uncertainty and sensitivity analyses in socioecological agent-based models to improve their analytical performance and policy relevance.

    PubMed

    Ligmann-Zielinska, Arika; Kramer, Daniel B; Spence Cheruvelil, Kendra; Soranno, Patricia A

    2014-01-01

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system. PMID:25340764

  13. A Two-Stage Multi-Agent Based Assessment Approach to Enhance Students' Learning Motivation through Negotiated Skills Assessment

    ERIC Educational Resources Information Center

    Chadli, Abdelhafid; Bendella, Fatima; Tranvouez, Erwan

    2015-01-01

    In this paper we present an Agent-based evaluation approach in a context of Multi-agent simulation learning systems. Our evaluation model is based on a two stage assessment approach: (1) a Distributed skill evaluation combining agents and fuzzy sets theory; and (2) a Negotiation based evaluation of students' performance during a training…

  14. A Framework for Simulation of Aircraft Flyover Noise Through a Non-Standard Atmosphere

    NASA Technical Reports Server (NTRS)

    Arntzen, Michael; Rizzi, Stephen A.; Visser, Hendrikus G.; Simons, Dick G.

    2012-01-01

    This paper describes a new framework for the simulation of aircraft flyover noise through a non-standard atmosphere. Central to the framework is a ray-tracing algorithm which defines multiple curved propagation paths, if the atmosphere allows, between the moving source and listener. Because each path has a different emission angle, synthesis of the sound at the source must be performed independently for each path. The time delay, spreading loss and absorption (ground and atmosphere) are integrated along each path, and applied to each synthesized aircraft noise source to simulate a flyover. A final step assigns each resulting signal to its corresponding receiver angle for the simulation of a flyover in a virtual reality environment. Spectrograms of the results from a straight path and a curved path modeling assumption are shown. When the aircraft is at close range, the straight path results are valid. Differences appear especially when the source is relatively far away at shallow elevation angles. These differences, however, are not significant in common sound metrics. While the framework used in this work performs off-line processing, it is conducive to real-time implementation.
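
    For the straight-path case, the per-path propagation effects reduce to a time delay, spherical spreading and an atmospheric absorption term accumulated along the path. The short Python sketch below computes reception time and level for a source flying past a ground listener; the absorption coefficient, source level and flight geometry are invented values, and ray curvature (the point of the framework above) is deliberately ignored.

        import numpy as np

        C_SOUND = 343.0          # m/s, assumed uniform atmosphere (no refraction, straight rays)
        ALPHA = 0.005            # dB/m, assumed broadband atmospheric absorption coefficient
        SOURCE_LEVEL = 140.0     # dB re 20 uPa at 1 m, assumed

        def received(t_emit, altitude=300.0, speed=80.0, offset=1000.0):
            """Emission at time t_emit from an aircraft on a straight, level pass.

            Returns (reception time, received level) for a listener at the origin,
            using spherical spreading (20 log10 r) and linear absorption (alpha * r).
            """
            x = -offset + speed * t_emit                 # along-track position at emission
            r = np.sqrt(x**2 + altitude**2)              # slant range to the listener
            t_receive = t_emit + r / C_SOUND             # propagation delay
            level = SOURCE_LEVEL - 20.0 * np.log10(r) - ALPHA * r
            return t_receive, level

        t_emit = np.linspace(0.0, 25.0, 6)
        for te, (tr, L) in zip(t_emit, map(received, t_emit)):
            print(f"emitted {te:5.1f} s  ->  heard {tr:6.2f} s  at {L:5.1f} dB")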

  15. A 3D MPI-Parallel GPU-accelerated framework for simulating ocean wave energy converters

    NASA Astrophysics Data System (ADS)

    Pathak, Ashish; Raessi, Mehdi

    2015-11-01

    We present an MPI-parallel GPU-accelerated computational framework for studying the interaction between ocean waves and wave energy converters (WECs). The computational framework captures the viscous effects, nonlinear fluid-structure interaction (FSI), and breaking of waves around the structure, which cannot be captured in many potential flow solvers commonly used for WEC simulations. The full Navier-Stokes equations are solved using the two-step projection method, which is accelerated by porting the pressure Poisson equation to GPUs. The FSI is captured using the numerically stable fictitious domain method. A novel three-phase interface reconstruction algorithm is used to resolve three phases in a VOF-PLIC context. A consistent mass and momentum transport approach enables simulations at high density ratios. The accuracy of the overall framework is demonstrated via an array of test cases. Numerical simulations of the interaction between ocean waves and WECs are presented. Funding from the National Science Foundation CBET-1236462 grant is gratefully acknowledged.

  16. Ximpol: a new X-ray polarimetry observation-simulation and analysis framework

    NASA Astrophysics Data System (ADS)

    Baldini, Luca; Muleri, Fabio; Soffitta, Paolo; Omodei, Nicola; Pesce-Rollins, Melissa; Sgro, Carmelo; Latronico, Luca; Spada, Francesca; Manfreda, Alberto; Di Lalla, Niccolo

    2016-07-01

    We present a new simulation framework, ximpol, based on the Python programming language and the Scipy stack, specifically developed for X-ray polarimetric applications. ximpol is designed to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating observations of astronomical sources, but also for developing and testing end-to-end analysis chains. In this contribution we shall give an overview of the basic architecture of the software. Although in principle the framework is not tied to any specific mission or instrument design, we shall present a few physically interesting case studies in the context of the XIPE mission phase study.

  17. A modelling framework to simulate foliar fungal epidemics using functional–structural plant models

    PubMed Central

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-01-01

    Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. Conclusions This study provides a framework for modelling a large number of pathosystems

  18. SMART: A New Semi-distributed Hydrologic Modelling Framework for Soil Moisture and Runoff Simulations

    NASA Astrophysics Data System (ADS)

    Ajami, Hoori; Sharma, Ashish

    2016-04-01

    A new GIS-based semi-distributed hydrological modelling framework is developed based upon the delineation of contiguous and topologically connected Hydrologic Response Units (HRUs). The Soil Moisture and Runoff simulation Toolkit (SMART) performs topographic and geomorphic analysis of a catchment and delineates HRUs in each first order sub-basin. This HRU delineation approach maintains lateral flow dynamics in first order sub-basins and therefore is suited for simulating runoff in upland catchments. Simulation elements in SMART are distributed cross sections or equivalent cross sections (ECS) in each first order sub-basin to represent hillslope hydrologic processes. Delineation of ECSs in SMART is performed by weighting the topographic and physiographic properties of part or all of a first-order sub-basin and has the advantage of reducing computational time/effort while maintaining reasonable accuracy in simulated hydrologic states and fluxes (e.g. soil moisture, evapotranspiration and runoff). The SMART workflow is written in MATLAB to automate the HRU and cross section delineations, model simulations across multiple cross sections, and post-processing of model outputs to visualize the results. The MATLAB Parallel Processing Toolbox is used for simultaneous simulations of cross sections, which further reduces computational time. The SMART workflow tasks are: 1) delineation of first order sub-basins of a catchment using a digital elevation model, 2) hillslope delineation, 3) landform delineation in every first order sub-basin based on topographic and geomorphic properties of a group of sub-basins or the entire catchment, 4) formulation of cross sections as well as equivalent cross sections in every first order sub-basin, and 5) deriving vegetation and soil parameters from spatially distributed land cover and soil information. The current version of SMART uses a 2-d distributed hydrological model based on the Richards' equation. However, any hydrologic model can be

  19. A framework for simulating ultrasound imaging based on first order nonlinear pressure-velocity relations.

    PubMed

    Du, Yigang; Fan, Rui; Li, Yong; Chen, Siping; Jensen, Jørgen Arendt

    2016-07-01

    An ultrasound imaging framework modeled with the first order nonlinear pressure-velocity relations (NPVR) and implemented by a half-time staggered solution and pseudospectral method is presented in this paper. The framework is capable of simulating linear and nonlinear ultrasound propagation and reflections in a heterogeneous medium with different sound speeds and densities. It can be initialized with arbitrary focus, excitation and apodization for multiple individual channels in both 2D and 3D spatial fields. The simulated channel data can be generated using this framework, and an ultrasound image can be obtained by beamforming the simulated channel data. Various results simulated by different algorithms are illustrated for comparison. The root mean square (RMS) errors for each compared pulse are calculated. The linear propagation is validated by an angular spectrum approach (ASA) with an RMS error of 3% at the focal point for a 2D field, and Field II with RMS errors of 0.8% and 1.5% at the electronic and the elevation focuses for 3D fields, respectively. The accuracy of the NPVR-based nonlinear propagation is investigated by comparing with the Abersim simulation for pulsed fields and with the nonlinear ASA for monochromatic fields. The RMS errors of the nonlinear pulses calculated by the NPVR and Abersim are respectively 2.4%, 7.4%, 17.6% and 36.6%, corresponding to initial pressure amplitudes of 50 kPa, 200 kPa, 500 kPa and 1 MPa at the transducer. By increasing the sampling frequency for the strong nonlinearity, the RMS error for the 1 MPa initial pressure amplitude is reduced from 36.6% to 27.3%. PMID:27107165

  20. The fractional volatility model: An agent-based interpretation

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2008-06-01

    Based on the criteria of mathematical simplicity and consistency with empirical market data, a model with volatility driven by fractional noise has been constructed which provides a fairly accurate mathematical parametrization of the data. Here, some features of the model are reviewed and extended to account for leverage effects. Using agent-based models, one tries to find which agent strategies and (or) properties of the financial institutions might be responsible for the features of the fractional volatility model.
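
    A minimal way to reproduce the model's key ingredient, volatility driven by fractional noise, is to generate fractional Gaussian noise from the exact fBm increment covariance and feed its running sum into a log-volatility process. The Python sketch below does this by Cholesky factorization for a short series; the Hurst exponent and scale parameters are arbitrary choices for illustration, not calibrated values from the paper, and the dynamics are a simplified stand-in for the published model.

        import numpy as np

        def fractional_gaussian_noise(n, hurst, rng):
            """Exact (Cholesky-based) sample of fractional Gaussian noise of length n."""
            k = np.arange(n)
            lags = np.abs(k[:, None] - k[None, :])
            cov = 0.5 * ((lags + 1.0) ** (2 * hurst) - 2.0 * lags ** (2 * hurst)
                         + np.abs(lags - 1.0) ** (2 * hurst))
            return np.linalg.cholesky(cov) @ rng.standard_normal(n)

        rng = np.random.default_rng(8)
        n, dt = 1000, 1.0 / 252                       # ~4 trading years of daily steps (assumed)
        hurst, vol_scale, base_vol = 0.8, 0.4, 0.2    # assumed model parameters

        fgn = fractional_gaussian_noise(n, hurst, rng)
        sigma = base_vol * np.exp(vol_scale * np.cumsum(fgn) * dt ** hurst)   # volatility driven by fBm
        log_ret = sigma * np.sqrt(dt) * rng.standard_normal(n)                # returns with stochastic volatility
        price = 100.0 * np.exp(np.cumsum(log_ret))

        print("annualised volatility of returns: %.2f" % (log_ret.std() / np.sqrt(dt)))
        print("final simulated price: %.2f" % price[-1])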

  1. Exploring Tradeoffs in Demand-side and Supply-side Management of Urban Water Resources using Agent-based Modeling and Evolutionary Computation

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Berglund, E. Z.

    2015-12-01

    Urban water supply systems may be managed through supply-side and demand-side strategies, which focus on water source expansion and demand reductions, respectively. Supply-side strategies bear infrastructure and energy costs, while demand-side strategies bear costs of implementation and inconvenience to consumers. To evaluate the performance of demand-side strategies, the participation and water use adaptations of consumers should be simulated. In this study, a Complex Adaptive Systems (CAS) framework is developed to simulate consumer agents that change their consumption to affect the withdrawal from the water supply system, which, in turn, influences operational policies and long-term resource planning. Agent-based models are encoded to represent consumers and a policy maker agent and are coupled with water resources system simulation models. The CAS framework is coupled with an evolutionary computation-based multi-objective methodology to explore tradeoffs in cost, inconvenience to consumers, and environmental impacts for both supply-side and demand-side strategies. Decisions are identified to specify storage levels in a reservoir that trigger (1) increases in the volume of water pumped through inter-basin transfers from an external reservoir and (2) drought stages, which restrict the volume of water that is allowed for residential outdoor uses. The proposed methodology is demonstrated for the Arlington, Texas, water supply system to identify non-dominated strategies for an historic drought decade. Results demonstrate that pumping costs associated with maximizing environmental reliability exceed pumping costs associated with minimizing restrictions on consumer water use.

  2. DELPHES 3: a modular framework for fast simulation of a generic collider experiment

    NASA Astrophysics Data System (ADS)

    de Favereau, J.; Delaere, C.; Demin, P.; Giammanco, A.; Lemaître, V.; Mertens, A.; Selvaggi, M.

    2014-02-01

    Version 3.0 of the Delphes fast simulation is presented. The goal of Delphes is to allow the simulation of a multipurpose detector for phenomenological studies. The simulation includes a track propagation system embedded in a magnetic field, electromagnetic and hadron calorimeters, and a muon identification system. Physics objects that can be used for data analysis are then reconstructed from the simulated detector response. These include tracks and calorimeter deposits and high-level objects such as isolated electrons, jets, taus, and missing energy. The new modular approach allows for greater flexibility in the design of the simulation and reconstruction sequence. New features such as the particle-flow reconstruction approach, crucial in the first years of the LHC, and pile-up simulation and mitigation, which is needed for the simulation of the LHC detectors in the near future, have also been implemented. The Delphes framework is not meant to be used for advanced detector studies, for which more accurate tools are needed. Although some aspects of Delphes are hadron collider specific, it is flexible enough to be adapted to the needs of electron-positron collider experiments.

  3. An Object-Oriented Finite Element Framework for Multiphysics Phase Field Simulations

    SciTech Connect

    Michael R Tonks; Derek R Gaston; Paul C Millett; David Andrs; Paul Talbot

    2012-01-01

    The phase field approach is a powerful and popular method for modeling microstructure evolution. In this work, advanced numerical tools are used to create a phase field framework that facilitates rapid model development. This framework, called MARMOT, is based on Idaho National Laboratory's finite element Multiphysics Object-Oriented Simulation Environment. In MARMOT, the system of phase field partial differential equations (PDEs) is solved simultaneously with PDEs describing additional physics, such as solid mechanics and heat conduction, using the Jacobian-Free Newton Krylov Method. An object-oriented architecture is created by taking advantage of commonalities in phase field models to facilitate development of new models with very little written code. In addition, MARMOT provides access to mesh and time step adaptivity, reducing the cost of performing simulations with large disparities in both spatial and temporal scales. In this work, phase separation simulations are used to show the numerical performance of MARMOT. Deformation-induced grain growth and void growth simulations are included to demonstrate the multiphysics capability.

  4. A Simulation Framework for Quantitative Validation of Artefact Correction in Diffusion MRI.

    PubMed

    Graham, Mark S; Drobnjak, Ivana; Zhang, Hui

    2015-01-01

    In this paper we demonstrate a simulation framework that enables the direct and quantitative comparison of post-processing methods for diffusion weighted magnetic resonance (DW-MR) images. DW-MR datasets are employed in a range of techniques that enable estimates of local microstructure and global connectivity in the brain. These techniques require full alignment of images across the dataset, but this is rarely the case. Artefacts such as eddy-current (EC) distortion and motion lead to misalignment between images, which compromise the quality of the microstructural measures obtained from them. Numerous methods and software packages exist to correct these artefacts, some of which have become de-facto standards, but none have been subject to rigorous validation. The ultimate aim of these techniques is improved image alignment, yet in the literature this is assessed using either qualitative visual measures or quantitative surrogate metrics. Here we introduce a simulation framework that allows for the direct, quantitative assessment of techniques, enabling objective comparisons of existing and future methods. DW-MR datasets are generated using a process that is based on the physics of MRI acquisition, which allows for the salient features of the images and their artefacts to be reproduced. We demonstrate the application of this framework by testing one of the most commonly used methods for EC correction, registration of DWIs to b = 0, and reveal the systematic bias this introduces into corrected datasets. PMID:26221709

  5. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    SciTech Connect

    Becker, Jillian; Bridge, Pete; Brown, Elizabeth; Lusk, Ryan; Ferrari-Anderson, Janet

    2015-06-15

    Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.

  6. Genetic Algorithms for Agent-Based Infrastructure Interdependency Modeling and Analysis

    SciTech Connect

    May Permann

    2007-03-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, electric power, telecommunication, and financial networks. This paper describes initial research combining agent-based infrastructure modeling software and genetic algorithms (GAs) to help optimize infrastructure protection and restoration decisions. This research proposes to apply GAs to the problem of infrastructure modeling and analysis in order to determine the optimum assets to restore or protect from attack or other disaster. This research is just commencing and therefore the focus of this paper is the integration of a GA optimization method with a simulation through the simulation’s agents.

  7. Exploring walking differences by socioeconomic status using a spatial agent-based model

    PubMed Central

    Yang, Yong; Diez Roux, Ana V.; Auchincloss, Amy H.; Rodriguez, Daniel A.; Brown, Daniel G.

    2012-01-01

    We use an exploratory agent-based model of adults’ walking behavior within a city to examine the possible impact of interventions on socioeconomic differences in walking. Simulated results show that for persons of low socioeconomic status, increases in walking resulting from increases in their positive attitude towards walking may diminish over time if other features of the environment are not conducive to walking. Similarly, improving the safety level for the lower SES neighborhoods may be effective in increasing walking; however, the magnitude of its effectiveness varies by level of land use mix, such that the effects of safety are greatest when persons live in areas with a large mix of uses. PMID:22243911

  8. An agent-based computational model of the spread of tuberculosis

    NASA Astrophysics Data System (ADS)

    de Espíndola, Aquino L.; Bauch, Chris T.; Troca Cabella, Brenno C.; Souto Martinez, Alexandre

    2011-05-01

    In this work we propose an alternative model of the spread of tuberculosis (TB) and the emergence of drug resistance due to treatment with antibiotics. We implement the simulations using an agent-based computational approach in which the spatial structure is taken into account. The spread of tuberculosis occurs according to probabilities defined by the interactions among individuals. The model was validated by reproducing results already known from the literature in which different treatment regimes yield the emergence of drug resistance. The different patterns of TB spread can be visualized at any point of the system's evolution. The implementation details as well as some results of this alternative approach are discussed.
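
    For readers unfamiliar with the approach, the sketch below is a minimal, purely illustrative spatial agent-based infection model in Python (it is not the authors' model and omits treatment and drug resistance): individuals sit on a lattice and infection spreads to neighbours with a fixed probability per time step.

        import random

        SIZE, P_INFECT, STEPS = 50, 0.05, 100

        grid = [['S'] * SIZE for _ in range(SIZE)]   # 'S' susceptible, 'I' infected
        grid[SIZE // 2][SIZE // 2] = 'I'             # seed a single infected individual

        def neighbours(i, j):
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                yield (i + di) % SIZE, (j + dj) % SIZE   # periodic boundaries

        for _ in range(STEPS):
            new_grid = [row[:] for row in grid]
            for i in range(SIZE):
                for j in range(SIZE):
                    if grid[i][j] == 'I':
                        for ni, nj in neighbours(i, j):
                            if grid[ni][nj] == 'S' and random.random() < P_INFECT:
                                new_grid[ni][nj] = 'I'
            grid = new_grid

        print(sum(row.count('I') for row in grid), "infected after", STEPS, "steps")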

  9. Diffusion and Aggregation in an Agent Based Model of Stock Market Fluctuations

    NASA Astrophysics Data System (ADS)

    Castiglione, Filippo

    We describe a new model to simulate the dynamic interactions between market price and the decisions of two different kinds of traders. They possess spatial mobility, allowing them to group together to form coalitions. Each coalition follows a strategy chosen from a proportional voting ``dominated'' by a leader's decision. The interplay of both kinds of agents gives rise to complex price dynamics that are consistent with the main stylized facts of financial time series. The present model incorporates many features of other known models and is meant to be the first step toward the construction of an agent-based model that uses more realistic market rules, strategies, and information structures.

  10. Agent Based Modeling of Collaboration and Work Practices Onboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Acquisti, Alessandro; Sierhuis, Maarten; Clancey, William J.; Bradshaw, Jeffrey M.; Shaffo, Mike (Technical Monitor)

    2002-01-01

    The International Space Station is one of the most complex projects ever, with numerous interdependent constraints affecting productivity and crew safety. This requires planning years before crew expeditions, and the use of sophisticated scheduling tools. Human work practices, however, are difficult to study and represent within traditional planning tools. We present an agent-based model and simulation of the activities and work practices of astronauts onboard the ISS, based on an agent-oriented approach. The model represents 'a day in the life' of the ISS crew and is developed in Brahms, an agent-oriented, activity-based language used to model knowledge in situated action and learning in human activities.

  11. The epitheliome: agent-based modelling of the social behaviour of cells.

    PubMed

    Walker, D C; Southgate, J; Hill, G; Holcombe, M; Hose, D R; Wood, S M; Mac Neil, S; Smallwood, R H

    2004-01-01

    We have developed a new computational modelling paradigm for predicting the emergent behaviour resulting from the interaction of cells in epithelial tissue. As proof-of-concept, an agent-based model, in which there is a one-to-one correspondence between biological cells and software agents, has been coupled to a simple physical model. Behaviour of the computational model is compared with the growth characteristics of epithelial cells in monolayer culture, using growth media with low and physiological calcium concentrations. Results show a qualitative fit between the growth characteristics produced by the simulation and the in vitro cell models. PMID:15351133

  12. Exploring walking differences by socioeconomic status using a spatial agent-based model.

    PubMed

    Yang, Yong; Diez Roux, Ana V; Auchincloss, Amy H; Rodriguez, Daniel A; Brown, Daniel G

    2012-01-01

    We use an exploratory agent-based model of adults' walking behavior within a city to examine the possible impact of interventions on socioeconomic differences in walking. Simulated results show that for persons of low socioeconomic status, increases in walking resulting from increases in their positive attitude towards walking may diminish over time if other features of the environment are not conducive to walking. Similarly, improving the safety level for the lower SES neighborhoods may be effective in increasing walking; however, the magnitude of its effectiveness varies by level of land use mix, such that the effects of safety are greatest when persons live in areas with a large mix of uses. PMID:22243911

  13. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations

    PubMed Central

    Hahne, Jan; Helias, Moritz; Kunkel, Susanne; Igarashi, Jun; Bolten, Matthias; Frommer, Andreas; Diesmann, Markus

    2015-01-01

    Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy in the presence of gap junctions, we present benchmarks for workstations, clusters, and supercomputers. Finally, we discuss limitations of the novel technology. PMID:26441628
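
    Schematically (generic notation, not reproduced from the paper), a Jacobi-type waveform relaxation for gap-junction coupling iterates over whole waveforms on a communication interval: each neuron i is integrated using the membrane potentials of its coupled neighbours from the previous iteration,

        \frac{dV_i^{(k+1)}}{dt} = f_i\!\left(V_i^{(k+1)}, t\right) + \sum_{j} g_{ij}\left(V_j^{(k)}(t) - V_i^{(k+1)}(t)\right),

    and the iteration over k is repeated until the waveforms converge, which keeps the scheme compatible with communicating only at intervals set by the minimum synaptic delay.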

  14. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations.

    PubMed

    Hahne, Jan; Helias, Moritz; Kunkel, Susanne; Igarashi, Jun; Bolten, Matthias; Frommer, Andreas; Diesmann, Markus

    2015-01-01

    Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy in the presence of gap junctions, we present benchmarks for workstations, clusters, and supercomputers. Finally, we discuss limitations of the novel technology. PMID:26441628

  15. Introducing FACETS, the Framework Application for Core-Edge Transport Simulations

    SciTech Connect

    Cary, John R.; Candy, Jeff; Cohen, Ronald H.; Krasheninnikov, Sergei I; McCune, Douglas C; Estep, Donald J; Larson, Jay W; Malony, Allen; Worley, Patrick H; Carlsson, Johann Anders; Hakim, A H; Hamill, P; Kruger, Scott E; Muzsala, S; Pletzer, Alexander; Shasharina, Svetlana; Wade-Stein, D; Wang, N; McInnes, Lois C; Wildey, T; Casper, T. A.; Diachin, Lori A; Epperly, Thomas; Rognlien, T. D.; Fahey, Mark R; Kuehn, Jeffery A; Morris, A; Shende, Sameer; Feibush, E; Hammett, Gregory W; Indireshkumar, K; Ludescher, C; Randerson, L; Stotler, D.; Pigarov, A; Bonoli, P.; Chang, C S; D'Ippolito, D. A.; Colella, Philip; Keyes, David E; Bramley, R; Myra, J. R.

    2007-06-01

    The FACETS (Framework Application for Core-Edge Transport Simulations) project began in January 2007 with the goal of providing core-to-wall transport modeling of a tokamak fusion reactor. This involves coupling previously separate computations for the core, edge, and wall regions. Such a coupling is primarily through connection regions of lower dimensionality. The project has started developing a component-based coupling framework to bring together models for each of these regions. In the first year, the core model will be a 1-dimensional model (1D transport across flux surfaces coupled to a 2D equilibrium) with fixed equilibrium. The initial edge model will be the fluid model, UEDGE, but inclusion of kinetic models is planned for the out years. The project also has an embedded Scientific Application Partnership that is examining embedding a full-scale turbulence model for obtaining the cross-surface fluxes into a core transport code.

  16. The ADAQ framework: An integrated toolkit for data acquisition and analysis with real and simulated radiation detectors

    NASA Astrophysics Data System (ADS)

    Hartwig, Zachary S.

    2016-04-01

    The ADAQ framework is a collection of software tools that is designed to streamline the acquisition and analysis of radiation detector data produced in modern digital data acquisition (DAQ) systems and in Monte Carlo detector simulations. The purpose of the framework is to maximize user scientific productivity by minimizing the effort and expertise required to fully utilize radiation detectors in a variety of scientific and engineering disciplines. By using a single set of tools to span the real and simulation domains, the framework eliminates redundancy and provides an integrated workflow for high-fidelity comparison between experimental and simulated detector performance. Built on the ROOT data analysis framework, the core of the ADAQ framework is a set of C++ and Python libraries that enable high-level control of digital DAQ systems and detector simulations with data stored into standardized binary ROOT files for further analysis. Two graphical user interface programs utilize the libraries to create powerful tools: ADAQAcquisition handles control and readout of real-world DAQ systems and ADAQAnalysis provides data analysis and visualization methods for experimental and simulated data. At present, the ADAQ framework supports digital DAQ hardware from CAEN S.p.A. and detector simulations performed in Geant4; however, the modular design will facilitate future extension to other manufacturers and simulation platforms.

  17. DDG4 A Simulation Framework based on the DD4hep Detector Description Toolkit

    NASA Astrophysics Data System (ADS)

    Frank, M.; Gaede, F.; Nikiforou, N.; Petric, M.; Sailer, A.

    2015-12-01

    The detector description is an essential component that has to be used to analyse and simulate data resulting from particle collisions in high energy physics experiments. Based on the DD4hep detector description toolkit, a flexible and data-driven simulation framework was designed using the Geant4 toolkit. We present this framework and describe the guiding requirements and the architectural design, which was strongly driven by ease of use. The goal was, given an existing detector description, to simulate the detector response to particle collisions in high energy physics experiments with minimal effort, while not imposing restrictions that would preclude enhanced or improved behaviour. Starting from the ROOT-based geometry implementation used by DD4hep, an automatic conversion mechanism to Geant4 was developed. The physics response and the mechanism to input particle data from generators were highly formalized and can be instantiated on demand using known factory patterns. A palette of components to model the detector response is provided by default, but improved or more sophisticated components may easily be added using the factory pattern. Only the final configuration of the instantiated components has to be provided by end-users, using either C++ or Python scripting or an XML-based description.

  18. The Application of Modeling and Simulation in Capacity Management within the ITIL Framework

    NASA Technical Reports Server (NTRS)

    Rahmani, Sonya; vonderHoff, Otto

    2010-01-01

    Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best-practices framework for managing IT infrastructure, development and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition of M&S implementation within the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass two or more predictive modeling techniques, 2) complement each technique's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. The structuring of two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts will be used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.

  19. Is the Person-Situation Debate Important for Agent-Based Modeling and Vice-Versa?

    PubMed Central

    Sznajd-Weron, Katarzyna; Szwabiński, Janusz; Weron, Rafał

    2014-01-01

    Background Agent-based models (ABM) are believed to be a very powerful tool in the social sciences, sometimes even treated as a substitute for social experiments. When building an ABM we have to define the agents and the rules governing the artificial society. Given the complexity and our limited understanding of the human nature, we face the problem of assuming that either personal traits, the situation or both have impact on the social behavior of agents. However, as the long-standing person-situation debate in psychology shows, there is no consensus as to the underlying psychological mechanism and the important question that arises is whether the modeling assumptions we make will have a substantial influence on the simulated behavior of the system as a whole or not. Methodology/Principal Findings Studying two variants of the same agent-based model of opinion formation, we show that the decision to choose either personal traits or the situation as the primary factor driving social interactions is of critical importance. Using Monte Carlo simulations (for Barabasi-Albert networks) and analytic calculations (for a complete graph) we provide evidence that assuming a person-specific response to social influence at the microscopic level generally leads to a completely different and less realistic aggregate or macroscopic behavior than an assumption of a situation-specific response; a result that has been reported by social psychologists for a range of experimental setups, but has been downplayed or ignored in the opinion dynamics literature. Significance This sensitivity to modeling assumptions has far reaching consequences also beyond opinion dynamics, since agent-based models are becoming a popular tool among economists and policy makers and are often used as substitutes of real social experiments. PMID:25369531
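
    The contrast at issue can be made concrete with a toy simulation. The sketch below is an illustrative Python example, not the authors' model: in the situation-specific (annealed) variant any agent may occasionally respond independently of social influence, whereas in the person-specific (quenched) variant a fixed subset of agents always does.

        import random

        N, P_INDEP, STEPS = 1000, 0.2, 50000

        def simulate(person_specific):
            opinions = [1] * N                         # start from full consensus
            fixed_independents = set(random.sample(range(N), int(P_INDEP * N)))
            for _ in range(STEPS):
                i = random.randrange(N)
                if person_specific:
                    independent = i in fixed_independents        # quenched trait
                else:
                    independent = random.random() < P_INDEP      # annealed situation
                if independent:
                    opinions[i] = random.choice([-1, 1])          # ignore social influence
                else:
                    opinions[i] = opinions[random.randrange(N)]   # copy a random agent
            return sum(opinions) / N                   # final average opinion

        print("situation-specific (annealed):", simulate(False))
        print("person-specific (quenched):  ", simulate(True))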

  20. Exploring complex dynamics in multi agent-based intelligent systems: Theoretical and experimental approaches using the Multi Agent-based Behavioral Economic Landscape (MABEL) model

    NASA Astrophysics Data System (ADS)

    Alexandridis, Konstantinos T.

    This dissertation adopts a holistic and detailed approach to modeling spatially explicit agent-based artificial intelligent systems, using the Multi Agent-based Behavioral Economic Landscape (MABEL) model. The research questions it addresses stem from the need to understand and analyze the real-world patterns and dynamics of land use change from a coupled human-environmental systems perspective. It describes the systemic, mathematical, statistical, socio-economic and spatial dynamics of the MABEL modeling framework, and provides a wide array of cross-disciplinary modeling applications within the research, decision-making and policy domains. It establishes the symbolic properties of the MABEL model as a Markov decision process, analyzes the decision-theoretic utility and optimization attributes of agents towards comprising statistically and spatially optimal policies and actions, and explores the probabilistic character of the agents' decision-making and inference mechanisms via the use of Bayesian belief and decision networks. It develops and describes a Monte Carlo methodology for experimental replications of agents' decisions regarding complex spatial parcel acquisition and learning. It recognizes the gap in spatially explicit accuracy assessment techniques for complex spatial models, and proposes an ensemble of statistical tools designed to address this problem. Advanced information assessment techniques such as the receiver operating characteristic curve, the impurity entropy and Gini functions, and Bayesian classification functions are proposed. The theoretical foundation for modular Bayesian inference in spatially explicit multi-agent artificial intelligent systems, and the ensembles of cognitive and scenario assessment modular tools built for the MABEL model, are provided. The dissertation emphasizes modularity and robustness as valuable qualitative modeling attributes, and examines the role of robust intelligent modeling as a tool for improving policy-decisions related to land

  1. The dynamic information architecture system : an advanced simulation framework for military and civilian applications.

    SciTech Connect

    Campbell, A. P.; Hummel, J. R.

    1998-01-08

    DIAS, the Dynamic Information Architecture System, is an object-oriented simulation system that was designed to provide an integrating framework in which new or legacy software applications can operate in a context-driven frame of reference. DIAS provides a flexible and extensible mechanism to allow disparate, and mixed language, software applications to interoperate. DIAS captures the dynamic interplay between different processes or phenomena in the same frame of reference. Finally, DIAS accommodates a broad range of analysis contexts, with widely varying spatial and temporal resolutions and fidelity.

  2. Towards a unified framework for coarse-graining particle-based simulations.

    SciTech Connect

    Junghans, Christoph

    2012-06-28

    Different coarse-graining techniques for soft matter systems have been developed in recent years; however, it is often very demanding to find the method most suitable for the problem studied. For this reason we began to develop the VOTCA toolkit to allow for easy comparison of different methods. We have incorporated six different techniques into the package and implemented a powerful and parallel analysis framework plus multiple simulation back-ends. We will discuss the specifics of the package by means of various studies, which have been performed with the toolkit, and highlight problems we encountered along the way.

  3. Review of Molecular Simulations of Methane Storage in Metal-Organic Frameworks.

    PubMed

    Lee, Seung-Joon; Bae, Youn-Sang

    2016-05-01

    Methane storage in porous materials is a topic of great interest because it could replace dangerous high-pressure compressed natural gas (CNG) tanks in natural gas vehicles. Among the diverse adsorbents, metal-organic frameworks (MOFs) are considered to be promising due to their extremely high surface areas and low crystal densities. Molecular simulation has been considered an important tool for finding an appropriate MOF for methane storage. We review several important roles of molecular modeling in studies of methane adsorption in MOFs. PMID:27483748

  4. Managing simulation-based training: A framework for optimizing learning, cost, and time

    NASA Astrophysics Data System (ADS)

    Richmond, Noah Joseph

    This study provides a management framework for optimizing training programs for learning, cost, and time when using simulation based training (SBT) and reality based training (RBT) as resources. Simulation is shown to be an effective means for implementing activity substitution as a way to reduce risk. The risk profiles of 22 US Air Force vehicles are calculated, and the potential risk reduction is calculated under the assumption of perfect substitutability of RBT and SBT. Methods are subsequently developed to relax the assumption of perfect substitutability. The transfer effectiveness ratio (TER) concept is defined and modeled as a function of the quality of the simulator used, and the requirements of the activity trained. The Navy F/A-18 is then analyzed in a case study illustrating how learning can be maximized subject to constraints in cost and time, and also subject to the decision maker's preferences for the proportional and absolute use of simulation. Solution methods for optimizing multiple activities across shared resources are next provided. Finally, a simulation strategy including an operations planning program (OPP), an implementation program (IP), an acquisition program (AP), and a pedagogical research program (PRP) is detailed. The study provides the theoretical tools to understand how to leverage SBT, a case study demonstrating these tools' efficacy, and a set of policy recommendations to enable the US military to better utilize SBT in the future.
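
    For reference, the transfer effectiveness ratio is commonly written in the classical form below (generic notation, not quoted from the study): the reality-based training time saved per unit of simulator time,

        \mathrm{TER} = \frac{T_{\mathrm{RBT}}^{\,0} - T_{\mathrm{RBT}}^{\,S}}{T_{\mathrm{SBT}}}

    where T_RBT^0 is the reality-based training time needed without simulation, T_RBT^S the time needed after simulator training, and T_SBT the simulator time used.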

  5. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    The following reports are presented on this project: a first-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; a second-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  6. Direct numerical simulation of rigid bodies in multiphase flow within an Eulerian framework

    NASA Astrophysics Data System (ADS)

    Rauschenberger, P.; Weigand, B.

    2015-06-01

    A new method is presented to simulate rigid body motion in the Volume-of-Fluid based multiphase code Free Surface 3D. The specific feature of the new method is that it works within an Eulerian framework without the need for a Lagrangian representation of rigid bodies. Several test cases are shown to prove the validity of the numerical scheme. The technique is able to conserve the shape of arbitrarily shaped rigid bodies and predict terminal velocities of rigid spheres. The instability of a falling ellipsoid is captured. Multiple rigid bodies, including collisions, may be considered using only one Volume-of-Fluid variable, which makes it possible to simulate the drafting, kissing and tumbling phenomena of two rigid spheres. The method can easily be extended to rigid bodies undergoing phase change processes.

  7. Simulating collisions of thick nuclei in the color glass condensate framework

    NASA Astrophysics Data System (ADS)

    Gelfand, Daniil; Ipp, Andreas; Müller, David

    2016-07-01

    We present our work on the simulation of the early stages of heavy-ion collisions with finite longitudinal thickness in the laboratory frame in 3 +1 dimensions. In particular we study the effects of nuclear thickness on the production of a glasma state in the McLerran-Venugopalan model within the color glass condensate framework. A finite thickness enables us to describe nuclei at lower energies, but forces us to abandon boost invariance. As a consequence, random classical color sources within the nuclei have to be included in the simulation, which is achieved by using the colored particle-in-cell method. We show that the description in the laboratory frame agrees with boost-invariant approaches as a limiting case. Furthermore we investigate collisions beyond boost invariance, in particular the pressure anisotropy in the glasma.

  8. A proposed simulation optimization model framework for emergency department problems in public hospital

    NASA Astrophysics Data System (ADS)

    Ibrahim, Ireen Munira; Liong, Choong-Yeun; Bakar, Sakhinah Abu; Ahmad, Norazura; Najmuddin, Ahmad Farid

    2015-12-01

    The Emergency Department (ED) is a very complex system with limited resources to support increases in demand. ED services are considered to be of good quality if they can meet patients' expectations. Long waiting times and lengths of stay are the main problems faced by management. ED management should place greater emphasis on resource capacity in order to increase the quality of services and, in turn, patient satisfaction. This paper reviews work in progress on a study being conducted in a government hospital in Selangor, Malaysia. It proposes a simulation optimization model framework for studying ED operations and problems as well as finding optimal solutions to those problems. It is hoped that the integration of simulation and optimization can assist management in decision making regarding resource capacity planning in order to improve current and future ED operations.

  9. CRUSDE: A plug-in based simulation framework for composable CRUstal DEformation studies using Green's functions

    NASA Astrophysics Data System (ADS)

    Grapenthin, R.

    2014-01-01

    CRUSDE is a plug-in based simulation framework written in C/C++ for Linux platforms (installation information, download and test cases: http://www.grapenthin.org/crusde). It utilizes Green's functions for simulations of the Earth's response to changes in surface loads. Such changes could involve, for example, melting glaciers, oscillating snow loads, or lava flow emplacement. The focus in the simulation could be the response of the Earth's crust in terms of stress changes, changes in strain rates, or simply uplift or subsidence and the respective horizontal displacements of the crust (over time). Rather than implementing a variety of specific models, CRUSDE approaches crustal deformation problems from a general formulation in which model elements (Green's function, load function, relaxation function, load history), operators, pre- and postprocessors, as well as input and output routines are independent, exchangeable, and reusable on the basis of a plug-in approach (shared libraries loaded at runtime). We derive the general formulation CRUSDE is based on, describe its architecture and use, and demonstrate its capabilities in a test case. With CRUSDE users can: (1) dynamically select software components to participate in a simulation (through XML experiment definitions), (2) extend the framework independently with new software components and reuse existing ones, and (3) exchange software components and experiment definitions with other users. CRUSDE's plug-in mechanism aims for straightforward extensibility, allowing modelers to add new Earth models and response functions. Current Green's function implementations include surface displacements due to the elastic response, final relaxed response, and pure thick plate response for a flat Earth. These can be combined to express exponential decay from elastic to final relaxed response, displacement rates due to one or multiple disks, irregular loads, or a combination of these. Each load can have its own load history and
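
    Schematically (generic notation, not quoted from the abstract), the core operation of such a Green's function framework is the convolution of a surface load with an Earth response function, e.g. for vertical displacement:

        u_z(\mathbf{x}, t) = \int_{A} G_z(\mathbf{x} - \mathbf{x}') \, L(\mathbf{x}', t) \, \mathrm{d}A'

    where L is the surface load distribution and G_z the Green's function of the chosen Earth model; relaxation functions and load histories enter through additional convolutions in time.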

  10. A Probabilistic Framework for the Validation and Certification of Computer Simulations

    NASA Technical Reports Server (NTRS)

    Ghanem, Roger; Knio, Omar

    2000-01-01

    The paper presents a methodology for quantifying, propagating, and managing the uncertainty in the data required to initialize computer simulations of complex phenomena. The purpose of the methodology is to permit the quantitative assessment of a certification level to be associated with the predictions from the simulations, as well as the design of a data acquisition strategy to achieve a target level of certification. The value of a methodology that can address the above issues is obvious, especially in light of the trend in the availability of computational resources, as well as the trend in sensor technology. These two trends make it possible to probe physical phenomena both with physical sensors and with complex models, at previously inconceivable levels. With these new abilities arises the need to develop the knowledge to integrate the information from sensors and computer simulations. This is achieved in the present work by tracing both activities back to a level of abstraction that highlights their commonalities, thus allowing them to be manipulated in a mathematically consistent fashion. In particular, the mathematical theory underlying computer simulations has long been associated with partial differential equations and functional analysis concepts such as Hilbert spaces and orthogonal projections. By relying on a probabilistic framework for the modeling of data, a Hilbert space framework emerges that permits the modeling of coefficients in the governing equations as random variables, or equivalently, as elements in a Hilbert space. This permits the development of an approximation theory for probabilistic problems that parallels that of deterministic approximation theory. According to this formalism, the solution of the problem is identified by its projection on a basis in the Hilbert space of random variables, as opposed to more traditional techniques where the solution is approximated by its first or second-order statistics. The present
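
    A standard way to realize such a projection is a polynomial chaos expansion of the solution on an orthogonal basis of the Hilbert space of random variables (generic notation, shown here only for illustration):

        u(x, \theta) \approx \sum_{i=0}^{P} u_i(x) \, \Psi_i\big(\boldsymbol{\xi}(\theta)\big)

    where the Psi_i are orthogonal polynomials in the random variables xi and the deterministic coefficients u_i(x) are obtained by Galerkin projection of the governing equations.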

  11. Parallel kinetic Monte Carlo simulation framework incorporating accurate models of adsorbate lateral interactions

    SciTech Connect

    Nielsen, Jens; D’Avezac, Mayeul; Hetherington, James; Stamatakis, Michail

    2013-12-14

    Ab initio kinetic Monte Carlo (KMC) simulations have been successfully applied for over two decades to elucidate the underlying physico-chemical phenomena on the surfaces of heterogeneous catalysts. These simulations necessitate detailed knowledge of the kinetics of elementary reactions constituting the reaction mechanism, and the energetics of the species participating in the chemistry. The information about the energetics is encoded in the formation energies of gas and surface-bound species, and the lateral interactions between adsorbates on the catalytic surface, which can be modeled at different levels of detail. The majority of previous works accounted for only pairwise-additive first nearest-neighbor interactions. More recently, cluster-expansion Hamiltonians incorporating long-range interactions and many-body terms have been used for detailed estimations of catalytic rate [C. Wu, D. J. Schmidt, C. Wolverton, and W. F. Schneider, J. Catal. 286, 88 (2012)]. In view of the increasing interest in accurate predictions of catalytic performance, there is a need for general-purpose KMC approaches incorporating detailed cluster expansion models for the adlayer energetics. We have addressed this need by building on the previously introduced graph-theoretical KMC framework, and we have developed Zacros, a FORTRAN2003 KMC package for simulating catalytic chemistries. To tackle the high computational cost in the presence of long-range interactions we introduce parallelization with OpenMP. We further benchmark our framework by simulating a KMC analogue of the NO oxidation system established by Schneider and co-workers [J. Catal. 286, 88 (2012)]. We show that taking into account only first nearest-neighbor interactions may lead to large errors in the prediction of the catalytic rate, whereas for accurate estimates thereof, one needs to include long-range terms in the cluster expansion.
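
    Schematically, a cluster-expansion Hamiltonian for the adlayer energetics has the form below (generic notation, not copied from the paper):

        H(\boldsymbol{\sigma}) = \sum_{i} E_{i}^{\mathrm{f}} \, \sigma_i + \sum_{\langle i,j \rangle} J_{ij} \, \sigma_i \sigma_j + \sum_{\langle i,j,k \rangle} J_{ijk} \, \sigma_i \sigma_j \sigma_k + \cdots

    where sigma_i are site occupations, E_i^f are formation energies of adsorbed species, and the J terms are pairwise and many-body lateral interaction parameters; truncating after the first-nearest-neighbour pair term recovers the simpler energetic models against which the long-range expansion is benchmarked.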

  12. A multi-fidelity framework for physics based rotor blade simulation and optimization

    NASA Astrophysics Data System (ADS)

    Collins, Kyle Brian

    with lower fidelity models. This thesis documents the development of automated low and high fidelity physics based rotor simulation frameworks. The low fidelity framework uses a comprehensive code with simplified aerodynamics. The high fidelity model uses a parallel processor capable CFD/CSD methodology. Both low and high fidelity frameworks include an aeroacoustic simulation for prediction of noise. A synergistic process is developed that uses both the low and high fidelity frameworks together to build approximate models of important high fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high fidelity (CFD/CSD) experiments and 400+ low fidelity experiments. Optimization using the approximation models found the Pareto Frontier anchor points, or the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high fidelity database. This provides evidence that the process proposed has merit. Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis have made several novel contributions. First, an automated high fidelity CFD based forward flight simulation framework has been built for use in preliminary design optimization. The framework was built around an integrated, parallel processor capable CFD/CSD/AA process. Second, a novel method of

  13. An advanced object-based software framework for complex ecosystem modeling and simulation

    SciTech Connect

    Sydelko, P. J.; Dolph, J. E.; Majerus, K. A.; Taxon, T. N.

    2000-06-29

    Military land managers and decision makers face an ever increasing challenge to balance maximum flexibility for the mission with a diverse set of multiple land use, social, political, and economic goals. In addition, these goals encompass environmental requirements for maintaining ecosystem health and sustainability over the long term. Spatiotemporal modeling and simulation in support of adaptive ecosystem management can be best accomplished through a dynamic, integrated, and flexible approach that incorporates scientific and technological components into a comprehensive ecosystem modeling framework. The Integrated Dynamic Landscape Analysis and Modeling System (IDLAMS) integrates ecological models and decision support techniques through a geographic information system (GIS)-based backbone. Recently, an object-oriented (OO) architectural framework was developed for IDLAMS (OO-IDLAMS). This OO-IDLAMS prototype builds upon and leverages the Dynamic Information Architecture System (DIAS) developed by Argonne National Laboratory. DIAS is an object-based architectural framework that affords a more integrated, dynamic, and flexible approach to comprehensive ecosystem modeling than was possible with the GIS-based integration approach of the original IDLAMS. The flexibility, dynamics, and interoperability demonstrated through this case study of an object-oriented approach have the potential to provide key technology solutions for many of the military's multiple-use goals and needs for integrated natural resource planning and ecosystem management.

  14. Social network analysis and agent-based modeling in social epidemiology

    PubMed Central

    2012-01-01

    The past five years have seen a growth in the interest in systems approaches in epidemiologic research. These approaches may be particularly appropriate for social epidemiology. Social network analysis and agent-based models (ABMs) are two approaches that have been used in the epidemiologic literature. Social network analysis involves the characterization of social networks to yield inference about how network structures may influence risk exposures among those in the network. ABMs can promote population-level inference from explicitly programmed, micro-level rules in simulated populations over time and space. In this paper, we discuss the implementation of these models in social epidemiologic research, highlighting the strengths and weaknesses of each approach. Network analysis may be ideal for understanding social contagion, as well as the influences of social interaction on population health. However, network analysis requires network data, which may sacrifice generalizability, and causal inference from current network analytic methods is limited. ABMs are uniquely suited for the assessment of health determinants at multiple levels of influence that may couple with social interaction to produce population health. ABMs allow for the exploration of feedback and reciprocity between exposures and outcomes in the etiology of complex diseases. They may also provide the opportunity for counterfactual simulation. However, appropriate implementation of ABMs requires a balance between mechanistic rigor and model parsimony, and the precision of output from complex models is limited. Social network and agent-based approaches are promising in social epidemiology, but continued development of each approach is needed. PMID:22296660

  15. Effect of individual protective behaviors on influenza transmission: an agent-based model.

    PubMed

    Karimi, Elnaz; Schmitt, Ketra; Akgunduz, Ali

    2015-09-01

    It is well established in the epidemiological literature that individual behaviors have a significant effect on the spread of infectious diseases. Agent-based models are increasingly being recognized as the next generation of epidemiological models. In this research, we use the ability of agent-based models to incorporate behavior into simulations by examining the relative importance of vaccination and social distancing, two common measures for controlling the spread of infectious diseases, with respect to seasonal influenza. We modeled health behavior using the results of a Health Belief Model (HBM) study focused on influenza. We considered a control and a treatment group to explore the effect of education on people's health-related behavior patterns. The control group reflects the behavioral patterns of students based on their general knowledge of influenza and its interventions, while the treatment group illustrates the level of behavioral change after individuals have been educated by a health care expert. The results of this study indicate that self-initiated behaviors are successful in controlling an outbreak in a high-contact-rate location such as a university. Self-initiated behaviors resulted in a 17% decrease in the population attack rate and a 25% reduction in the peak number of cases. The simulation also provides significant evidence for the effect of an HBM theory-based educational program in increasing the uptake of the target interventions (vaccination by 22% and social distancing by 41%) and consequently controlling the outbreak. PMID:25578039
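
    The mechanism can be illustrated with a deliberately simplified Python sketch (not the authors' model; the baseline uptake rates below are hypothetical, and only the increments of 22% and 41% are taken from the abstract): vaccinated agents cannot be infected, distancing agents halve their contacts, and the resulting attack rates are compared for the two behavior scenarios.

        import random

        def attack_rate(p_vacc, p_dist, n=2000, contacts=10, p_trans=0.03, days=60):
            """Toy well-mixed outbreak: vaccinated agents are immune, distancing
            agents halve their daily contacts. Illustrative only."""
            vaccinated = [random.random() < p_vacc for _ in range(n)]
            distancing = [random.random() < p_dist for _ in range(n)]
            infected = [False] * n
            infected[0] = True                                   # index case
            for _ in range(days):
                newly = []
                for i in range(n):
                    if not infected[i]:
                        continue
                    k = contacts // 2 if distancing[i] else contacts
                    for _ in range(k):
                        j = random.randrange(n)
                        if not infected[j] and not vaccinated[j] and random.random() < p_trans:
                            newly.append(j)
                for j in newly:
                    infected[j] = True
            return sum(infected) / n

        # Hypothetical baselines; increments follow the abstract's reported changes.
        print("control:  ", attack_rate(p_vacc=0.30, p_dist=0.20))
        print("educated: ", attack_rate(p_vacc=0.30 + 0.22, p_dist=0.20 + 0.41))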

  16. A Component-Based FPGA Design Framework for Neuronal Ion Channel Dynamics Simulations

    PubMed Central

    Mak, Terrence S. T.; Rachmuth, Guy; Lam, Kai-Pui; Poon, Chi-Sang

    2008-01-01

    Neuron-machine interfaces such as dynamic clamp and brain-implantable neuroprosthetic devices require real-time simulations of neuronal ion channel dynamics. Field Programmable Gate Array (FPGA) has emerged as a high-speed digital platform ideal for such application-specific computations. We propose an efficient and flexible component-based FPGA design framework for neuronal ion channel dynamics simulations, which overcomes certain limitations of the recently proposed memory-based approach. A parallel processing strategy is used to minimize computational delay, and a hardware-efficient factoring approach for calculating exponential and division functions in neuronal ion channel models is used to conserve resource consumption. Performances of the various FPGA design approaches are compared theoretically and experimentally in corresponding implementations of the AMPA and NMDA synaptic ion channel models. Our results suggest that the component-based design framework provides a more memory economic solution as well as more efficient logic utilization for large word lengths, whereas the memory-based approach may be suitable for time-critical applications where a higher throughput rate is desired. PMID:17190033
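
    For context, the ion channel dynamics that such hardware must evaluate in real time are typically of the Hodgkin-Huxley gating form (a generic example, not the specific AMPA/NMDA kinetics of the paper), whose voltage-dependent rate functions are the source of the exponential and division operations mentioned above:

        \frac{dm}{dt} = \alpha_m(V)\,(1 - m) - \beta_m(V)\,m, \qquad I = \bar{g}\, m^{p} h^{q} \,(V - E_{\mathrm{rev}})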

  17. Multiscale Simulation as a Framework for the Enhanced Design of Nanodiamond-Polyethylenimine-based Gene Delivery

    PubMed Central

    Kim, Hansung; Man, Han Bin; Saha, Biswajit; Kopacz, Adrian M.; Lee, One-Sun; Schatz, George C.; Ho, Dean; Liu, Wing Kam

    2012-01-01

    Nanodiamonds (NDs) are emerging carbon platforms with promise as gene/drug delivery vectors for cancer therapy. Specifically, NDs functionalized with the polymer polyethylenimine (PEI) can transfect small interfering RNAs (siRNA) in vitro with high efficiency and low cytotoxicity. Here we present a modeling framework to accurately guide the design of ND-PEI gene platforms and elucidate binding mechanisms between ND, PEI, and siRNA. This is among the first ND simulations to comprehensively account for ND size, charge distribution, surface functionalization, and graphitization. The simulation results are compared with our experimental results both for PEI loading onto NDs and for siRNA (C-myc) loading onto ND-PEI for various mixing ratios. Remarkably, the model is able to predict loading trends and saturation limits for PEI and siRNA, while confirming the essential role of ND surface functionalization in mediating ND-PEI interactions. These results demonstrate that this robust framework can be a powerful tool in ND platform development, with the capacity to realistically treat other nanoparticle systems. PMID:23304428

  18. Global Simulation of Bioenergy Crop Productivity: Analytical framework and Case Study for Switchgrass

    SciTech Connect

    Nair, S. Surendran; Nichols, Jeff A. {Cyber Sciences}; Post, Wilfred M; Wang, Dali; Wullschleger, Stan D; Kline, Keith L; Wei, Yaxing; Singh, Nagendra; Kang, Shujiang

    2014-01-01

    Contemporary global assessments of the deployment potential and sustainability aspects of biofuel crops lack quantitative details. This paper describes an analytical framework capable of meeting the challenges associated with global-scale agro-ecosystem modeling. We designed a modeling platform for bioenergy crops, consisting of five major components: (i) standardized global natural resources and management data sets, (ii) global simulation units and management scenarios, (iii) model calibration and validation, (iv) high-performance computing (HPC) modeling, and (v) simulation output processing and analysis. A case study with the HPC-Environmental Policy Integrated Climate model (HPC-EPIC), simulating a perennial bioenergy crop, switchgrass (Panicum virgatum L.), and a global biomass feedstock analysis on grassland, demonstrates the application of this platform. The results illustrate the biomass feedstock variability of switchgrass and provide insights into how the modeling platform can be expanded to better assess sustainable production criteria and other biomass crops. Feedstock potentials on global grasslands and within different countries are also shown. Future efforts involve developing databases of productivity, implementing global simulations for other bioenergy crops (e.g., miscanthus, energy cane and agave), and assessing environmental impacts under various management regimes. We anticipate that this platform will provide an exemplary tool and assessment data for international communities to conduct global analyses of biofuel biomass feedstocks and sustainability.

  19. Global Simulation of Bioenergy Crop Productivity: Analytical Framework and Case Study for Switchgrass

    SciTech Connect

    Kang, Shujiang; Kline, Keith L; Nair, S. Surendran; Nichols, Dr Jeff A; Post, Wilfred M; Brandt, Craig C; Wullschleger, Stan D; Wei, Yaxing; Singh, Nagendra

    2013-01-01

    A global energy crop productivity model that provides geospatially explicit quantitative details on biomass potential and factors affecting sustainability would be useful, but does not exist now. This study describes a modeling platform capable of meeting many challenges associated with global-scale agro-ecosystem modeling. We designed an analytical framework for bioenergy crops consisting of six major components: (i) standardized natural resources datasets, (ii) global field-trial data and crop management practices, (iii) simulation units and management scenarios, (iv) model calibration and validation, (v) high-performance computing (HPC) simulation, and (vi) simulation output processing and analysis. The HPC-Environmental Policy Integrated Climate (HPC-EPIC) model simulated a perennial bioenergy crop, switchgrass (Panicum virgatum L.), estimating feedstock production potentials and effects across the globe. This modeling platform can assess soil C sequestration, net greenhouse gas (GHG) emissions, nonpoint source pollution (e.g., nutrient and pesticide loss), and energy exchange with the atmosphere. It can be expanded to include additional bioenergy crops (e.g., miscanthus, energy cane, and agave) and food crops under different management scenarios. The platform and switchgrass field-trial dataset are available to support global analysis of biomass feedstock production potential and corresponding metrics of sustainability.

  20. A parallel framework for the FE-based simulation of knee joint motion.

    PubMed

    Wawro, Martin; Fathi-Torbaghan, Madjid

    2004-08-01

    We present an object-oriented framework for the finite-element (FE)-based simulation of human knee joint motion. The FE model of the knee joint is acquired from the patients in vivo by using magnetic resonance imaging. The MRI images are converted into a three-dimensional model and finally an all-hexahedral mesh for the FE analysis is generated. The simulation environment uses nonlinear finite-element analysis (FEA) and is capable of handling contact within the model in order to capture the complex rolling/sliding motion of the knee joint. The software strictly follows object-oriented concepts of software engineering in order to guarantee maximum extensibility and maintainability. The final goal of this work-in-progress is the creation of a computer-based biomechanical model of the knee joint which can be used in a variety of applications, ranging from prosthesis design and treatment planning (e.g., optimal reconstruction of ruptured ligaments) over surgical simulation to impact computations in crashworthiness simulations. PMID:15311837

  1. A GIS/Simulation Framework for Assessing Change in Water Yield over Large Spatial Scales

    SciTech Connect

    Graham, R.; Hargrove, W.W.; Huff, D.D.; Nikolov, N.; Tharp, M.L.

    1999-11-13

    Recent legislation to initiate vegetation management in the Central Sierra hydrologic region of California includes a focus on corresponding changes in water yield. This served as the impetus for developing a combined geographic information system (GIS) and simulation assessment framework. Using the existing vegetation density condition, together with proposed rules for thinning to reduce fire risk, a set of simulation model inputs was generated for examining the impact of the thinning scenario on water yield. The approach allows results to be expressed as the mean and standard deviation of change in water yield for each 1 km2 map cell that is treated. Values for groups of cells are aggregated for typical watershed units using area-weighted averaging. Wet, dry and average precipitation years were simulated over a large region. Where snow plays an important role in hydrologic processes, the simulated change in water yield was less than 0.5% of expected annual runoff for a typical watershed. Such small changes would be undetectable in the field using conventional stream flow analysis. These results suggest that use of water yield increases to help justify forest-thinning activities or offset their cost will be difficult.

  2. Self-consistent simulation of the RF sheath boundary condition in the BOUT++ framework

    NASA Astrophysics Data System (ADS)

    Gui, Bin; Xu, Xueqiao; Xia, Tianyang

    2015-11-01

    The effect of the RF sheath boundary condition on edge-localized modes and turbulent transport is simulated in this work. The work includes two parts. The first part is to calculate the equilibrium radial electric field with the RF sheath boundary condition. It is known that the thermal sheath or the rectified RF sheath will modify the potential in the SOL region. The modified potential induces additional shear flow in the SOL. In this part, the equilibrium radial electric field across the separatrix is calculated by solving the 2D current continuity equation with the sheath boundary condition, drifts and viscosity. The second part is applying the sheath boundary condition to the perturbed variables of the six-field two-fluid model in the BOUT++ framework. The six-field two-fluid model simulates the ELMs and turbulent transport. The sheath boundary condition is applied in this model with the aim of simulating its effect on the turbulent transport. It is found that the sheath boundary acts as a sink in the plasma and suppresses the local perturbation. Based on these two parts, the effect of the RF sheath boundary condition on the ELMs and turbulent transport can be simulated self-consistently. Prepared by LLNL under Contract DE-AC52-07NA27344.

  3. Spatial and Temporal Simulation of Human Evolution. Methods, Frameworks and Applications

    PubMed Central

    Benguigui, Macarena; Arenas, Miguel

    2014-01-01

    Analyses of human evolution are fundamental to understanding the current gradients of human diversity. In this regard, genetic samples collected from current populations, together with archaeological data, are the most important resources for studying human evolution. However, they are often insufficient to properly evaluate a variety of evolutionary scenarios, leading to continuous debates and discussions. A commonly applied strategy consists of using computer simulations, based on evolutionary models that are as realistic as possible, to evaluate alternative evolutionary scenarios through statistical correlations with the real data. Computer simulations can also be applied to estimate evolutionary parameters or to study the role of each parameter in the evolutionary process. Here we review the most commonly used methods and evolutionary frameworks for performing realistic spatially explicit computer simulations of human evolution. Although we focus on human evolution, most of the methods and software we describe can also be used to study other species. We also describe the importance of considering spatially explicit models to better mimic human evolutionary scenarios based on a variety of phenomena such as range expansions, range shifts, range contractions, sex-biased dispersal, long-distance dispersal or admixtures of populations. We finally discuss future implementations to improve current spatially explicit simulations and their derived applications in human evolution. PMID:25132795

  4. SimPEG: An open-source framework for geophysical simulations and inverse problems

    NASA Astrophysics Data System (ADS)

    Cockett, R.; Kang, S.; Heagy, L.

    2014-12-01

    Geophysical surveys are powerful tools for obtaining information about the subsurface. Inverse modelling provides a mathematical framework for constructing a model of physical property distributions that is consistent with the data collected by these surveys. The geosciences are increasingly moving towards the integration of geological, geophysical, and hydrological information to better characterize the subsurface. This integration must span disciplines and is not only challenging scientifically, but the inconsistencies between conventions often make implementations complicated, non-reproducible, or inefficient. We have developed an open source software package for Simulation and Parameter Estimation in Geophysics (SimPEG), which provides a generalized framework for solving geophysical forward and inverse problems. SimPEG is written entirely in Python with minimal dependencies in the hope that it can be used both as a research tool and for education. SimPEG includes finite volume discretizations on structured and unstructured meshes, interfaces to standard numerical solver packages, convex optimization algorithms, model parameterizations, and tailored visualization routines. The framework is modular and object-oriented, which promotes real-time experimentation and combination of geophysical problems and inversion methodologies. In this presentation, we will highlight a few geophysical examples, including direct-current resistivity and electromagnetics, and discuss some of the challenges and successes we encountered in developing a flexible and extensible framework. Throughout the development of SimPEG we have focused on simplicity, usability, documentation, and extensive testing. By embracing a fully open source development paradigm, we hope to encourage reproducible research, cooperation, and communication to help tackle some of the inherently multidisciplinary problems that face integrated geophysical methods.
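    Illustrative sketch (not SimPEG's actual API): the kind of regularized least-squares inversion such a framework automates, written here as a generic Tikhonov example in plain NumPy. The forward operator G, the data, and the regularization weight are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(0)

      # Invented linear forward problem d = G m + noise (a smoothing kernel).
      n_data, n_model = 20, 50
      G = np.exp(-0.1 * (np.arange(n_data)[:, None]
                         - np.linspace(0, n_data, n_model)[None, :]) ** 2)
      m_true = np.zeros(n_model)
      m_true[20:30] = 1.0
      d_obs = G @ m_true + 0.01 * rng.standard_normal(n_data)

      # Tikhonov-regularized least squares: minimize ||G m - d||^2 + beta ||m||^2.
      beta = 1e-2
      m_rec = np.linalg.solve(G.T @ G + beta * np.eye(n_model), G.T @ d_obs)
      print("model misfit:", np.linalg.norm(m_rec - m_true))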

  5. URDME: a modular framework for stochastic simulation of reaction-transport processes in complex geometries

    PubMed Central

    2012-01-01

    Background: Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. Results: We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to mature external geometry and mesh handling software (Comsol Multiphysics) provides a stable and interactive environment for model construction. The core simulation routines are logically separated from the model-building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface, which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much like an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. Since the overall design effectively hides the complexity of managing the geometry and meshes, newly developed methods may be tested in a realistic
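    Illustrative sketch (not URDME's interface): a toy Gillespie-style stochastic simulation in the spirit of the reaction-diffusion master equation, with one species diffusing between two well-mixed voxels while being degraded. The rate constants, voxel count, and initial copy numbers are invented; real URDME models use unstructured meshes and their own model-building tools.

      import numpy as np

      rng = np.random.default_rng(1)
      x = np.array([50, 0])        # copy numbers of species A in two voxels
      k_diff, k_deg = 1.0, 0.1     # diffusion jump rate and degradation rate (invented)
      t, t_end = 0.0, 20.0

      while t < t_end:
          # Propensities: jump voxel 0->1, jump 1->0, degradation in each voxel.
          a = np.array([k_diff * x[0], k_diff * x[1], k_deg * x[0], k_deg * x[1]])
          a0 = a.sum()
          if a0 == 0:
              break
          t += rng.exponential(1.0 / a0)            # time to the next event
          event = rng.choice(4, p=a / a0)           # which event fires
          if event == 0:
              x += [-1, 1]
          elif event == 1:
              x += [1, -1]
          elif event == 2:
              x[0] -= 1
          else:
              x[1] -= 1

      print("final copy numbers:", x)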

  6. Excellent approach to modeling urban expansion by fuzzy cellular automata: agent-based model

    NASA Astrophysics Data System (ADS)

    Khajavigodellou, Yousef; Alesheikh, Ali A.; Mohammed, Abdulrazak A. S.; Chapi, Kamran

    2014-09-01

    Recently, the interaction between humans and their environment has become one of the important challenges in the world. Land use/cover change (LUCC) is a complex process that includes actors and factors at different social and spatial levels. The complexity and dynamics of urban systems make the practice of urban modeling very difficult. With increased computational power and the greater availability of spatial data, micro-simulation approaches such as agent-based and cellular automata methods have been developed by geographers, planners, and scholars, and have shown great potential for representing and simulating the complexity of the dynamic processes involved in urban growth and land use change. This paper presents fuzzy cellular automata (FCA) within geospatial information systems and remote sensing to simulate and predict urban expansion patterns. These FCA-based dynamic spatial urban models provide an improved ability to forecast and assess future urban growth and to create planning scenarios, allowing us to explore the potential impacts of simulations that correspond to urban planning and management policies. In this fuzzy-inference-guided cellular automata approach, semantic or linguistic knowledge about land use change is expressed as fuzzy rules, from which fuzzy inference determines the urban development potential of each pixel. The model integrates an ABM (agent-based model) and FCA to investigate a complex decision-making process and future urban dynamics. Based on this model, rapid development and green land protection under the influence of the behaviors and decision modes of regional authority agents, real estate developer agents, resident agents and non-resident agents, and their interactions, have been applied to predict the future development patterns of the Erbil metropolitan region.
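    Illustrative sketch (not the authors' model): a highly simplified fuzzy-rule-guided cellular automata update in which each cell's development potential combines neighbourhood density and a suitability layer, and cells develop when the potential is high. All grids, membership functions, and thresholds below are invented.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 50
      urban = (rng.random((n, n)) < 0.05).astype(float)   # initial urban seed cells
      suitability = rng.random((n, n))                    # invented suitability layer

      def neighbour_density(grid):
          """Mean of the 8-neighbourhood, computed with periodic wrap for brevity."""
          shifts = [(i, j) for i in (-1, 0, 1) for j in (-1, 0, 1) if (i, j) != (0, 0)]
          return sum(np.roll(np.roll(grid, i, 0), j, 1) for i, j in shifts) / 8.0

      for _ in range(10):                                 # ten CA iterations
          density = neighbour_density(urban)
          # "Fuzzy" development potential: minimum of two membership degrees.
          potential = np.minimum(density, suitability)
          urban = np.maximum(urban, (potential > 0.3).astype(float))

      print("urban fraction after 10 steps:", urban.mean())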

  7. Application of agent-based system for bioprocess description and process improvement.

    PubMed

    Gao, Ying; Kipling, Katie; Glassey, Jarka; Willis, Mark; Montague, Gary; Zhou, Yuhong; Titchener-Hooker, Nigel J

    2010-01-01

    Modeling plays an important role in bioprocess development for design and scale-up. Predictive models can also be used in biopharmaceutical manufacturing to assist decision-making either to maintain process consistency or to identify optimal operating conditions. To predict the whole bioprocess performance, the strong interactions present in a processing sequence must be adequately modeled. Traditionally, bioprocess modeling considers process units separately, which makes it difficult to capture the interactions between units. In this work, a systematic framework is developed to analyze the bioprocesses based on a whole process understanding and considering the interactions between process operations. An agent-based approach is adopted to provide a flexible infrastructure for the necessary integration of process models. This enables the prediction of overall process behavior, which can then be applied during process development or once manufacturing has commenced, in both cases leading to the capacity for fast evaluation of process improvement options. The multi-agent system comprises a process knowledge base, process models, and a group of functional agents. In this system, agent components co-operate with each other in performing their tasks. These include the description of the whole process behavior, evaluating process operating conditions, monitoring of the operating processes, predicting critical process performance, and providing guidance to decision-making when coping with process deviations. During process development, the system can be used to evaluate the design space for process operation. During manufacture, the system can be applied to identify abnormal process operation events and then to provide suggestions as to how best to cope with the deviations. In all cases, the function of the system is to ensure an efficient manufacturing process. The implementation of the agent-based approach is illustrated via selected application scenarios, which

  8. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    SciTech Connect

    Estep, Donald

    2014-01-17

    This is the final report for the Colorado State University Component of the FACETS Project. FACETS was focused on the development of a multiphysics, parallel framework application that could provide the capability to enable whole-device fusion reactor modeling and, in the process, develop the modeling infrastructure and computational understanding needed for ITER. FACETS was intended to be highly flexible, through the use of modern computational methods including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, and to make it possible to use simplified models for rapid turnaround or high-fidelity models that take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can take advantage of parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is affected by a number of factors, including scale differences, the form of information transferred between processes, the implementation of solvers in different codes, and high performance computing concerns. Operator decomposition, in which the individual processes are computed using appropriate simulation codes and the component simulations are then linked/synchronized at regular points in space and time, is the de facto approach to high performance simulation of multiphysics

  9. Agent-based modeling: a new approach for theory building in social psychology.

    PubMed

    Smith, Eliot R; Conrey, Frederica R

    2007-02-01

    Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture the types of complex, dynamic, interactive processes that are so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more about and applying the ABM approach. PMID:18453457

  10. Re-Examining of Moffitt’s Theory of Delinquency through Agent Based Modeling

    PubMed Central

    Leaw, Jia Ning; Ang, Rebecca P.; Huan, Vivien S.; Chan, Wei Teng; Cheong, Siew Ann

    2015-01-01

    Moffitt’s theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence-limited group and the life-course-persistent group, predetermined at a young age, and that social interactions between these two groups become important during the adolescent years. We built an agent-based model based on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves, and indeed found the two groups emerging in our simulations. Moreover, through an intervention simulation in which we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping the life-course outcome. PMID:26062022
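    Illustrative sketch (not the authors' implementation): a toy version of the imitation mechanism, in which each agent carries an antisocial-behaviour level and imitates a randomly met agent whose behaviour yields a higher payoff. The pairing rule, payoff function, and maturity-gap term are invented stand-ins for the model's actual rules.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200
      behaviour = rng.random(n)            # antisocial behaviour level in [0, 1]
      maturity_gap = rng.random(n)         # invented maturity-gap attribute

      def payoff(b, gap):
          # Invented payoff: reward of antisocial behaviour rises with the maturity
          # gap, while a cost term grows with the behaviour itself.
          return gap * b - 0.6 * b ** 2

      for _ in range(10000):
          i, j = rng.integers(n, size=2)   # randomly paired interacting agents
          if payoff(behaviour[j], maturity_gap[i]) > payoff(behaviour[i], maturity_gap[i]):
              behaviour[i] += 0.5 * (behaviour[j] - behaviour[i])   # partial imitation

      print("mean behaviour level:", behaviour.mean())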

  11. An agent-based interaction model for Chinese personal income distribution

    NASA Astrophysics Data System (ADS)

    Zou, Yijiang; Deng, Weibing; Li, Wei; Cai, Xu

    2015-10-01

    The personal income distribution in China was studied by employing the data from China Household Income Projects (CHIP) between 1990 and 2002. It was observed that the low and middle income regions could be described by the log-normal law, while the large income region could be well fitted by the power law. To characterize these empirical findings, a stochastic interactive model with mean-field approach was discussed, and the analytic result shows that the wealth distribution is of the Pareto type. Then we explored the agent-based model on networks, in which the exchange of wealth among agents depends on their connectivity. Numerical results suggest that the wealth of agents would largely rely on their connectivity, and the Pareto index of the simulated wealth distributions is comparable to those of the empirical data. The Pareto behavior of the tails of the empirical wealth distributions is consistent with that of the 'mean-field' model, as well as numerical simulations.
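    Illustrative sketch (not the authors' model): a standard kinetic wealth-exchange simulation on a random network with agent-specific saving propensities, a mechanism known to produce Pareto-like tails. The network, saving rule, and parameters are placeholders for the connectivity-dependent exchange rule studied in the record above.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 500
      wealth = np.ones(n)
      saving = rng.random(n)                    # agent-specific saving propensity

      # Invented random network: a few random neighbours per agent (self skipped below).
      neighbours = [rng.choice(n, size=4, replace=False) for _ in range(n)]

      for _ in range(100000):
          i = rng.integers(n)
          j = rng.choice(neighbours[i])
          if j == i:
              continue
          pooled = (1 - saving[i]) * wealth[i] + (1 - saving[j]) * wealth[j]
          eps = rng.random()                    # random split of the pooled wealth
          wealth[i] = saving[i] * wealth[i] + eps * pooled
          wealth[j] = saving[j] * wealth[j] + (1 - eps) * pooled

      print("top 1% wealth share:", np.sort(wealth)[-n // 100:].sum() / wealth.sum())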

  12. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery

    PubMed Central

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level. PMID:26963526

  13. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery.

    PubMed

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level. PMID:26963526

  14. Re-Examining of Moffitt's Theory of Delinquency through Agent Based Modeling.

    PubMed

    Leaw, Jia Ning; Ang, Rebecca P; Huan, Vivien S; Chan, Wei Teng; Cheong, Siew Ann

    2015-01-01

    Moffitt's theory of delinquency suggests that at-risk youths can be divided into two groups, the adolescence-limited group and the life-course-persistent group, predetermined at a young age, and that social interactions between these two groups become important during the adolescent years. We built an agent-based model based on the microscopic interactions Moffitt described: (i) a maturity gap that dictates (ii) the cost and reward of antisocial behavior, and (iii) agents imitating the antisocial behaviors of others more successful than themselves, and indeed found the two groups emerging in our simulations. Moreover, through an intervention simulation in which we moved selected agents from one social network to another, we also found that the social network plays an important role in shaping the life-course outcome. PMID:26062022

  15. Understanding virulence mechanisms in M. tuberculosis infection via a circuit-based simulation framework.

    SciTech Connect

    May, Elebeoba Eni; Oprea, Tudor I.; Joo, Jaewook; Misra, Milind; Leitao, Andrei; Faulon, Jean-Loup Michel

    2008-08-01

    Tuberculosis (TB), caused by the bacterium Mycobacterium tuberculosis (Mtb), is a growing international health crisis. Mtb is able to persist in host tissues in a non-replicating persistent (NRP) or latent state. This presents a challenge in the treatment of TB. Latent TB can reactivate in 10% of individuals with normal immune systems, and at higher rates in those with compromised immune systems. A quantitative understanding of latency-associated virulence mechanisms may help researchers develop more effective methods to battle the spread of TB and reduce TB-associated fatalities. Leveraging BioXyce's ability to simulate whole-cell and multi-cellular systems, we are developing a circuit-based framework to investigate the impact of pathogenicity-associated pathways on the latency/reactivation phase of tuberculosis infection. We discuss efforts to simulate metabolic pathways that potentially impact the ability of Mtb to persist within host immune cells. We demonstrate how simulation studies can provide insight regarding the efficacy of potential anti-TB agents on biological networks critical to Mtb pathogenicity using a systems chemical biology approach.

  16. GeNN: a code generation framework for accelerated brain simulations.

    PubMed

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/. PMID:26740369

  17. A framework for stochastic simulations and visualization of biological electron-transfer dynamics

    NASA Astrophysics Data System (ADS)

    Nakano, C. Masato; Byun, Hye Suk; Ma, Heng; Wei, Tao; El-Naggar, Mohamed Y.

    2015-08-01

    Electron transfer (ET) dictates a wide variety of energy-conversion processes in biological systems. Visualizing ET dynamics could provide key insight into understanding and possibly controlling these processes. We present a computational framework named VizBET to visualize biological ET dynamics, using an outer-membrane Mtr-Omc cytochrome complex in Shewanella oneidensis MR-1 as an example. Starting from X-ray crystal structures of the constituent cytochromes, molecular dynamics simulations are combined with homology modeling, protein docking, and binding free energy computations to sample the configuration of the complex as well as the change of the free energy associated with ET. This information, along with quantum-mechanical calculations of the electronic coupling, provides inputs to kinetic Monte Carlo (KMC) simulations of ET dynamics in a network of heme groups within the complex. Visualization of the KMC simulation results has been implemented as a plugin to the Visual Molecular Dynamics (VMD) software. VizBET has been used to reveal the nature of ET dynamics associated with novel nonequilibrium phase transitions in a candidate configuration of the Mtr-Omc complex due to electron-electron interactions.
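    Illustrative sketch (not part of VizBET): a toy kinetic Monte Carlo simulation of electron hopping along a chain of heme-like sites, the kind of dynamics the framework above visualizes. The chain length and hopping rates are invented; in the real workflow the rates come from the free energy and electronic-coupling calculations described in the abstract.

      import numpy as np

      rng = np.random.default_rng(5)
      n_sites = 6
      k_fwd, k_bwd = 5.0, 1.0          # invented forward/backward hopping rates (1/ns)

      site, t = 0, 0.0
      while site < n_sites - 1:
          rates, moves = [], []
          if site > 0:
              rates.append(k_bwd)
              moves.append(site - 1)
          rates.append(k_fwd)
          moves.append(site + 1)
          rates = np.array(rates)
          total = rates.sum()
          t += rng.exponential(1.0 / total)                 # KMC time increment
          site = moves[rng.choice(len(moves), p=rates / total)]

      print(f"electron reached the terminal site after {t:.3f} ns")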

  18. Social simulation theory: a framework to explain nurses' understanding of patients' experiences of ill-health.

    PubMed

    Nordby, Halvor

    2016-09-01

    A fundamental aim in caring practice is to understand patients' experiences of ill-health. These experiences have a qualitative content and cannot, unlike thoughts and beliefs with conceptual content, directly be expressed in words. Nurses therefore face a variety of interpretive challenges when they aim to understand patients' subjective perspectives on disease and illness. The article argues that theories on social simulation can shed light on how nurses manage to meet these challenges. The core assumption of social simulationism is that we do not understand other people by forming mental representations of how they think, but by putting ourselves in their situation in a more imaginative way. According to simulationism, any attempt to understand a patient's behavior is made on the basis of simulating what it is like to be that patient in the given context. The article argues that this approach to social interpretation can clarify how nurses manage to achieve aims of patient understanding, even when they have limited time to communicate and incomplete knowledge of patients' perspectives. Furthermore, simulation theory provides a normative framework for interpretation, in the sense that its theoretical assumptions constitute ideals for how nurses should seek to understand patients' experiences of illness. PMID:27198752

  19. GeNN: a code generation framework for accelerated brain simulations

    PubMed Central

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/. PMID:26740369

  20. Reservoir Modeling by Data Integration via Intermediate Spaces and Artificial Intelligence Tools in MPS Simulation Frameworks

    SciTech Connect

    Ahmadi, Rouhollah; Khamehchi, Ehsan

    2013-12-15

    Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques, which call for an approximate forward model (filter) to integrate secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.

  1. Engineering large-scale agent-based systems with consensus

    NASA Technical Reports Server (NTRS)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge-based agents (KBA) which engage in a collaborative problem-solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code, thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  2. Economic evaluations with agent-based modelling: an introduction.

    PubMed

    Chhatwal, Jagpreet; He, Tianhua

    2015-05-01

    Agent-based modelling (ABM) is a relatively new technique, which overcomes some of the limitations of other methods commonly used for economic evaluations. These limitations include linearity, homogeneity and stationarity. Agents in ABMs are autonomous entities, who interact with each other and with the environment. ABMs provide an inductive or 'bottom-up' approach, i.e. individual-level behaviours define system-level components. ABMs have the unique ability to capture emergent phenomena that cannot otherwise be predicted from individual-level interactions alone. In this tutorial, we discuss the basic concepts and important features of ABMs. We present a case study of an application of a simple ABM to evaluate the cost effectiveness of screening for an infectious disease. We also provide our model, which was developed using an open-source software program, NetLogo. We discuss software, resources, challenges and future research opportunities of ABMs for economic evaluations. PMID:25609398
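    Illustrative sketch (not the authors' NetLogo model): a deliberately small agent-based infection model run with and without a screening intervention, with costs tallied to give an incremental cost per infection averted. All epidemiological and cost parameters below are invented.

      import numpy as np

      def run(screening, rng, n=500, steps=60, beta=0.03, contacts=5):
          """Toy agent-based infection model; returns (cumulative infections, cost)."""
          infected = np.zeros(n, dtype=bool)
          infected[rng.choice(n, size=5, replace=False)] = True   # seed cases
          ever_infected = infected.copy()
          cost = 0.0
          for _ in range(steps):
              # Each infected agent meets a few random agents and may transmit.
              for _agent in np.flatnonzero(infected):
                  partners = rng.integers(n, size=contacts)
                  hit = rng.random(contacts) < beta
                  infected[partners[hit]] = True
              ever_infected |= infected
              if screening:                     # invented intervention: screen and cure
                  detected = infected & (rng.random(n) < 0.05)
                  infected[detected] = False
                  cost += 0.5 * n + 50.0 * detected.sum()   # invented unit costs
          return ever_infected.sum(), cost

      rng = np.random.default_rng(6)
      base_inf, base_cost = run(False, rng)
      scr_inf, scr_cost = run(True, rng)
      averted = base_inf - scr_inf
      if averted > 0:
          print("cost per infection averted:", (scr_cost - base_cost) / averted)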

  3. On agent-based modeling and computational social science

    PubMed Central

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed focusing on the role of generative theories, aiming at explaining phenomena by growing them. After a brief analysis of the major strengths of the field some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest for Computational Social Science (CSS) is focused upon, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM, reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of the CSS. PMID:25071642

  4. Hypercompetitive Environments: An Agent-based model approach

    NASA Astrophysics Data System (ADS)

    Dias, Manuel; Araújo, Tanya

    Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both from individual firms and management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows business strategies to be modeled from a bottom-up perspective, understood as resulting from repeated and local interaction of economic agents, without disregarding the consequences of the business strategies themselves for the individual behavior of enterprises, the emergence of interaction patterns between firms, and management environments. Agent-based models are the leading approach in this effort.

  5. On agent-based modeling and computational social science.

    PubMed

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed focusing on the role of generative theories, aiming at explaining phenomena by growing them. After a brief analysis of the major strengths of the field some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest for Computational Social Science (CSS) is focused upon, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM, reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of the CSS. PMID:25071642

  6. Statistical Agent Based Modelization of the Phenomenon of Drug Abuse

    NASA Astrophysics Data System (ADS)

    di Clemente, Riccardo; Pietronero, Luciano

    2012-07-01

    We introduce a statistical agent-based model to describe the phenomenon of drug abuse and its dynamical evolution at the individual and global level. The agents are heterogeneous with respect to their intrinsic inclination to drugs, their budget attitude and their social environment. The various levels of drug use were inspired by the professional description of the phenomenon, and this permits a direct comparison with all available data. We show that certain elements are very important in starting drug use, for example rare events in personal experience that allow the barrier to occasional drug use to be overcome. The analysis of how the system reacts to perturbations is very important for understanding its key elements, and it provides strategies for effective policy making. The present model represents a first step towards a realistic description of this phenomenon and can be easily generalized in various directions.

  7. Agent-Based Modeling of Noncommunicable Diseases: A Systematic Review

    PubMed Central

    Arah, Onyebuchi A.

    2015-01-01

    We reviewed the use of agent-based modeling (ABM), a systems science method, in understanding noncommunicable diseases (NCDs) and their public health risk factors. We systematically reviewed studies in PubMed, ScienceDirect, and Web of Sciences published from January 2003 to July 2014. We retrieved 22 relevant articles; each had an observational or interventional design. Physical activity and diet were the most-studied outcomes. Often, single agent types were modeled, and the environment was usually irrelevant to the studied outcome. Predictive validation and sensitivity analyses were most used to validate models. Although increasingly used to study NCDs, ABM remains underutilized and, where used, is suboptimally reported in public health studies. Its use in studying NCDs will benefit from clarified best practices and improved rigor to establish its usefulness and facilitate replication, interpretation, and application. PMID:25602871

  8. Linking agent-based models and stochastic models of financial markets.

    PubMed

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H Eugene

    2012-05-29

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that "fat" tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086
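    Illustrative sketch (a Cont-Bouchaud-style stand-in, not the authors' calibrated model): traders who share a strategy act as a single cluster, each cluster independently buys, sells, or stays inactive, and returns track net demand. With heterogeneous cluster sizes, the return distribution develops fatter tails than a Gaussian, which the excess kurtosis makes visible. Cluster sizes, activity probability, and step counts are invented.

      import numpy as np

      rng = np.random.default_rng(7)

      # Heavy-tailed (capped Zipf) cluster sizes stand in for groups of traders
      # that share the same technical strategy.
      sizes = np.minimum(rng.zipf(2.0, size=200), 500)
      n_traders = sizes.sum()

      n_steps, p_act = 20000, 0.05
      returns = np.empty(n_steps)
      for t in range(n_steps):
          u = rng.random(sizes.size)
          decision = np.where(u < p_act, 1, np.where(u < 2 * p_act, -1, 0))
          returns[t] = (sizes * decision).sum() / n_traders   # return ~ net demand

      excess_kurtosis = ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2 - 3
      print("excess kurtosis (0 for a Gaussian):", excess_kurtosis)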

  9. An agent-based model of group decision making in baboons.

    PubMed

    Sellers, W I; Hill, R A; Logan, B S

    2007-09-29

    We present an agent-based model of the key activities of a troop of chacma baboons (Papio hamadryas ursinus) based on the data collected at De Hoop Nature Reserve in South Africa. We analyse the predictions of the model in terms of how well it is able to duplicate the observed activity patterns of the animals and the relationship between the parameters that control the agent's decision procedure and the model's predictions. At the current stage of model development, we are able to show that across a wide range of decision parameter values, the baboons are able to achieve their energetic and social time requirements. The simulation results also show that decisions concerning movement (group action selection) have the greatest influence on the outcomes. Those cases where the model's predictions fail to agree with the observed activity patterns have highlighted key elements that were missing from the field data, and that would need to be collected in subsequent fieldwork. Based on our experience, we believe group decision making is a fertile field for future research, and agent-based modelling offers considerable scope for understanding group action selection. PMID:17428770

  10. Combining agent-based modeling and life cycle assessment for the evaluation of mobility policies.

    PubMed

    Florent, Querini; Enrico, Benetto

    2015-02-01

    This article presents agent-based modeling (ABM) as a novel approach for consequential life cycle assessment (C-LCA) of large scale policies, more specifically mobility-related policies. The approach is validated at the Luxembourgish level (as a first case study). The agent-based model simulates the car market (sales, use, and dismantling) of the population of users in the period 2013-2020, following the implementation of different mobility policies and available electric vehicles. The resulting changes in the car fleet composition as well as the hourly uses of the vehicles are then used to derive consistent LCA results, representing the consequences of the policies. Policies will have significant environmental consequences: when using ReCiPe2008, we observe a decrease of global warming, fossil depletion, acidification, ozone depletion, and photochemical ozone formation and an increase of metal depletion, ionizing radiations, marine eutrophication, and particulate matter formation. The study clearly shows that the extrapolation of LCA results for the circulating fleet at national scale following the introduction of the policies from the LCAs of single vehicles by simple up-scaling (using hypothetical deployment scenarios) would be flawed. The inventory has to be directly conducted at full scale and to this aim, ABM is indeed a promising approach, as it allows identifying and quantifying emerging effects while modeling the Life Cycle Inventory of vehicles at microscale through the concept of agents. PMID:25587896

  11. Linking agent-based models and stochastic models of financial markets

    PubMed Central

    Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene

    2012-01-01

    It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086

  12. A novel finite element framework for numerical simulation of fluidization processes and multiphase granular flow

    NASA Astrophysics Data System (ADS)

    Percival, James; Xie, Zhihua; Pavlidis, Dimitrios; Gomes, Jefferson; Pain, Christopher; Matar, Omar

    2013-11-01

    We present results from a new formulation of a numerical model for direct simulation of bed fluidization and multiphase granular flow. The model is based on a consistent application of continuous-discontinuous mixed control volume finite element methods applied to fully unstructured meshes. The unstructured mesh framework allows for both a mesh adaptive capability, modifying the computational geometry in order to bound the error in the numerical solution while maximizing computational efficiency, and a simple scripting interface embedded in the model which allows fast prototyping of correlation models and parameterizations in intercomparison experiments. The model is applied to standard test problems for fluidized beds. EPSRC Programme Grant EP/K003976/1.

  13. The structure of disaster resilience: a framework for simulations and policy recommendations

    NASA Astrophysics Data System (ADS)

    Edwards, J. H. Y.

    2015-04-01

    In this era of rapid climate change there is an urgent need for interdisciplinary collaboration and understanding in the study of what determines resistance to disasters and recovery speed. This paper is an economist's contribution to that effort. It traces the entrance of the word "resilience" from ecology into the social science literature on disasters, provides a formal economic definition of resilience that can be used in mathematical modeling, incorporates this definition into a multilevel model that suggests appropriate policy roles and targets at each level, and draws on the recent empirical literature on the economics of disaster, searching for policy handles that can stimulate higher resilience. On the whole it provides a framework for simulations and for formulating disaster resilience policies.

  14. The structure of disaster resilience: a framework for simulations and policy recommendations

    NASA Astrophysics Data System (ADS)

    Edwards, J. H. Y.

    2014-09-01

    In this era of rapid climate change there is an urgent need for interdisciplinary collaboration and understanding in the study of what determines resistance to disasters and recovery speed. This paper is an economist's contribution to that effort. It traces the entrance of the word "resilience" from ecology into the social science literature on disasters, provides a formal economic definition of resilience that can be used in mathematical modeling, incorporates this definition into a multilevel model that suggests appropriate policy roles and targets at each level, and draws on the recent empirical literature on the economics of disaster, searching for policy handles that can stimulate higher resilience. On the whole it provides a framework for simulations and for formulating disaster resilience policies.

  15. A framework to quantify uncertainty in simulations of oil transport in the ocean

    NASA Astrophysics Data System (ADS)

    Gonçalves, Rafael C.; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Chassignet, Eric; Knio, Omar M.

    2016-04-01

    An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331 member ensemble was used to construct a surrogate for the model which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable.
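    Illustrative sketch (not the DeepC workflow): the non-intrusive surrogate idea reduced to one uncertain input. The "model" below is an invented cheap function; a polynomial surrogate is fitted to an ensemble of its evaluations and then mined for statistics, mirroring in miniature how the framework above builds a polynomial chaos surrogate over several uncertain inputs.

      import numpy as np

      def model(x):
          # Invented stand-in for an expensive simulation with uncertain input x.
          return np.sin(2.0 * x) + 0.3 * x ** 2

      rng = np.random.default_rng(8)

      # Ensemble of model runs at sampled inputs (the "non-intrusive" part:
      # the model is only evaluated, never modified).
      x_train = rng.uniform(-1.0, 1.0, size=64)
      y_train = model(x_train)

      # Degree-6 polynomial surrogate (Legendre basis suits a uniform input).
      surrogate = np.polynomial.legendre.Legendre.fit(x_train, y_train, deg=6)

      # Sample the cheap surrogate for statistics instead of re-running the model.
      x_mc = rng.uniform(-1.0, 1.0, size=100000)
      y_mc = surrogate(x_mc)
      print("mean:", y_mc.mean(), "std:", y_mc.std())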

  16. Implementation and performance of FDPS: a framework for developing parallel particle simulation codes

    NASA Astrophysics Data System (ADS)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-08-01

    We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N^2) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10^7) to 300 ms (N = 10^9). These are currently limited by the time for the calculation of the domain decomposition and communication
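    For orientation, here is a simple unoptimized O(N^2) direct-summation gravity step of the kind the abstract says an FDPS user writes and the framework then parallelizes. FDPS itself is a C++ framework, so this NumPy version is only a stand-in for the amount of physics code involved, not FDPS's actual interface.

      import numpy as np

      def accelerations(pos, mass, eps=1e-3):
          """Direct-sum O(N^2) gravitational accelerations (G = 1, softened)."""
          acc = np.zeros_like(pos)
          for i in range(len(pos)):
              d = pos - pos[i]                               # vectors to all particles
              r2 = (d * d).sum(axis=1) + eps ** 2
              acc[i] = (mass[:, None] * d / r2[:, None] ** 1.5).sum(axis=0)
          return acc

      rng = np.random.default_rng(9)
      n = 256
      pos = rng.standard_normal((n, 3))
      vel = np.zeros((n, 3))
      mass = np.full(n, 1.0 / n)

      dt = 1e-3
      for _ in range(10):                                    # a few leapfrog steps
          vel += 0.5 * dt * accelerations(pos, mass)
          pos += dt * vel
          vel += 0.5 * dt * accelerations(pos, mass)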

  17. Implementation and performance of FDPS: a framework for developing parallel particle simulation codes

    NASA Astrophysics Data System (ADS)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-06-01

    We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N^2) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10^7) to 300 ms (N = 10^9). These are currently limited by the time for the calculation of the domain decomposition and communication

  18. Molecular dynamics simulation of framework flexibility effects on noble gas diffusion in HKUST-1 and ZIF-8

    SciTech Connect

    Parkes, Marie V.; Demir, Hakan; Teich-McGoldrick, Stephanie L.; Sholl, David S.; Greathouse, Jeffery A.; Allendorf, Mark D.

    2014-03-28

    Molecular dynamics simulations were used to investigate trends in noble gas (Ar, Kr, Xe) diffusion in the metal-organic frameworks HKUST-1 and ZIF-8. Diffusion occurs primarily through inter-cage jump events, with much greater diffusion of guest atoms in HKUST-1 compared to ZIF-8 due to the larger cage and window sizes in the former. We compare diffusion coefficients calculated for both rigid and flexible frameworks. For rigid framework simulations, in which the framework atoms were held at their crystallographic or geometry optimized coordinates, sometimes dramatic differences in guest diffusion were seen depending on the initial framework structure or the choice of framework force field parameters. When framework flexibility effects were included, argon and krypton diffusion increased significantly compared to rigid-framework simulations using general force field parameters. Additionally, for argon and krypton in ZIF-8, guest diffusion increased with loading, demonstrating that guest-guest interactions between cages enhance inter-cage diffusion. No inter-cage jump events were seen for xenon atoms in ZIF-8 regardless of force field or initial structure, and the loading dependence of xenon diffusion in HKUST-1 is different for rigid and flexible frameworks. Diffusion of krypton and xenon in HKUST-1 depends on two competing effects: the steric effect that decreases diffusion as loading increases, and the “small cage effect” that increases diffusion as loading increases. Finally, a detailed analysis of the window size in ZIF-8 reveals that the window increases beyond its normal size to permit passage of a (nominally) larger krypton atom.
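    Illustrative sketch (not from the study): how a self-diffusion coefficient is typically extracted from MD trajectories like those above, via the Einstein relation MSD(t) ~ 6 D t in the long-time limit. The "trajectory" here is a synthetic random walk with invented step sizes, and unwrapped coordinates are assumed.

      import numpy as np

      rng = np.random.default_rng(10)

      # Synthetic unwrapped trajectory: 50 guest atoms, 5000 frames, 3D random walk.
      n_atoms, n_frames, dt_ps = 50, 5000, 1.0
      steps = rng.standard_normal((n_frames, n_atoms, 3)) * 0.05   # nm per frame (invented)
      traj = np.cumsum(steps, axis=0)

      # Mean squared displacement from the initial frame, averaged over atoms.
      disp = traj - traj[0]
      msd = (disp ** 2).sum(axis=2).mean(axis=1)                   # nm^2 per frame

      # Einstein relation: fit the slope of MSD(t) over the latter half of the run.
      t = np.arange(n_frames) * dt_ps
      half = n_frames // 2
      slope = np.polyfit(t[half:], msd[half:], 1)[0]
      print("D =", slope / 6.0, "nm^2/ps")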

  19. Progress report for FACETS (Framework Application for Core-Edge Transport Simulations): C.S. SAP

    SciTech Connect

    Epperly, T W

    2008-10-01

    The mission of the Computer Science Scientific Application Partnership (C.S. SAP) at LLNL is to develop and apply leading-edge scientific component technology to FACETS software. Contributions from LLNL's fusion energy program staff towards the underlying physics modules are described in a separate report. FACETS uses component technology to combine selectively multiple physics and solver software modules written in different languages by different institutions together in an tightly-integrated, parallel computing framework for Tokamak reactor modeling. In the past fiscal year, the C.S. SAP has focused on two primary tasks: applying Babel to connect UEDGE into the FACETS framework through UEDGE's existing Python interface and developing a next generation componentization strategy for UEDGE which avoids the use of Python. The FACETS project uses Babel to solve its language interoperability challenges. Specific accomplishments for the year include: (1) Refined SIDL interfaces for UEDGE to meet satisfy the standard interfaces required by FACETS for all physics modules. This required consensus building between framework and UEDGE developers. (2) Wrote prototype C++ driver for UEDGE to demonstrate how UEDGE can be called from C++ using Babel. (3) Supported the FACETS project by adding new features to Babel such as release number tagging, porting to new machines, and adding new configuration options. Babel modifications were delivered to FACETS by testing and publishing development snapshots in the projects software repository. (4) Assisted Tech-X Corporation in testing and debugging of a high level build system for the complete FACETS tool chain--the complete list of third-party software libraries that FACETS depends on directly or indirectly (e.g., MPI, HDF5, PACT, etc.). (5) Designed and implemented a new approach to wrapping UEDGE as a FACETS component without requiring Python. To get simulation results as soon as possible, our initial connection from the FACETS

  20. A discrete element based simulation framework to investigate particulate spray deposition processes

    SciTech Connect

    Mukherjee, Debanjan Zohdi, Tarek I.

    2015-06-01

    This work presents a computer simulation framework based on discrete element method to analyze manufacturing processes that comprise a loosely flowing stream of particles in a carrier fluid being deposited on a target surface. The individual particulate dynamics under the combined action of particle collisions, fluid–particle interactions, particle–surface contact and adhesive interactions is simulated, and aggregated to obtain global system behavior. A model for deposition which incorporates the effect of surface energy, impact velocity and particle size, is developed. The fluid–particle interaction is modeled using appropriate spray nozzle gas velocity distributions and a one-way coupling between the phases. It is found that the particle response times and the release velocity distribution of particles have a combined effect on inter-particle collisions during the flow along the spray. It is also found that resolution of the particulate collisions close to the target surface plays an important role in characterizing the trends in the deposit pattern. Analysis of the deposit pattern using metrics defined from the particle distribution on the target surface is provided to characterize the deposition efficiency, deposit size, and scatter due to collisions.

  1. A framework for stochastic simulation of distribution practices for hotel reservations

    SciTech Connect

    Halkos, George E.; Tsilika, Kyriaki D.

    2015-03-10

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there were two competing policies for the reservation planning process up to 2003: reservations coming directly from customers, and reservation management relying on tour operators. Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model for the hotel reservation planning process that makes use of a Monte Carlo simulation method, since requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimates result from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and in evaluating the performance of the reservations management system.
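    Illustrative sketch (not the authors' calibrated model): a minimal Monte Carlo simulation of a season of reservation requests with capacity limits and occasional cancellations. The arrival rate, cancellation probability, room count, and season length are invented placeholders for the distributions the study estimates from historical data.

      import numpy as np

      rng = np.random.default_rng(11)

      season_days, rooms = 180, 60
      request_rate = 2.5          # mean reservation requests per day (invented)
      p_cancel = 0.12             # season-long probability a booking is cancelled (invented)

      def simulate_season():
          booked = 0
          for _ in range(season_days):
              requests = rng.poisson(request_rate)
              booked += min(requests, rooms - booked)       # accept while capacity remains
              # Spread the season-long cancellation probability over the days.
              booked -= rng.binomial(booked, p_cancel / season_days)
          return booked / rooms

      occupancy = np.array([simulate_season() for _ in range(2000)])
      print("mean end-of-season occupancy:", occupancy.mean(),
            "| 5th-95th percentile:", np.percentile(occupancy, [5, 95]))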

  2. A discrete element based simulation framework to investigate particulate spray deposition processes

    NASA Astrophysics Data System (ADS)

    Mukherjee, Debanjan; Zohdi, Tarek I.

    2015-06-01

    This work presents a computer simulation framework based on discrete element method to analyze manufacturing processes that comprise a loosely flowing stream of particles in a carrier fluid being deposited on a target surface. The individual particulate dynamics under the combined action of particle collisions, fluid-particle interactions, particle-surface contact and adhesive interactions is simulated, and aggregated to obtain global system behavior. A model for deposition which incorporates the effect of surface energy, impact velocity and particle size, is developed. The fluid-particle interaction is modeled using appropriate spray nozzle gas velocity distributions and a one-way coupling between the phases. It is found that the particle response times and the release velocity distribution of particles have a combined effect on inter-particle collisions during the flow along the spray. It is also found that resolution of the particulate collisions close to the target surface plays an important role in characterizing the trends in the deposit pattern. Analysis of the deposit pattern using metrics defined from the particle distribution on the target surface is provided to characterize the deposition efficiency, deposit size, and scatter due to collisions.

  3. A Semi-Automatic Coronary Artery Segmentation Framework Using Mechanical Simulation.

    PubMed

    Cai, Ken; Yang, Rongqian; Li, Lihua; Ou, Shanxing; Chen, Yuke; Dou, Jianhong

    2015-10-01

    Cardiovascular disease (CVD) is one of the biggest threats to human health. Early, quantitative diagnosis of CVD is important for extending lifespan and improving quality of life. Detecting coronary artery stenosis is central to such diagnosis, and to quantify the degree of stenosis the inner diameter of the coronary artery must be measured. To achieve such measurement, the coronary artery is segmented using a method based on morphology and the continuity between computed tomography image slices. A centerline extraction method based on mechanical simulation is proposed. This method derives a basic framework of the coronary artery by treating pixels of the artery image as mass points; tensile forces between these mass points draw the outer pixels toward the center. The centerline of the coronary artery is then outlined using a local line-fitting method. Finally, the nearest-point method is adopted to measure the inner diameter. Experimental results showed that the proposed methods can precisely extract the centerline of the coronary artery and accurately measure its inner diameter, thereby providing a basis for quantitative diagnosis of coronary artery stenosis. PMID:26310950

  4. A Systematic Framework for Molecular Dynamics Simulations of Protein Post-Translational Modifications

    PubMed Central

    Grandits, Melanie; Oostenbrink, Chris; Zagrovic, Bojan

    2013-01-01

    By directly affecting structure, dynamics and interaction networks of their targets, post-translational modifications (PTMs) of proteins play a key role in different cellular processes ranging from enzymatic activation to regulation of signal transduction to cell-cycle control. Despite the great importance of understanding how PTMs affect proteins at the atomistic level, a systematic framework for treating post-translationally modified amino acids by molecular dynamics (MD) simulations, a premier high-resolution computational biology tool, has never been developed. Here, we report and validate force field parameters (GROMOS 45a3 and 54a7) required to run and analyze MD simulations of more than 250 different types of enzymatic and non-enzymatic PTMs. The newly developed GROMOS 54a7 parameters in particular exhibit near chemical accuracy in matching experimentally measured hydration free energies (RMSE = 4.2 kJ/mol over the validation set). Using this tool, we quantitatively show that the majority of PTMs greatly alter the hydrophobicity and other physico-chemical properties of target amino acids, with the extent of change in many cases being comparable to the complete range spanned by native amino acids. PMID:23874192

  5. A systematic framework for molecular dynamics simulations of protein post-translational modifications.

    PubMed

    Petrov, Drazen; Margreitter, Christian; Grandits, Melanie; Oostenbrink, Chris; Zagrovic, Bojan

    2013-01-01

    By directly affecting structure, dynamics and interaction networks of their targets, post-translational modifications (PTMs) of proteins play a key role in different cellular processes ranging from enzymatic activation to regulation of signal transduction to cell-cycle control. Despite the great importance of understanding how PTMs affect proteins at the atomistic level, a systematic framework for treating post-translationally modified amino acids by molecular dynamics (MD) simulations, a premier high-resolution computational biology tool, has never been developed. Here, we report and validate force field parameters (GROMOS 45a3 and 54a7) required to run and analyze MD simulations of more than 250 different types of enzymatic and non-enzymatic PTMs. The newly developed GROMOS 54a7 parameters in particular exhibit near chemical accuracy in matching experimentally measured hydration free energies (RMSE=4.2 kJ/mol over the validation set). Using this tool, we quantitatively show that the majority of PTMs greatly alter the hydrophobicity and other physico-chemical properties of target amino acids, with the extent of change in many cases being comparable to the complete range spanned by native amino acids. PMID:23874192

  6. A heterogeneous and parallel computing framework for high-resolution hydrodynamic simulations

    NASA Astrophysics Data System (ADS)

    Smith, Luke; Liang, Qiuhua

    2015-04-01

    Shock-capturing hydrodynamic models are now widely applied in the context of flood risk assessment and forecasting, accurately capturing the behaviour of surface water over ground and within rivers. Such models are generally explicit in their numerical basis, and can be computationally expensive; this has prohibited full use of high-resolution topographic data for complex urban environments, now easily obtainable through airborne altimetric surveys (LiDAR). As processor clock speed advances have stagnated in recent years, further computational performance gains are largely dependent on the use of parallel processing. Heterogeneous computing architectures (e.g. graphics processing units or compute accelerator cards) provide a cost-effective means of achieving high throughput in cases where the same calculation is performed with a large input dataset. In recent years this technique has been applied successfully for flood risk mapping, such as within the national surface water flood risk assessment for the United Kingdom. We present a flexible software framework for hydrodynamic simulations across multiple processors of different architectures, within multiple computer systems, enabled using OpenCL and Message Passing Interface (MPI) libraries. A finite-volume Godunov-type scheme is implemented using the HLLC approach to solving the Riemann problem, with optional extension to second-order accuracy in space and time using the MUSCL-Hancock approach. The framework is successfully applied on personal computers and a small cluster to provide considerable improvements in performance. The most significant performance gains were achieved across two servers, each containing four NVIDIA GPUs, with a mix of K20, M2075 and C2050 devices. Advantages are found with respect to decreased parametric sensitivity, and thus in reducing uncertainty, for a major fluvial flood within a large catchment during 2005 in Carlisle, England. Simulations for the three-day event could be performed
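
    A minimal sketch of the Riemann-solver building block such a scheme rests on, using the simpler HLL approximation rather than full HLLC, for the one-dimensional shallow-water equations; the MUSCL-Hancock reconstruction, two-dimensional extension and OpenCL/MPI parallelisation described above are omitted.

      import numpy as np

      G = 9.81  # gravitational acceleration (m/s^2)

      def shallow_water_flux(h, hu):
          u = hu / h
          return np.array([hu, hu * u + 0.5 * G * h * h])

      def hll_flux(hL, huL, hR, huR):
          """HLL approximate Riemann solver flux for the 1D shallow-water equations."""
          uL, uR = huL / hL, huR / hR
          cL, cR = np.sqrt(G * hL), np.sqrt(G * hR)
          sL = min(uL - cL, uR - cR)                  # left and right wave-speed estimates
          sR = max(uL + cL, uR + cR)
          FL = shallow_water_flux(hL, huL)
          FR = shallow_water_flux(hR, huR)
          if sL >= 0.0:
              return FL
          if sR <= 0.0:
              return FR
          UL = np.array([hL, huL]); UR = np.array([hR, huR])
          return (sR * FL - sL * FR + sL * sR * (UR - UL)) / (sR - sL)

      # dam-break interface: deep still water on the left, shallow still water on the right
      print(hll_flux(hL=2.0, huL=0.0, hR=1.0, huR=0.0))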

  7. Evaluation of a 4D cone-beam CT reconstruction approach using a simulation framework.

    PubMed

    Hartl, Alexander; Yaniv, Ziv

    2009-01-01

    Current image-guided navigation systems for thoracic abdominal interventions utilize three dimensional (3D) images acquired at breath-hold. As a result they can only provide guidance at a specific point in the respiratory cycle. The intervention is thus performed in a gated manner, with the physician advancing only when the patient is at the same respiratory cycle in which the 3D image was acquired. To enable a more continuous workflow we propose to use 4D image data. We describe an approach to constructing a set of 4D images from a diagnostic CT acquired at breath-hold and a set of intraoperative cone-beam CT (CBCT) projection images acquired while the patient is freely breathing. Our approach is based on an initial reconstruction of a gated 4D CBCT data set. The 3D CBCT images for each respiratory phase are then non-rigidly registered to the diagnostic CT data. Finally the diagnostic CT is deformed based on the registration results, providing a 4D data set with sufficient quality for navigation purposes. In this work we evaluate the proposed reconstruction approach using a simulation framework. A 3D CBCT dataset of an anthropomorphic phantom is deformed using internal motion data acquired from an animal model to create a ground truth 4D CBCT image. Simulated projection images are then created from the 4D image and the known CBCT scan parameters. Finally, the original 3D CBCT and the simulated X-ray images are used as input to our reconstruction method. The resulting 4D data set is then compared to the known ground truth by normalized cross correlation(NCC). We show that the deformed diagnostic CTs are of better quality than the gated reconstructions with a mean NCC value of 0.94 versus a mean 0.81 for the reconstructions. PMID:19964143
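
    The evaluation metric used above is easy to reproduce; a short sketch of normalized cross-correlation between two equally shaped image volumes, assuming they are available as plain NumPy arrays.

      import numpy as np

      def ncc(a, b):
          """Normalized cross-correlation between two equally shaped images or volumes."""
          a = np.asarray(a, dtype=float).ravel()
          b = np.asarray(b, dtype=float).ravel()
          a = a - a.mean()
          b = b - b.mean()
          denom = np.linalg.norm(a) * np.linalg.norm(b)
          return float(np.dot(a, b) / denom) if denom > 0 else 0.0

      # identical volumes give 1.0; independent noise gives a value near 0
      vol = np.random.rand(32, 32, 32)
      print(ncc(vol, vol), ncc(vol, np.random.rand(32, 32, 32)))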

  8. Molecular dynamics simulation of framework flexibility effects on noble gas diffusion in HKUST-1 and ZIF-8

    DOE PAGESBeta

    Parkes, Marie V.; Demir, Hakan; Teich-McGoldrick, Stephanie L.; Sholl, David S.; Greathouse, Jeffery A.; Allendorf, Mark D.

    2014-03-28

    Molecular dynamics simulations were used to investigate trends in noble gas (Ar, Kr, Xe) diffusion in the metal-organic frameworks HKUST-1 and ZIF-8. Diffusion occurs primarily through inter-cage jump events, with much greater diffusion of guest atoms in HKUST-1 compared to ZIF-8 due to the larger cage and window sizes in the former. We compare diffusion coefficients calculated for both rigid and flexible frameworks. For rigid framework simulations, in which the framework atoms were held at their crystallographic or geometry optimized coordinates, sometimes dramatic differences in guest diffusion were seen depending on the initial framework structure or the choice of framework force field parameters. When framework flexibility effects were included, argon and krypton diffusion increased significantly compared to rigid-framework simulations using general force field parameters. Additionally, for argon and krypton in ZIF-8, guest diffusion increased with loading, demonstrating that guest-guest interactions between cages enhance inter-cage diffusion. No inter-cage jump events were seen for xenon atoms in ZIF-8 regardless of force field or initial structure, and the loading dependence of xenon diffusion in HKUST-1 is different for rigid and flexible frameworks. Diffusion of krypton and xenon in HKUST-1 depends on two competing effects: the steric effect that decreases diffusion as loading increases, and the “small cage effect” that increases diffusion as loading increases. Finally, a detailed analysis of the window size in ZIF-8 reveals that the window increases beyond its normal size to permit passage of a (nominally) larger krypton atom.
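
    Diffusion coefficients of the kind compared above are commonly extracted from the Einstein relation, D = MSD(t)/(6t) in the long-time limit; a minimal sketch, assuming an unwrapped guest-atom trajectory is already available as a NumPy array and using a single time origin for brevity.

      import numpy as np

      def diffusion_coefficient(positions, dt, fit_start=0.2, fit_end=0.8):
          """Estimate D from the slope of the mean squared displacement.

          positions : array of shape (n_frames, n_atoms, 3), unwrapped coordinates
          dt        : time between frames
          """
          disp = positions - positions[0]                   # displacement from t = 0 only
          msd = (disp ** 2).sum(axis=2).mean(axis=1)        # average over atoms
          t = np.arange(len(msd)) * dt
          i0, i1 = int(fit_start * len(msd)), int(fit_end * len(msd))
          slope = np.polyfit(t[i0:i1], msd[i0:i1], 1)[0]    # linear fit in the diffusive regime
          return slope / 6.0                                # Einstein relation in 3D

      # synthetic random walk as a stand-in for an MD trajectory
      rng = np.random.default_rng(1)
      traj = np.cumsum(rng.normal(0.0, 0.1, size=(5000, 50, 3)), axis=0)
      print("D estimate:", diffusion_coefficient(traj, dt=1.0))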

  9. Evolving Nutritional Strategies in the Presence of Competition: A Geometric Agent-Based Model

    PubMed Central

    Senior, Alistair M.; Charleston, Michael A.; Lihoreau, Mathieu; Buhl, Jerome; Raubenheimer, David; Simpson, Stephen J.

    2015-01-01

    Access to nutrients is a key factor governing development, reproduction and ultimately fitness. Within social groups, contest-competition can fundamentally affect nutrient access, potentially leading to reproductive asymmetry among individuals. Previously, agent-based models have been combined with the Geometric Framework of nutrition to provide insight into how nutrition and social interactions affect one another. Here, we expand this modelling approach by incorporating evolutionary algorithms to explore how contest-competition over nutrient acquisition might affect the evolution of animal nutritional strategies. Specifically, we model tolerance of nutrient excesses and deficits when ingesting nutritionally imbalanced foods, which we term ‘nutritional latitude’; a higher degree of nutritional latitude constitutes a higher tolerance of nutritional excess and deficit. Our results indicate that a transition between two alternative strategies occurs at moderate to high levels of competition. When competition is low, individuals display a low level of nutritional latitude and regularly switch foods in search of an optimum. When food is scarce and contest-competition is intense, high nutritional latitude appears optimal, and individuals continue to consume an imbalanced food for longer periods before attempting to switch to an alternative. However, the relative balance of nutrients within available foods also strongly influences at what levels of competition, if any, transitions between these two strategies occur. Our models imply that competition combined with reproductive skew in social groups can play a role in the evolution of diet breadth. We discuss how the integration of agent-based, nutritional and evolutionary modelling may be applied in future studies to further understand the evolution of nutritional strategies across social and ecological contexts. PMID:25815976

  10. The Evolution of Cooperation in Managed Groundwater Systems: An Agent-Based Modelling Approach

    NASA Astrophysics Data System (ADS)

    Castilla Rho, J. C.; Mariethoz, G.; Rojas, R. F.; Andersen, M. S.; Kelly, B. F.; Holley, C.

    2014-12-01

    Human interactions with groundwater systems often exhibit complex features that hinder the sustainable management of the resource. This leads to costly and persistent conflicts over groundwater at the catchment scale. One possible way to address these conflicts is by gaining a better understanding of how social and groundwater dynamics coevolve using agent-based models (ABM). Such models allow exploring 'bottom-up' solutions (i.e., self-organised governance systems), where the behaviour of individual agents (e.g., farmers) results in the emergence of mutual cooperation among groundwater users. There is significant empirical evidence indicating that this kind of 'bottom-up' approach may lead to more enduring and sustainable outcomes, compared to conventional 'top-down' strategies such as centralized control and water right schemes (Ostrom 1990). New modelling tools are needed to study these concepts systematically and efficiently. Our model uses a conceptual framework to study cooperation and the emergence of social norms as initially proposed by Axelrod (1986), which we adapted to groundwater management. We developed an ABM that integrates social mechanisms and the physics of subsurface flow. The model explicitly represents feedback between groundwater conditions and social dynamics, capturing the spatial structure of these interactions and the potential effects on cooperation levels in an agricultural setting. Using this model, we investigate a series of mechanisms that may trigger norms supporting cooperative strategies, which can be sustained and become stable over time. For example, farmers in a self-monitoring community can be more efficient at achieving the objective of sustainable groundwater use than government-imposed regulation. Our coupled model thus offers a platform for testing new schemes promoting cooperation and improved resource use, which can be used as a basis for policy design. Importantly, we hope to raise awareness of agent-based modelling as

  11. Estimating Impacts of Climate Change Policy on Land Use: An Agent-Based Modelling Approach

    PubMed Central

    2015-01-01

    Agriculture is important to New Zealand’s economy. Like other primary producers, New Zealand strives to increase agricultural output while maintaining environmental integrity. Using modelling to explore the economic, environmental and land use impacts of policy is critical to understanding the likely effects on the sector. Key deficiencies of existing land use and land cover change models are the lack of heterogeneity in farmers and their behaviour, the neglect of the role that social networks play in information transfer, and the abstraction of global and regional economic aspects within local-scale approaches. To resolve these issues we developed the Agent-based Rural Land Use New Zealand model. The model utilises a partial equilibrium economic model and an agent-based decision-making framework to explore how the cumulative effects of individual farmers’ decisions affect farm conversion and the resulting land use at a catchment scale. The model is intended to assist in the development of policy to shape agricultural land use intensification in New Zealand. We illustrate the model by examining the impact of a greenhouse gas price on farm-level land use, net revenue, and environmental indicators such as nutrient losses and soil erosion for key enterprises in the Hurunui and Waiau catchments of North Canterbury in New Zealand. Key results from the model show that farm net revenue is estimated to increase over time regardless of the greenhouse gas price. Net greenhouse gas emissions are estimated to decline over time, even under a no-GHG-price baseline, due to an expansion of forestry on low-productivity land. Higher GHG prices provide a greater net reduction of emissions. While social and geographic network effects have minimal impact on net revenue and environmental outputs for the catchment, they do affect the spatial arrangement of land use and in particular the clustering of enterprises. PMID:25996591

  12. An agent based multi-optional model for the diffusion of innovations

    NASA Astrophysics Data System (ADS)

    Laciana, Carlos E.; Oteiza-Aguirre, Nicolás

    2014-01-01

    We propose a model for the diffusion of several products competing in a common market, based on a generalization of the Ising model of statistical mechanics (the Potts model). Using an agent-based implementation we analyze two problems: (i) a three-option case, i.e. adopting product A, adopting product B, or non-adoption, and (ii) a four-option case, i.e. adopting product A, product B, both, or neither. In the first case we analyze a launching strategy for one of the two products, which delays its launch with the objective of competing through improvements. Market shares reached by each product are then estimated at market saturation. Finally, simulations are carried out with varying social network topologies and degrees of uncertainty and population homogeneity.
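
    A minimal sketch of the three-option case, assuming a logit (noisy best-response) update in which each agent weighs imitation of randomly assigned neighbours against an intrinsic preference for each option; the launch-delay scenario, network topology variations and calibration discussed above are left out, and all parameters are illustrative.

      import numpy as np

      # options: 0 = non-adoption, 1 = product A, 2 = product B
      def simulate(n=400, steps=200, beta=2.0, pref=(0.0, 0.4, 0.3), k=8, seed=0):
          """Potts-like agent-based diffusion of two competing products."""
          rng = np.random.default_rng(seed)
          state = np.zeros(n, dtype=int)                      # everyone starts as a non-adopter
          neighbours = [rng.choice(n, size=k, replace=False) for _ in range(n)]
          shares = []
          for _ in range(steps):
              for i in rng.permutation(n):
                  counts = np.bincount(state[neighbours[i]], minlength=3)
                  utility = np.asarray(pref) + counts / k     # intrinsic preference + social pressure
                  p = np.exp(beta * utility)
                  p /= p.sum()                                # logit choice probabilities
                  state[i] = rng.choice(3, p=p)
              shares.append(np.bincount(state, minlength=3) / n)
          return np.array(shares)

      final = simulate()[-1]
      print("final shares (none, A, B):", final)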

  13. Agent based model of effects of task allocation strategies in flat organizations

    NASA Astrophysics Data System (ADS)

    Sobkowicz, Pawel

    2016-09-01

    A common practice in many organizations is to pile the work on the best performers. It is easy to implement by the management and, despite the apparent injustice, appears to be working in many situations. In our work we present a simple agent based model, constructed to simulate this practice and to analyze conditions under which the overall efficiency of the organization (for example measured by the backlog of unresolved issues) breaks down, due to the cumulative effect of the individual overloads. The model confirms that the strategy mentioned above is, indeed, rational: it leads to better global results than an alternative one, using equal workload distribution among all workers. The presented analyses focus on the behavior of the organizations close to the limit of the maximum total throughput and provide results for the growth of the unprocessed backlog in several situations, as well as suggestions related to avoiding such buildup.
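
    A minimal sketch of the comparison described above, assuming Poisson issue arrivals and workers with fixed daily capacities; "pile_on_best" is a loose interpretation that routes issues to the best performer unless their queue is already far longer than everyone else's, while "equal" distributes round-robin. Parameter values are illustrative and chosen close to the maximum total throughput.

      import numpy as np

      def run(strategy, days=2000, arrival_rate=9.0, seed=0,
              capacity=(4.0, 3.0, 2.0, 1.0)):        # issues/day each worker can resolve
          """Compare 'pile work on the best performer' with equal round-robin allocation."""
          rng = np.random.default_rng(seed)
          cap = np.array(capacity)
          best = int(np.argmax(cap))
          queues = np.zeros(len(cap))
          backlog, rr = [], 0
          for _ in range(days):
              for _ in range(rng.poisson(arrival_rate)):       # new issues arrive
                  if strategy == "pile_on_best":
                      # route to the best performer unless their queue is clearly the longest
                      target = best if queues[best] <= queues.min() + 5 else int(np.argmin(queues))
                  else:                                        # equal round-robin allocation
                      target, rr = rr, (rr + 1) % len(cap)
                  queues[target] += 1
              queues = np.maximum(queues - cap, 0.0)           # each worker resolves up to capacity
              backlog.append(queues.sum())
          return np.mean(backlog[-200:])

      for s in ("pile_on_best", "equal"):
          print(s, "steady backlog:", round(run(s), 1))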

  14. Agent-Based Model Approach to Complex Phenomena in Real Economy

    NASA Astrophysics Data System (ADS)

    Iyetomi, H.; Aoyama, H.; Fujiwara, Y.; Ikeda, Y.; Souma, W.

    An agent-based model for firms' dynamics is developed. The model consists of firm agents with identical characteristic parameters and a bank agent. Dynamics of those agents are described by their balance sheets. Each firm tries to maximize its expected profit with possible risks in the market. Infinite growth of a firm directed by the "profit maximization" principle is suppressed by the concept of a "going concern". The possibility of bankruptcy of firms is also introduced by incorporating a retardation effect of information on firms' decisions. The firms, mutually interacting through the monopolistic bank, become heterogeneous in the course of temporal evolution. Statistical properties of firms' dynamics obtained by simulations based on the model are discussed in light of observations of the real economy.

  15. Combining Computational Fluid Dynamics and Agent-Based Modeling: A New Approach to Evacuation Planning

    PubMed Central

    Epstein, Joshua M.; Pankajakshan, Ramesh; Hammond, Ross A.

    2011-01-01

    We introduce a novel hybrid of two fields—Computational Fluid Dynamics (CFD) and Agent-Based Modeling (ABM)—as a powerful new technique for urban evacuation planning. CFD is a predominant technique for modeling airborne transport of contaminants, while ABM is a powerful approach for modeling social dynamics in populations of adaptive individuals. The hybrid CFD-ABM method is capable of simulating how large, spatially-distributed populations might respond to a physically realistic contaminant plume. We demonstrate the overall feasibility of CFD-ABM evacuation design, using the case of a hypothetical aerosol release in Los Angeles to explore potential effectiveness of various policy regimes. We conclude by arguing that this new approach can be powerfully applied to arbitrary population centers, offering an unprecedented preparedness and catastrophic event response tool. PMID:21687788

  16. Endogenizing geopolitical boundaries with agent-based modeling.

    PubMed

    Cederman, Lars-Erik

    2002-05-14

    Agent-based modeling promises to overcome the reification of actors. Whereas this common, but limiting, assumption makes a lot of sense during periods characterized by stable actor boundaries, other historical junctures, such as the end of the Cold War, exhibit far-reaching and swift transformations of actors' spatial and organizational existence. Moreover, because actors cannot be assumed to remain constant in the long run, analysis of macrohistorical processes virtually always requires "sociational" endogenization. This paper presents a series of computational models, implemented with the software package REPAST, which trace complex macrohistorical transformations of actors, be they hierarchically organized as relational networks or as collections of symbolic categories. With respect to the former, dynamic networks featuring emergent compound actors with agent compartments represented in a spatial grid capture organizational domination of the territorial state. In addition, models of "tagged" social processes allow the analyst to show how democratic states predicate their behavior on categorical traits. Finally, categorical schemata that select out politically relevant cultural traits in ethnic landscapes formalize a constructivist notion of national identity in conformance with the qualitative literature on nationalism. This "finite-agent method", representing both states and nations as higher-level structures superimposed on a lower-level grid of primitive agents or cultural traits, avoids reification of agency. Furthermore, it opens the door to explicit analysis of entity processes, such as the integration and disintegration of actors as well as boundary transformations. PMID:12011409

  17. Endogenizing geopolitical boundaries with agent-based modeling

    PubMed Central

    Cederman, Lars-Erik

    2002-01-01

    Agent-based modeling promises to overcome the reification of actors. Whereas this common, but limiting, assumption makes a lot of sense during periods characterized by stable actor boundaries, other historical junctures, such as the end of the Cold War, exhibit far-reaching and swift transformations of actors' spatial and organizational existence. Moreover, because actors cannot be assumed to remain constant in the long run, analysis of macrohistorical processes virtually always requires “sociational” endogenization. This paper presents a series of computational models, implemented with the software package REPAST, which trace complex macrohistorical transformations of actors, be they hierarchically organized as relational networks or as collections of symbolic categories. With respect to the former, dynamic networks featuring emergent compound actors with agent compartments represented in a spatial grid capture organizational domination of the territorial state. In addition, models of “tagged” social processes allow the analyst to show how democratic states predicate their behavior on categorical traits. Finally, categorical schemata that select out politically relevant cultural traits in ethnic landscapes formalize a constructivist notion of national identity in conformance with the qualitative literature on nationalism. This “finite-agent method”, representing both states and nations as higher-level structures superimposed on a lower-level grid of primitive agents or cultural traits, avoids reification of agency. Furthermore, it opens the door to explicit analysis of entity processes, such as the integration and disintegration of actors as well as boundary transformations. PMID:12011409

  18. Dynamic calibration of agent-based models using data assimilation.

    PubMed

    Ward, Jonathan A; Evans, Andrew J; Malleson, Nicolas S

    2016-04-01

    A widespread approach to investigating the dynamical behaviour of complex social systems is via agent-based models (ABMs). In this paper, we describe how such models can be dynamically calibrated using the ensemble Kalman filter (EnKF), a standard method of data assimilation. Our goal is twofold. First, we want to present the EnKF in a simple setting for the benefit of ABM practitioners who are unfamiliar with it. Second, we want to illustrate to data assimilation experts the value of using such methods in the context of ABMs of complex social systems and the new challenges these types of model present. We work towards these goals within the context of a simple question of practical value: how many people are there in Leeds (or any other major city) right now? We build a hierarchy of exemplar models that we use to demonstrate how to apply the EnKF and calibrate these using open data of footfall counts in Leeds. PMID:27152214
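
    The analysis step of the ensemble Kalman filter itself is compact; a minimal sketch of the stochastic (perturbed-observation) update applied to an ensemble of model states, assuming a linear observation operator H and using a one-dimensional "people in the city" toy state as the example.

      import numpy as np

      def enkf_update(ensemble, H, y, obs_cov, rng):
          """Stochastic ensemble Kalman filter analysis step.

          ensemble : (n_members, n_state) array of model states
          H        : (n_obs, n_state) linear observation operator
          y        : (n_obs,) observation vector
          obs_cov  : (n_obs, n_obs) observation error covariance
          """
          X = ensemble
          n = X.shape[0]
          A = X - X.mean(axis=0)                         # state anomalies
          HX = X @ H.T
          HA = HX - HX.mean(axis=0)                      # anomalies in observation space
          P_hh = HA.T @ HA / (n - 1) + obs_cov           # innovation covariance
          P_xh = A.T @ HA / (n - 1)                      # state-observation cross covariance
          K = P_xh @ np.linalg.inv(P_hh)                 # Kalman gain
          y_pert = y + rng.multivariate_normal(np.zeros(len(y)), obs_cov, size=n)
          return X + (y_pert - HX) @ K.T                 # updated ensemble

      rng = np.random.default_rng(0)
      ens = rng.normal(100.0, 20.0, size=(50, 1))        # prior guesses of, e.g., a footfall count
      H = np.array([[1.0]])
      updated = enkf_update(ens, H, y=np.array([130.0]), obs_cov=np.array([[25.0]]), rng=rng)
      print(ens.mean(), "->", updated.mean())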

  19. Agent-based modelling of consumer energy choices

    NASA Astrophysics Data System (ADS)

    Rai, Varun; Henry, Adam Douglas

    2016-06-01

    Strategies to mitigate global climate change should be grounded in a rigorous understanding of energy systems, particularly the factors that drive energy demand. Agent-based modelling (ABM) is a powerful tool for representing the complexities of energy demand, such as social interactions and spatial constraints. Unlike other approaches for modelling energy demand, ABM is not limited to studying perfectly rational agents or to abstracting micro details into system-level equations. Instead, ABM provides the ability to represent behaviours of energy consumers, such as individual households, using a range of theories, and to examine how the interaction of heterogeneous agents at the micro-level produces macro outcomes of importance to the global climate, such as the adoption of low-carbon behaviours and technologies over space and time. We provide an overview of ABM work in the area of consumer energy choices, with a focus on identifying specific ways in which ABM can improve understanding of both fundamental scientific and applied aspects of the demand side of energy to aid the design of better policies and programmes. Future research needs for improving the practice of ABM to better understand energy demand are also discussed.

  20. Domination and evolution in agent based model of an economy

    NASA Astrophysics Data System (ADS)

    Kazmi, Syed S.

    We introduce an agent-based model of a pure exchange economy and of a simple economy that includes production, consumption and distribution. Markets are described by Edgeworth exchange in both models. Trades are binary bilateral trades at prices that are set in each trade. We find that prices converge over time to a value that differs from the standard equilibrium value given by the Walrasian tatonnement fiction. The average price, and the distribution of wealth, depends on the degree of domination (persuasive power) we introduce, based on differentials in trading "leverage" due to wealth differences. The full economy model is allowed to evolve by replacing agents that do not survive with agents having random properties. We find that, depending on the average productivity compared to the average consumption, very different kinds of behavior emerge. The economy as a whole reaches a steady state through the population adapting to the conditions of productivity and consumption. Correlations develop in the population between what would otherwise be, for each individual, a random assignment of productivity, labor power, wealth, and preferences. The population adapts to the economic environment through the development of these correlations and without any learning process. We see signs of emerging social structure as a result of the necessity of survival.
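
    A minimal sketch of binary bilateral Edgeworth exchange, assuming Cobb-Douglas preferences and per-trade prices drawn uniformly between the two traders' marginal rates of substitution; the domination (persuasive power), production and evolutionary replacement mechanisms of the model above are omitted, and all parameters are illustrative.

      import numpy as np

      def exchange(n=100, rounds=20000, alpha=0.5, dx=0.05, seed=0):
          """Random pairwise Edgeworth trades of good x against good y at per-trade prices."""
          rng = np.random.default_rng(seed)
          x = rng.uniform(1.0, 10.0, n)                  # endowments of good x
          y = rng.uniform(1.0, 10.0, n)                  # endowments of good y
          prices = []
          for _ in range(rounds):
              i, j = rng.choice(n, size=2, replace=False)
              mrs = lambda k: (alpha / (1.0 - alpha)) * y[k] / x[k]  # marginal rate of substitution
              buyer, seller = (i, j) if mrs(i) > mrs(j) else (j, i)
              lo, hi = mrs(seller), mrs(buyer)
              if hi - lo < 1e-9:
                  continue                               # no mutually beneficial trade
              p = rng.uniform(lo, hi)                    # price set inside the bargaining range
              q = min(dx, 0.9 * x[seller], 0.9 * y[buyer] / p)  # keep holdings strictly positive
              x[seller] -= q; y[seller] += p * q
              x[buyer] += q;  y[buyer] -= p * q
              prices.append(p)
          return np.mean(prices[:2000]), np.mean(prices[-2000:])

      early, late = exchange()
      print("mean price, first trades:", round(early, 3), " last trades:", round(late, 3))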

  1. Agent-based copyright protection architecture for online electronic publishing

    NASA Astrophysics Data System (ADS)

    Yi, Xun; Kitazawa, S.; Okamoto, Ejii; Wang, Xiao F.; Lam, KwokYan; Tu, S.

    1999-04-01

    Electronic publishing faces one major technical and economic challenge: how to prevent individuals from easily copying and illegally distributing electronic documents. Conventional cryptographic systems permit only valid key-holders access to encrypted data, but once such data is decrypted there is no way to track its reproduction or retransmission. They therefore provide little protection against data piracy, in which a publisher is confronted with unauthorized reproduction of information. In this paper, we explore the use of intelligent agents, digital watermarking and cryptographic techniques to discourage the distribution of illegal electronic copies, and propose an agent-based strategy to protect the copyright of online electronic publishing. It is, in fact, impossible to develop an absolutely secure copyright protection architecture for online electronic publishing that can prevent a malicious customer from spending a great deal of effort analyzing the software and finally obtaining the plaintext of the encrypted electronic document. Our work therefore aims at making the cost of analyzing the agent and removing the watermark much greater than the value of the electronic document itself.

  2. Agent-based reasoning for distributed multi-INT analysis

    NASA Astrophysics Data System (ADS)

    Inchiosa, Mario E.; Parker, Miles T.; Perline, Richard

    2006-05-01

    Fully exploiting the intelligence community's exponentially growing data resources will require computational approaches differing radically from those currently available. Intelligence data is massive, distributed, and heterogeneous. Conventional approaches requiring highly structured and centralized data will not meet this challenge. We report on a new approach, Agent-Based Reasoning (ABR). In NIST evaluations, the use of ABR software tripled analysts' solution speed, doubled accuracy, and halved perceived difficulty. ABR makes use of populations of fine-grained, locally interacting agents that collectively reason about intelligence scenarios in a self-organizing, "bottom-up" process akin to those found in biological and other complex systems. Reproduction rules allow agents to make inferences from multi-INT data, while movement rules organize information and optimize reasoning. Complementary deterministic and stochastic agent behaviors enhance reasoning power and flexibility. Agent interaction via small-world networks - such as are found in nervous systems, social networks, and power distribution grids - dramatically increases the rate of discovering intelligence fragments that usefully connect to yield new inferences. Small-world networks also support the distributed processing necessary to address intelligence community data challenges. In addition, we have found that ABR pre-processing can boost the performance of commercial text clustering software. Finally, we have demonstrated interoperability with Knowledge Engineering systems and seen that reasoning across diverse data sources can be a rich source of inferences.
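
    The small-world effect invoked above is easy to illustrate; a short sketch comparing characteristic path lengths of a regular ring lattice and a slightly rewired Watts-Strogatz graph, using the networkx package (which the ABR system itself is not stated to use).

      import networkx as nx

      n, k = 1000, 10                                    # agents and neighbours per agent
      regular = nx.watts_strogatz_graph(n, k, p=0.0, seed=1)          # ring lattice, no rewiring
      small_world = nx.connected_watts_strogatz_graph(n, k, p=0.05, seed=1)  # 5% of edges rewired

      # a few random shortcuts collapse the average distance between agents,
      # which is why information fragments connect far faster on the rewired graph
      print("regular lattice  :", nx.average_shortest_path_length(regular))
      print("small-world graph:", nx.average_shortest_path_length(small_world))
      print("clustering kept  :", nx.average_clustering(small_world))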

  3. Dynamic calibration of agent-based models using data assimilation

    PubMed Central

    Ward, Jonathan A.; Evans, Andrew J.; Malleson, Nicolas S.

    2016-01-01

    A widespread approach to investigating the dynamical behaviour of complex social systems is via agent-based models (ABMs). In this paper, we describe how such models can be dynamically calibrated using the ensemble Kalman filter (EnKF), a standard method of data assimilation. Our goal is twofold. First, we want to present the EnKF in a simple setting for the benefit of ABM practitioners who are unfamiliar with it. Second, we want to illustrate to data assimilation experts the value of using such methods in the context of ABMs of complex social systems and the new challenges these types of model present. We work towards these goals within the context of a simple question of practical value: how many people are there in Leeds (or any other major city) right now? We build a hierarchy of exemplar models that we use to demonstrate how to apply the EnKF and calibrate these using open data of footfall counts in Leeds. PMID:27152214

  4. A Framework for the Abstraction of Mesoscale Modeling for Weather Simulation

    NASA Astrophysics Data System (ADS)

    Limpasuvan, V.; Ujcich, B. E.

    2009-12-01

    Widely disseminated weather forecast results (e.g. from various national centers and private companies) are useful for typical users in gauging future atmospheric disturbances. However, these canonical forecasts may not adequately meet the needs of end-users in various scientific fields, since a predetermined model, as structured by the model administrator, produces these forecasts. To perform his/her own successful forecasts, a user faces a steep learning curve involving the collection of initial condition data (e.g. radar, satellite, and reanalyses) and the operation of a suitable model (and associated software/computing). In this project, we develop an intermediate (prototypical) software framework and a web-based front-end interface that allow for the abstraction of an advanced weather model upon which the end-user can perform customizable forecasts and analyses. Having such an accessible front-end interface for a weather model can benefit educational programs at the secondary school and undergraduate level, scientific research in fields like fluid dynamics and meteorology, and the general public. In all cases, our project allows the user to generate a localized domain of choice, run the desired forecast on a remote high-performance computer cluster, and visually inspect the results. For instance, an undergraduate science curriculum could incorporate the resulting weather forecasts performed under this project in laboratory exercises. Scientific researchers and graduate students would be able to readily adjust key prognostic variables in the simulation within this project’s framework. The general public within the contiguous United States could also run a simplified version of the project’s software with adjustments in forecast clarity (spatial resolution) and region size (domain). Special cases of general interest, in which a detailed forecast may be required, would be over areas of possible strong weather activity.

  5. Evolutionary Agent-based Models to design distributed water management strategies

    NASA Astrophysics Data System (ADS)

    Giuliani, M.; Castelletti, A.; Reed, P. M.

    2012-12-01

    There is growing awareness in the scientific community that the traditional centralized approach to water resources management, as described in much of the water resources literature, provides an ideal optimal solution, which is certainly useful to quantify the best physically achievable performance, but is generally inapplicable. Most real world water resources management problems are indeed characterized by the presence of multiple, distributed and institutionally-independent decision-makers. Multi-Agent Systems provide a potentially more realistic alternative framework to model multiple and self-interested decision-makers in a credible context. Each decision-maker can be represented by an agent who, being self-interested, acts according to local objective functions and produces negative externalities on system level objectives. Different levels of coordination can potentially be included in the framework by designing coordination mechanisms to drive the current decision-making structure toward the global system efficiency. Yet, the identification of effective coordination strategies can be particularly complex in modern institutional contexts and current practice is dependent on largely ad-hoc coordination strategies. In this work we propose a novel Evolutionary Agent-based Modeling (EAM) framework that enables a mapping of fully uncoordinated and centrally coordinated solutions into their relative "many-objective" tradeoffs using multiobjective evolutionary algorithms. Then, by analysing the conflicts between local individual agent and global system level objectives it is possible to more fully understand the causes, consequences, and potential solution strategies for coordination failures. Game-theoretic criteria have value for identifying the most interesting alternatives from a policy making point of view as well as the coordination mechanisms that can be applied to obtain these interesting solutions. The proposed approach is numerically tested on a

  6. A MONTE-CARLO SIMULATION FRAMEWORK FOR JOINT OPTIMISATION OF IMAGE QUALITY AND PATIENT DOSE IN DIGITAL PAEDIATRIC RADIOGRAPHY.

    PubMed

    Menser, Bernd; Manke, Dirk; Mentrup, Detlef; Neitzel, Ulrich

    2016-06-01

    In paediatric radiography, according to the as low as reasonably achievable (ALARA) principle, the imaging task should be performed with the lowest possible radiation dose. This paper describes a Monte-Carlo simulation framework for dose optimisation of imaging parameters in digital paediatric radiography. Patient models with high spatial resolution and organ segmentation enable the simultaneous evaluation of image quality and patient dose on the same simulated radiographic examination. The accuracy of the image simulation is analysed by comparing simulated and acquired images of technical phantoms. As a first application example, the framework is applied to optimise tube voltage and pre-filtration in newborn chest radiography. At equal patient dose, the highest CNR is obtained with low-kV settings in combination with copper filtration. PMID:26628612

  7. Multi-agent based control of large-scale complex systems employing distributed dynamic inference engine

    NASA Astrophysics Data System (ADS)

    Zhang, Daili

    Increasing societal demand for automation has led to considerable efforts to control large-scale complex systems, especially in the area of autonomous intelligent control methods. The control system of a large-scale complex system needs to satisfy four system level requirements: robustness, flexibility, reusability, and scalability. Corresponding to the four system level requirements, there arise four major challenges. First, it is difficult to get accurate and complete information. Second, the system may be physically highly distributed. Third, the system evolves very quickly. Fourth, emergent global behaviors of the system can be caused by small disturbances at the component level. The Multi-Agent Based Control (MABC) method as an implementation of distributed intelligent control has been the focus of research since the 1970s, in an effort to solve the above-mentioned problems in controlling large-scale complex systems. However, to the author's best knowledge, all MABC systems for large-scale complex systems with significant uncertainties are problem-specific and thus difficult to extend to other domains or larger systems. This situation is partly due to the control architecture of multiple agents being determined by agent to agent coupling and interaction mechanisms. Therefore, the research objective of this dissertation is to develop a comprehensive, generalized framework for the control system design of general large-scale complex systems with significant uncertainties, with the focus on distributed control architecture design and distributed inference engine design. A Hybrid Multi-Agent Based Control (HyMABC) architecture is proposed by combining hierarchical control architecture and module control architecture with logical replication rings. First, it decomposes a complex system hierarchically; second, it combines the components in the same level as a module, and then designs common interfaces for all of the components in the same module; third, replications

  8. Simulating mesoscale coastal evolution for decadal coastal management: A new framework integrating multiple, complementary modelling approaches

    NASA Astrophysics Data System (ADS)

    van Maanen, Barend; Nicholls, Robert J.; French, Jon R.; Barkwith, Andrew; Bonaldo, Davide; Burningham, Helene; Brad Murray, A.; Payo, Andres; Sutherland, James; Thornhill, Gillian; Townend, Ian H.; van der Wegen, Mick; Walkden, Mike J. A.

    2016-03-01

    Coastal and shoreline management increasingly needs to consider morphological change occurring at decadal to centennial timescales, especially that related to climate change and sea-level rise. This requires the development of morphological models operating at a mesoscale, defined by time and length scales of the order 10^1 to 10^2 years and 10^1 to 10^2 km. So-called 'reduced complexity' models that represent critical processes at scales not much smaller than the primary scale of interest, and are regulated by capturing the critical feedbacks that govern landform behaviour, are proving effective as a means of exploring emergent coastal behaviour at a landscape scale. Such models tend to be computationally efficient and are thus easily applied within a probabilistic framework. At the same time, reductionist models, built upon a more detailed description of hydrodynamic and sediment transport processes, are capable of application at increasingly broad spatial and temporal scales. More qualitative modelling approaches are also emerging that can guide the development and deployment of quantitative models, and these can be supplemented by varied data-driven modelling approaches that can achieve new explanatory insights from observational datasets. Such disparate approaches have hitherto been pursued largely in isolation by mutually exclusive modelling communities. Brought together, they have the potential to facilitate a step change in our ability to simulate the evolution of coastal morphology at scales that are most relevant to managing erosion and flood risk. Here, we advocate and outline a new integrated modelling framework that deploys coupled mesoscale reduced complexity models, reductionist coastal area models, data-driven approaches, and qualitative conceptual models. Integration of these heterogeneous approaches gives rise to model compositions that can potentially resolve decadal- to centennial-scale behaviour of diverse coupled open coast, estuary and inner

  9. The effect of casting and masticatory simulation on strain and misfit of implant-supported metal frameworks.

    PubMed

    Bhering, Cláudia Lopes Brilhante; Marques, Isabella da Silva Vieira; Takahashi, Jessica Mie Ferreira Koyama; Barão, Valentim Adelino Ricardo; Consani, Rafael Leonardo Xediek; Mesquita, Marcelo Ferraz

    2016-05-01

    The influence of casting and masticatory simulation on marginal misfit and strain in multiple implant-supported prostheses was evaluated. Three-unit screw-retained fixed dental prosthesis (FDP) and screw-retained full-arch fixed dental prosthesis (FAFDP) frameworks were made using calcinable or overcasted cylinders on conical dental implant abutments. Four groups were obtained according to cylinder and prosthesis type (n=10). Frameworks were cast in CoCr alloy and subjected to strain gauge analyses and marginal misfit measurements before and after 10^6 mechanical cycles (2 Hz/280 N). Results were submitted to ANOVA, Tukey's HSD and Pearson correlation tests (α=0.05). No difference in misfit was found among groups or times (p>0.05). Overcasted frameworks showed higher strain than the calcinable ones (FDP: initial p=0.0047, final p=0.0004; FAFDP: initial p=0.0476, final p=0.0115). The masticatory simulation did not influence strain (p>0.05). No correlation was observed between strain and misfit (r=0.24; p>0.05). In conclusion, the marginal misfit of the overcasted full-arch frameworks exceeded clinically acceptable values, indicating that overcasting is not an ideal method for full-arch prostheses. Overcasted frameworks generate higher strain on the system. The masticatory simulation had no influence on the misfit and strain of multiple prostheses. PMID:26952480

  10. A model framework to represent plant-physiology and rhizosphere processes in soil profile simulation models

    NASA Astrophysics Data System (ADS)

    Vanderborght, J.; Javaux, M.; Couvreur, V.; Schröder, N.; Huber, K.; Abesha, B.; Schnepf, A.; Vereecken, H.

    2013-12-01

    of plant transpiration by root-zone-produced plant hormones, and (iv) the impact of salt accumulation at the soil-root interface on root water uptake. We further propose a framework for how this process knowledge could be implemented in root-zone simulation models that do not resolve small-scale processes.

  11. PLANNING AND RESPONSE IN THE AFTERMATH OF A LARGE CRISIS: AN AGENT-BASED INFORMATICS FRAMEWORK*

    PubMed Central

    Barrett, Christopher; Bisset, Keith; Chandan, Shridhar; Chen, Jiangzhuo; Chungbaek, Youngyun; Eubank, Stephen; Evrenosoğlu, Yaman; Lewis, Bryan; Lum, Kristian; Marathe, Achla; Marathe, Madhav; Mortveit, Henning; Parikh, Nidhi; Phadke, Arun; Reed, Jeffrey; Rivers, Caitlin; Saha, Sudip; Stretz, Paula; Swarup, Samarth; Thorp, James; Vullikanti, Anil; Xie, Dawen

    2014-01-01

    We present a synthetic information and modeling environment that can allow policy makers to study various counter-factual experiments in the event of a large human-initiated crisis. The specific scenario we consider is a ground detonation caused by an improvised nuclear device in a large urban region. In contrast to earlier work in this area that focuses largely on the prompt effects on human health and injury, we focus on co-evolution of individual and collective behavior and its interaction with the differentially damaged infrastructure. This allows us to study short term secondary and tertiary effects. The present environment is suitable for studying the dynamical outcomes over a two week period after the initial blast. A novel computing and data processing architecture is described; the architecture allows us to represent multiple co-evolving infrastructures and social networks at a highly resolved temporal, spatial, and individual scale. The representation allows us to study the emergent behavior of individuals as well as specific strategies to reduce casualties and injuries that exploit the spatial and temporal nature of the secondary and tertiary effects. A number of important conclusions are obtained using the modeling environment. For example, the studies decisively show that deploying ad hoc communication networks to reach individuals in the affected area is likely to have a significant impact on the overall casualties and injuries. PMID:25580055

  12. A Framework for Model-Based Inquiry through Agent-Based Programming

    ERIC Educational Resources Information Center

    Xiang, Lin; Passmore, Cynthia

    2015-01-01

    There has been increased recognition in the past decades that model-based inquiry (MBI) is a promising approach for cultivating deep understandings by helping students unite phenomena and underlying mechanisms. Although multiple technology tools have been used to improve the effectiveness of MBI, there are not enough detailed examinations of how…

  13. A Computational Framework for Fluid-Solid-Growth Modeling in Cardiovascular Simulations

    PubMed Central

    Figueroa, C. Alberto; Baek, Seungik; Taylor, Charles A.; Humphrey, Jay D.

    2009-01-01

    It is now well known that altered hemodynamics can alter the genes that are expressed by diverse vascular cells, which in turn plays a critical role in the ability of a blood vessel to adapt to new biomechanical conditions and governs the natural history of the progression of many types of disease. Fortunately, when taken together, recent advances in molecular and cell biology, in vivo medical imaging, biomechanics, computational mechanics, and computing power provide an unprecedented opportunity to begin to understand such hemodynamic effects on vascular biology, physiology, and pathophysiology. Moreover, with increased understanding will come the promise of improved designs for medical devices and clinical interventions. The goal of this paper, therefore, is to present a new computational framework that brings together recent advances in computational biosolid and biofluid mechanics that can exploit new information on the biology of vascular growth and remodeling as well as in vivo patient-specific medical imaging so as to enable realistic simulations of vascular adaptations, disease progression, and clinical intervention. PMID:20160923

  14. Numerical simulation of the Moon's rotation in a rigorous relativistic framework

    NASA Astrophysics Data System (ADS)

    Wang, Zai; Han, Wen-Biao; Tang, Kai; Tao, Jin-He

    2016-06-01

    This paper describes a numerical simulation of the rigid rotation of the Moon in a relativistic framework. Following a resolution passed by the International Astronomical Union (IAU) in 2000, we construct a kinematically non-rotating reference system named the Selenocentric Celestial Reference System (SCRS) and give the time transformation between the Selenocentric Coordinate Time (TCS) and Barycentric Coordinate Time (TCB). The post-Newtonian equations of the Moon's rotation are written in the SCRS, and they are integrated numerically. We calculate the correction to the rotation of the Moon due to total relativistic torque which includes post-Newtonian and gravitomagnetic torques as well as geodetic precession. We find two dominant periods associated with this correction: 18.6 yr and 80.1 yr. In addition, the precession of the rotating axes caused by fourth-degree and fifth-degree harmonics of the Moon is also analyzed, and we have found that the main periods of this precession are 27.3 d, 2.9 yr, 18.6 yr and 80.1 yr.

  15. Autogenerator-based modelling framework for development of strategic games simulations: rational pigs game extended.

    PubMed

    Fabac, Robert; Radošević, Danijel; Magdalenić, Ivan

    2014-01-01

    When considering strategic games from a conceptual perspective that focuses on the questions of participants' decision-making rationality, the issues of modelling and simulation are rarely discussed. The well-known Rational Pigs matrix game has been analyzed relatively intensively in terms of reassessing the logic of two players involved in asymmetric situations as gluttons that differ significantly in their attributes. This paper presents a successful attempt at using an autogenerator to create the framework of the game, including the predefined scenarios and corresponding payoffs. The autogenerator offers flexibility in specifying game parameters, which include the number of simultaneous players and their features, game objects and their attributes, and some general game characteristics. In the proposed approach the autogenerator model was upgraded so as to enable program specification updates. For the treatment of more complex strategic scenarios, we created the Rational Pigs Game Extended (RPGE), in which the introduction of a third glutton entails significant structural changes. In addition, due to the existence of particular attributes of the new player, "the tramp," one equilibrium point from the original game is destabilized, which has an influence on the decision-making of rational players. PMID:25254228

  16. Autogenerator-Based Modelling Framework for Development of Strategic Games Simulations: Rational Pigs Game Extended

    PubMed Central

    Magdalenić, Ivan

    2014-01-01

    When considering strategic games from a conceptual perspective that focuses on the questions of participants' decision-making rationality, the issues of modelling and simulation are rarely discussed. The well-known Rational Pigs matrix game has been analyzed relatively intensively in terms of reassessing the logic of two players involved in asymmetric situations as gluttons that differ significantly in their attributes. This paper presents a successful attempt at using an autogenerator to create the framework of the game, including the predefined scenarios and corresponding payoffs. The autogenerator offers flexibility in specifying game parameters, which include the number of simultaneous players and their features, game objects and their attributes, and some general game characteristics. In the proposed approach the autogenerator model was upgraded so as to enable program specification updates. For the treatment of more complex strategic scenarios, we created the Rational Pigs Game Extended (RPGE), in which the introduction of a third glutton entails significant structural changes. In addition, due to the existence of particular attributes of the new player, “the tramp,” one equilibrium point from the original game is destabilized, which has an influence on the decision-making of rational players. PMID:25254228

  17. Development of a Lattice Boltzmann Framework for Numerical Simulation of Thrombosis

    NASA Astrophysics Data System (ADS)

    Harrison, S. E.; Bernsdorf, J.; Hose, D. R.; Lawford, P. V.

    The interacting factors relating to thrombogenesis were defined by Virchow in 1856 to be abnormalities of blood chemistry, the vessel wall and haemodynamics. Together, these factors are known as Virchow's triad. Many attempts have been made to simulate numerically certain aspects of the complex phenomena of thrombosis, but a comprehensive model, which includes the biochemical and physical aspects of Virchow's triad, and is capable of predicting thrombus development within physiological geometries has not yet been developed. Such a model would consider the role of platelets and the coagulation cascade along with the properties of the flow in the chosen vessel. A lattice Boltzmann thrombosis framework has been developed, on top of an existing flow solver, to model the formation of thrombi resulting from platelet activation and initiation of the coagulation cascade by one or more of the strands of Virchow's triad. Both processes then act in parallel, to restore homeostasis as the deposited thrombus disturbs the flow. Results are presented in a model of deep vein thrombosis (DVT), resulting from hypoxia and associated endothelial damage.
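
    A minimal D2Q9 BGK lattice Boltzmann sketch of the kind of flow solver such a framework is built on, driving channel flow between two bounce-back walls with a small body force; the platelet transport, coagulation chemistry and deposition rules of the thrombosis model itself are not reproduced, and all lattice parameters are illustrative.

      import numpy as np

      # D2Q9 lattice: discrete velocities, weights, and opposite directions for bounce-back
      c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                    [1, 1], [-1, 1], [-1, -1], [1, -1]])
      w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
      opp = [0, 3, 4, 1, 2, 7, 8, 5, 6]

      def equilibrium(rho, ux, uy):
          cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
          usq = 1.5 * (ux**2 + uy**2)
          return w[:, None, None] * rho * (1.0 + cu + 0.5 * cu**2 - usq)

      def channel_flow(nx=100, ny=21, tau=0.8, g=1e-6, steps=8000):
          """BGK collision, streaming and bounce-back walls, driven by a small body force g."""
          rho = np.ones((ny, nx))
          ux = np.zeros((ny, nx)); uy = np.zeros((ny, nx))
          f = equilibrium(rho, ux, uy)
          solid = np.zeros((ny, nx), dtype=bool)
          solid[0, :] = solid[-1, :] = True                  # top and bottom walls
          for _ in range(steps):
              rho = f.sum(axis=0)
              ux = (f * c[:, 0, None, None]).sum(axis=0) / rho + tau * g   # simple velocity-shift forcing
              uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
              f_post = f + (equilibrium(rho, ux, uy) - f) / tau            # BGK collision
              f_post[:, solid] = f[opp][:, solid]            # full-way bounce-back at the walls
              for i in range(9):                             # streaming (periodic in x)
                  f[i] = np.roll(np.roll(f_post[i], c[i, 0], axis=1), c[i, 1], axis=0)
          rho = f.sum(axis=0)
          return (f * c[:, 0, None, None]).sum(axis=0)[:, nx // 2] / rho[:, nx // 2]

      profile = channel_flow()
      print("near-parabolic velocity profile across the channel:")
      print(profile.round(6))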

  18. A simulation framework for estimating wall stress distribution of abdominal aortic aneurysm.

    PubMed

    Qin, Jing; Zhang, Jing; Chui, Chee-Kong; Huang, Wei-Min; Yang, Tao; Pang, Wai-Man; Sudhakar, Venkatesh; Chang, Stephen

    2011-01-01

    Abdominal aortic aneurysm (AAA) rupture is believed to occur when the mechanical stress acting on the wall exceeds the strength of the wall tissue. In endovascular aneurysm repair, a stent-graft in a catheter is released at the aneurysm site to form a new blood vessel and protect the weakened AAA wall from the pulsatile pressure and, hence, possible rupture. In this paper, we propose a framework to estimate the wall stress distribution of non-stented and stented AAAs based on fluid-structure interaction (FSI), which is used in a surgical simulation system (IRAS). The 3D geometric model of the AAA is reconstructed from computed tomography angiographic (CTA) images. Based on our experiments, a combined logarithmic and polynomial strain energy equation is applied to model the elastic properties of the arterial wall. The blood flow is modeled as laminar, incompressible and non-Newtonian by applying the Navier-Stokes equations. The computed blood pressure is applied as a load on the AAA meshes with and without the stent-graft, and the wall stress distribution is calculated using the FSI solver in ANSYS. Experiments demonstrate that our analytical results are consistent with clinical observations. PMID:22254456

  19. Agent Based Modeling of Human Gut Microbiome Interactions and Perturbations

    PubMed Central

    Shashkova, Tatiana; Popenko, Anna; Tyakht, Alexander; Peskov, Kirill; Kosinsky, Yuri; Bogolubsky, Lev; Raigorodskii, Andrei; Ischenko, Dmitry; Alexeev, Dmitry; Govorun, Vadim

    2016-01-01

    Background Intestinal microbiota plays an important role in human health. It is involved in digestion and protects the host against external pathogens. Examination of intestinal microbiome interactions is required to understand the community's influence on host health. Studies of the microbiome can provide insight into methods of improving health, including clinical procedures that modify an individual's microbial community composition and correct the microbiota by colonization with new bacterial species or by dietary changes. Methodology/Principal Findings In this work we report an agent-based model of interactions between two bacterial species and between the species and the gut. The model is based on reactions describing bacterial fermentation of polysaccharides to acetate and propionate and fermentation of acetate to butyrate. Antibiotic treatment was chosen as the disturbance factor and used to investigate the stability of the system. System recovery after antibiotic treatment was analyzed as a function of the number of feedback interactions within the community, the therapy duration and the amount of antibiotics. Bacterial species are known to mutate and acquire resistance to antibiotics. The ability to mutate was treated as a stochastic process; under this assumption, the ratio of sensitive to resistant bacteria was calculated during antibiotic therapy and recovery. Conclusion/Significance The model supports the hypothesis that feedback mechanisms are necessary for the functionality and stability of the system after disturbance. A high fraction of the bacterial community was shown to mutate during antibiotic treatment, though sensitive strains could become dominant after recovery. The recovery of sensitive strains is explained by the fitness cost of resistance. The model demonstrates not only the quantitative dynamics of the bacterial species, but also makes it possible to observe the emergent spatial structure and its alteration, depending on various feedback mechanisms
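    The sensitive-versus-resistant dynamic described above can be illustrated with a deliberately minimal agent-based sketch: each bacterium may mutate to resistance with a small probability when it divides, the antibiotic kills sensitive cells preferentially, and resistance carries a fitness (growth) cost. All parameter values and the single-population simplification are illustrative assumptions, not the calibrated two-species model of the paper.

        import random

        random.seed(0)
        P_MUTATE, FITNESS_COST = 1e-3, 0.2       # mutation prob. per division, growth penalty of resistance
        KILL_SENSITIVE, GROWTH = 0.4, 0.3        # antibiotic kill prob., baseline division prob.
        CAPACITY = 5000                          # carrying capacity of the gut compartment

        population = ["S"] * 1000                # "S" = sensitive, "R" = resistant

        def step(pop, antibiotic_on):
            new_pop = []
            for cell in pop:
                # antibiotic removes sensitive cells with high probability
                if antibiotic_on and cell == "S" and random.random() < KILL_SENSITIVE:
                    continue
                new_pop.append(cell)
                # division, slowed by the fitness cost for resistant cells
                growth = GROWTH * (1 - FITNESS_COST if cell == "R" else 1.0)
                if len(new_pop) < CAPACITY and random.random() < growth:
                    child = cell
                    if child == "S" and random.random() < P_MUTATE:
                        child = "R"              # stochastic acquisition of resistance
                    new_pop.append(child)
            return new_pop

        for t in range(120):
            population = step(population, antibiotic_on=(20 <= t < 60))

        frac = population.count("R") / len(population) if population else float("nan")
        print("resistant fraction after recovery:", round(frac, 3))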

  20. A spatial web/agent-based model to support stakeholders' negotiation regarding land development.

    PubMed

    Pooyandeh, Majeed; Marceau, Danielle J

    2013-11-15

    Decision making in land management can be greatly enhanced if the perspectives of concerned stakeholders are taken into consideration. This often implies negotiation in order to reach an agreement based on the examination of multiple alternatives. This paper describes a spatial web/agent-based modeling system that was developed to support the negotiation process of stakeholders regarding land development in southern Alberta, Canada. This system integrates a fuzzy analytic hierarchy procedure within an agent-based model in an interactive visualization environment provided through a web interface to facilitate the learning and negotiation of the stakeholders. In the pre-negotiation phase, the stakeholders compare their evaluation criteria using linguistic expressions. Due to the uncertainty and fuzzy nature of such comparisons, a fuzzy Analytic Hierarchy Process is then used to prioritize the criteria. The negotiation starts when a development plan is submitted by a user (stakeholder) through the web interface. An agent called the proposer, which represents the proposer of the plan, receives this plan and starts negotiating with all other agents. The negotiation is conducted in a step-wise manner in which the agents change their attitudes by assigning a new set of weights to their criteria. If an agreement is not achieved, a new location for development is proposed by the proposer agent. This process is repeated until a location is found that satisfies all agents to a certain predefined degree. To evaluate the performance of the model, the negotiation was simulated with four agents, one of which was the proposer agent, using two hypothetical development plans. The first plan was selected randomly; the other was chosen in an area that is of high importance to one of the agents. While the agents managed to achieve an agreement about the location of the land development after three rounds of negotiation in the first scenario, seven rounds were required in the second
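    The criterion-prioritization step can be illustrated with a crisp (non-fuzzy) Analytic Hierarchy Process calculation: a pairwise comparison matrix on the Saaty scale is reduced to a priority vector with the geometric-mean method. The matrix values below are illustrative assumptions, and the paper's fuzzy AHP with linguistic comparisons adds a fuzzification/defuzzification layer that this sketch omits.

        import numpy as np

        # Illustrative pairwise comparisons for three criteria
        # (A[i, j] = how much more important criterion i is than criterion j).
        A = np.array([
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ])

        # Geometric-mean method: a common crisp approximation of the AHP priority vector.
        gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
        weights = gm / gm.sum()
        print("criterion weights:", np.round(weights, 3))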

  1. Generic Procedure for Coupling the PHREEQC Geochemical Modeling Framework with Flow and Solute Transport Simulators

    NASA Astrophysics Data System (ADS)

    Wissmeier, L. C.; Barry, D. A.

    2009-12-01

    Computer simulations of water availability and quality play an important role in state-of-the-art water resources management. However, many of the most widely used software programs focus either on physical flow and transport phenomena (e.g., MODFLOW, MT3DMS, FEFLOW, HYDRUS) or on geochemical reactions (e.g., MINTEQ, PHREEQC, CHESS, ORCHESTRA). In recent years, several couplings between the two families of programs have evolved in order to consider interactions between flow and biogeochemical reactivity (e.g., HP1, PHWAT). Software coupling procedures can be categorized as ‘close couplings’, where programs pass information via the memory stack at runtime, and ‘remote couplings’, where the information is exchanged at each time step via input/output files. The former generally involves modifying the software code and therefore requires expert programming skills. We present a generic recipe for remotely coupling the PHREEQC geochemical modeling framework and flow and solute transport (FST) simulators. The iterative scheme relies on operator splitting with continuous re-initialization of PHREEQC and the FST of choice at each time step. Since PHREEQC calculates the geochemistry of aqueous solutions in contact with soil minerals, the procedure is primarily designed for coupling to FSTs for liquid-phase flow in natural environments. It requires that initial conditions and numerical parameters, such as the time and space discretization, be accessible in the FST's input text file, and that the FST can be controlled via commands to the operating system (batch on Windows; bash/shell on Unix/Linux). The coupling procedure is based on PHREEQC's capability to save the state of a simulation, with all solid, liquid and gaseous species, as a PHREEQC input file by making use of the dump file option in the TRANSPORT keyword. The output from one reaction calculation step is therefore reused as input for the following reaction step where changes in element amounts due to advection
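    The control flow of such a remote coupling reduces to an operator-splitting loop that alternates external program calls and shuffles text files between them. The sketch below shows that loop in Python; the FST executable name, the input/output file names and the dry-run switch are placeholders, and in a real coupling the two bookkeeping comments would be replaced by code that rewrites the FST input from PHREEQC's dump and vice versa.

        import subprocess

        N_STEPS = 3
        DRY_RUN = True          # set to False once the FST and PHREEQC executables are available

        def run(cmd):
            print("would run:" if DRY_RUN else "running:", " ".join(cmd))
            if not DRY_RUN:
                subprocess.run(cmd, check=True)

        for step in range(N_STEPS):
            # 1) transport half-step: the FST starts from concentrations taken
            #    from the previous PHREEQC dump (file names are hypothetical)
            run(["fst_executable", f"fst_step_{step}.in"])
            # 2) reaction half-step: PHREEQC is re-initialized from the dumped state
            #    and writes a new dump that becomes the chemistry of step + 1
            run(["phreeqc", f"reaction_{step}.pqi", f"reaction_{step}.out"])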

  2. A systematic intercomparison of regional flood frequency analysis models in a simulation framework

    NASA Astrophysics Data System (ADS)

    Ganora, Daniele; Laio, Francesco; Claps, Pierluigi

    2015-04-01

    Regional frequency analysis (RFA) is a well-established methodology to provide an estimate of the flood frequency curve (or other discharge-related variables), based on the fundamental concept of substituting for temporal information at a site (no data or a short time series) by exploiting observations at other sites (spatial information). Different RFA paradigms exist, depending on the way the information is transferred to the site of interest. Despite the wide use of the methodology, a systematic comparison between these paradigms has not been performed. The aim of this study is to provide a framework in which to carry out the intercomparison: we synthetically generate data through Monte Carlo simulations for a number of (virtual) stations, following a GEV parent distribution; different scenarios can be created to represent different spatial heterogeneity patterns by manipulating the parameters of the parent distribution at each station (e.g. with a linear variation in space of the shape parameter of the GEV). A special case is the homogeneous scenario, where each station record is sampled from the same parent distribution. For each scenario and each simulation, different regional models are applied to evaluate the 200-year growth factor at each station. Results are then compared to the exact growth factor of each station, which is known in our virtual world. The regional approaches considered include: (i) a single growth curve for the whole region; (ii) a multiple-region model based on cluster analysis, which searches for an adequate number of homogeneous subregions; (iii) a Region-of-Influence model, which defines a homogeneous subregion for each site; (iv) a spatially-smooth estimation procedure based on linear regressions. A further benchmark model is the at-site estimate based on the analysis of the local record. A comprehensive analysis of the results of the simulations shows that, if the scenario is homogeneous (no spatial variability), all the regional approaches
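    The virtual-world setup can be sketched in a few lines: annual maxima are drawn from a site-specific GEV parent whose shape varies across the region, the true 200-year growth factor is computed from the parent, and an at-site fit provides the benchmark estimate. The sample sizes, parameter values and the use of the mean as index flood are illustrative assumptions rather than the study's actual configuration.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)
        n_sites, record_len, T = 20, 30, 200         # virtual stations, record length (years), return period

        # Heterogeneous scenario: the GEV shape varies linearly across the region
        # (scipy's shape parameter c equals minus the usual hydrological shape xi).
        shapes = np.linspace(-0.05, -0.25, n_sites)
        true_gf, at_site_gf = [], []

        for c in shapes:
            parent = genextreme(c, loc=100.0, scale=30.0)
            sample = parent.rvs(size=record_len, random_state=rng)
            # growth factor = T-year quantile divided by the index flood (here, the mean)
            true_gf.append(parent.ppf(1 - 1 / T) / parent.mean())
            fitted = genextreme(*genextreme.fit(sample))          # at-site benchmark estimate
            at_site_gf.append(fitted.ppf(1 - 1 / T) / fitted.mean())

        rel_err = np.abs(np.array(at_site_gf) - np.array(true_gf)) / np.array(true_gf)
        print("median relative error of the at-site estimate:", round(float(np.median(rel_err)), 3))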

  3. A modular framework for matter flux simulation at the catchment scale

    NASA Astrophysics Data System (ADS)

    Kraft, P.; Breuer, L.; Vaché, K. B.; Frede, H.-G.

    2009-04-01

    Modeling nutrient fluxes in a catchment is a complex and interdisciplinary task. Building and improving simulation tools for such complex systems is often constrained by the expertise of the scientists involved: since different fields of science are concerned, such as vadose zone and groundwater hydrology, plant growth, atmospheric exchange, soil chemistry, soil microbiology, stream physics and stream chemistry, a single working group cannot excel in all of them. As a result, either the parts of the system for which no involved scientist is an expert contain rough simplifications, or a "complete" group becomes too large to maintain the system over a longer period. Many approaches exist to create complex models that integrate processes for all subdomains, but tight integration bears the risk of freezing a specific state of the science into the complex system. A model infrastructure that takes the complex feedback loops across domain boundaries (e.g. soil moisture and plant growth) into consideration and is still flexible enough for adaptation to new findings in any of the scientific fields is therefore needed. This type of infrastructure can be obtained with a set of independent, but connectible, models. The new Catchment Model Framework (cmf), a module for subsurface water and solute transport, is an example of an independent yet open and easily extendible framework for the simulation of water and solute transport processes. Openness is gained by implementing the model as an extension to the Python programming language. cmf can easily be coupled with models of other system compartments, such as plant growth, biogeochemical or atmospheric dispersion models, provided they also expose an interface to the Python language. The models used in the coupling can be spatially explicit models, plot-scale models with one instance per mesh node of the landscape model, or pure reaction functions using the integration methods of cmf. The concept of extending an existing and
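    The coupling idea itself fits into a short Python time loop in which two otherwise independent models exchange state once per step, closing the soil-moisture/plant-growth feedback mentioned above. The two classes below are hypothetical stand-ins with made-up parameter values; they do not use the actual cmf API or any particular crop model.

        # Hypothetical stand-ins for a subsurface water model and a plant growth model.
        class SoilWaterModel:
            def __init__(self):
                self.moisture = 0.30                       # volumetric water content
            def step(self, uptake, rainfall):
                self.moisture += rainfall - uptake - 0.01 * self.moisture   # simple drainage term
                self.moisture = max(self.moisture, 0.0)

        class PlantGrowthModel:
            def __init__(self):
                self.biomass = 1.0
            def step(self, moisture):
                stress = min(moisture / 0.25, 1.0)         # water-limitation factor
                self.biomass *= 1.0 + 0.02 * stress
                return 0.002 * self.biomass * stress       # transpiration = root water uptake

        soil, plant = SoilWaterModel(), PlantGrowthModel()
        for day in range(90):
            uptake = plant.step(soil.moisture)             # feedback: soil moisture -> growth
            soil.step(uptake, rainfall=0.004)              # feedback: root uptake -> soil moisture
        print(round(soil.moisture, 3), round(plant.biomass, 2))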

  4. GAMOS: A framework to do GEANT4 simulations in different physics fields with an user-friendly interface

    NASA Astrophysics Data System (ADS)

    Arce, Pedro; Ignacio Lagares, Juan; Harkness, Laura; Pérez-Astudillo, Daniel; Cañadas, Mario; Rato, Pedro; de Prado, María; Abreu, Yamiel; de Lorenzo, Gianluca; Kolstein, Machiel; Díaz, Angelina

    2014-01-01

    GAMOS is a software system for GEANT4-based simulation. It comprises a framework, a set of components providing functionality to simulation applications on top of the GEANT4 toolkit, and a collection of ready-made applications. It allows users to perform GEANT4-based simulations using a scripting language, without having to write C++ code. Moreover, the GAMOS design allows the existing functionality to be extended through user-supplied C++ classes. The main characteristics of GAMOS and its embedded functionality are described.

  5. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach

    PubMed Central

    2016-01-01

    Background Computer networks tend to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices, ranging from mobile phones to household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling such problems. Purpose It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of agent-based modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. Method We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. Results The experiments demonstrated two important results: first, a CABC-based modeling approach such as agent-based modeling can be an effective approach to modeling complex problems in the IoT domain; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach. PMID:26812235

  6. A task-oriented modular and agent-based collaborative design mechanism for distributed product development

    NASA Astrophysics Data System (ADS)

    Liu, Jinfei; Chen, Ming; Wang, Lei; Wu, Qidi

    2014-05-01

    The rapid expansion of enterprises makes product collaborative design (PCD) a critical issue in distributed heterogeneous environments, but as collaborative tasks in large-scale networks become more complicated, neither a unified task decomposition and allocation methodology nor an agent-based network management platform can satisfy the increasing demands. In this paper, to meet the requirements of PCD for distributed product development, a collaborative design mechanism based on modularity and agent technology is presented. First, a top-down four-tier process model based on task-oriented modules and agents is constructed for PCD, after analyzing the mapping relationships between requirements and functions in the collaborative design. Second, on the basis of sub-task decomposition for PCD using a mixed method, a multi-objective optimization model of task-oriented modularization is established to maximize module cohesion and minimize module coupling, with module executability treated as a constraint. The model is optimized and simulated with a modified particle swarm optimization (PSO) algorithm, and the decomposed modules are obtained. Finally, an agent structure model for collaborative design is put forward, and the best-matching agents are selected with a similarity algorithm to implement the task modules through an integrated reasoning and decision-making mechanism together with the behavioral model of the collaborative design agents. Experimental studies of automobile collaborative design verify the feasibility and efficiency of this task-oriented, modular, agent-based collaborative design methodology in a distributed heterogeneous environment. On this basis, an integrative automobile collaborative R&D platform is developed. This research provides an effective platform for automobile manufacturing enterprises to achieve PCD, and helps to promote product numeralization collaborative R&D and
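    The cohesion/coupling objective that a PSO particle would be scored against can be written down compactly. In the sketch below, a candidate assignment of sub-tasks to modules is evaluated on an illustrative dependency matrix; the matrix values, the equal objective weights and the two hand-picked assignments are assumptions made for illustration only, and the PSO search loop itself is omitted.

        import numpy as np

        # Illustrative symmetric dependency strengths between six sub-tasks (0..1).
        D = np.array([
            [0.0, 0.9, 0.7, 0.1, 0.0, 0.0],
            [0.9, 0.0, 0.8, 0.0, 0.1, 0.0],
            [0.7, 0.8, 0.0, 0.2, 0.0, 0.1],
            [0.1, 0.0, 0.2, 0.0, 0.9, 0.6],
            [0.0, 0.1, 0.0, 0.9, 0.0, 0.8],
            [0.0, 0.0, 0.1, 0.6, 0.8, 0.0],
        ])

        def fitness(assignment, alpha=1.0, beta=1.0):
            """Scalarized objective: reward intra-module cohesion, penalize inter-module coupling."""
            same_module = np.equal.outer(assignment, assignment)
            cohesion = D[same_module].sum() / 2.0          # each pair counted once (D is symmetric)
            coupling = D[~same_module].sum() / 2.0
            return alpha * cohesion - beta * coupling

        good = np.array([0, 0, 0, 1, 1, 1])                # sub-tasks 1-3 and 4-6 grouped together
        bad = np.array([0, 1, 0, 1, 0, 1])
        print("good split:", fitness(good), " bad split:", fitness(bad))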

  7. Combination HIV prevention among MSM in South Africa: results from agent-based modeling.

    PubMed

    Brookmeyer, Ron; Boren, David; Baral, Stefan D; Bekker, Linda-Gail; Phaswana-Mafuya, Nancy; Beyrer, Chris; Sullivan, Patrick S

    2014-01-01

    HIV prevention trials have demonstrated the effectiveness of a number of behavioral and biomedical interventions. HIV prevention packages are combinations of interventions and offer potential to significantly increase the effectiveness of any single intervention. Estimates of the effectiveness of prevention packages are important for guiding the development of prevention strategies and for characterizing effect sizes before embarking on large scale trials. Unfortunately, most research to date has focused on testing single interventions rather than HIV prevention packages. Here we report the results from agent-based modeling of the effectiveness of HIV prevention packages for men who have sex with men (MSM) in South Africa. We consider packages consisting of four components: antiretroviral therapy for HIV infected persons with CD4 count <350; PrEP for high risk uninfected persons; behavioral interventions to reduce rates of unprotected anal intercourse (UAI); and campaigns to increase HIV testing. We considered 163 HIV prevention packages corresponding to different intensity levels of the four components. We performed 2252 simulation runs of our agent-based model to evaluate those packages. We found that a four component package consisting of a 15% reduction in the rate of UAI, 50% PrEP coverage of high risk uninfected persons, 50% reduction in persons who never test for HIV, and 50% ART coverage over and above persons already receiving ART at baseline, could prevent 33.9% of infections over 5 years (95% confidence interval, 31.5, 36.3). The package components with the largest incremental prevention effects were UAI reduction and PrEP coverage. The impact of increased HIV testing was magnified in the presence of PrEP. We find that HIV prevention packages that include both behavioral and biomedical components can in combination prevent significant numbers of infections with levels of coverage, acceptance and adherence that are potentially achievable among MSM in

  8. Combination HIV Prevention among MSM in South Africa: Results from Agent-based Modeling

    PubMed Central

    Brookmeyer, Ron; Boren, David; Baral, Stefan D.; Bekker, Linda-Gail; Phaswana-Mafuya, Nancy; Beyrer, Chris; Sullivan, Patrick S.

    2014-01-01

    HIV prevention trials have demonstrated the effectiveness of a number of behavioral and biomedical interventions. HIV prevention packages are combinations of interventions and offer potential to significantly increase the effectiveness of any single intervention. Estimates of the effectiveness of prevention packages are important for guiding the development of prevention strategies and for characterizing effect sizes before embarking on large scale trials. Unfortunately, most research to date has focused on testing single interventions rather than HIV prevention packages. Here we report the results from agent-based modeling of the effectiveness of HIV prevention packages for men who have sex with men (MSM) in South Africa. We consider packages consisting of four components: antiretroviral therapy for HIV infected persons with CD4 count <350; PrEP for high risk uninfected persons; behavioral interventions to reduce rates of unprotected anal intercourse (UAI); and campaigns to increase HIV testing. We considered 163 HIV prevention packages corresponding to different intensity levels of the four components. We performed 2252 simulation runs of our agent-based model to evaluate those packages. We found that a four component package consisting of a 15% reduction in the rate of UAI, 50% PrEP coverage of high risk uninfected persons, 50% reduction in persons who never test for HIV, and 50% ART coverage over and above persons already receiving ART at baseline, could prevent 33.9% of infections over 5 years (95% confidence interval, 31.5, 36.3). The package components with the largest incremental prevention effects were UAI reduction and PrEP coverage. The impact of increased HIV testing was magnified in the presence of PrEP. We find that HIV prevention packages that include both behavioral and biomedical components can in combination prevent significant numbers of infections with levels of coverage, acceptance and adherence that are potentially achievable among MSM in

  9. Holistic flood risk assessment using agent-based modelling: the case of Sint Maarten Island

    NASA Astrophysics Data System (ADS)

    Abayneh Abebe, Yared; Vojinovic, Zoran; Nikolic, Igor; Hammond, Michael; Sanchez, Arlex; Pelling, Mark

    2015-04-01

    Floods in coastal regions are regarded as one of the most dangerous and harmful disasters. Though commonly referred to as natural disasters, coastal floods are also attributable to various social, economic, historical and political issues. Rapid urbanisation in coastal areas, combined with climate change and poor governance, can lead to a significant increase in the risk of pluvial flooding coinciding with fluvial and coastal flooding, posing a greater risk of devastation in coastal communities. Disasters that can be triggered by hydro-meteorological events are interconnected and interrelated with both human activities and natural processes. They therefore require holistic approaches to help understand their complexity in order to design and develop adaptive risk management approaches that minimise social and economic losses and environmental impacts, and increase resilience to such events. Located in the North Atlantic Ocean, Sint Maarten is frequently subjected to hurricanes. In addition, the stormwater catchments and streams on Sint Maarten have several unique characteristics that contribute to the severity of flood-related impacts. Urban environments are usually situated in low-lying areas, with little consideration for stormwater drainage, and as such are subject to flash flooding. Hence, the Sint Maarten authorities drafted policies to minimise the risk of flood-related disasters on the island. In this study, an agent-based model is designed and applied to understand the implications of the introduced policies and regulations, and to understand how different actors' behaviours influence the formation, propagation and accumulation of flood risk. The agent-based model built for this study is based on the MAIA meta-model, which helps to decompose, structure and conceptualize socio-technical systems from an agent-oriented perspective, and is developed using the NetLogo simulation environment. The agents described in this model are households and businesses, and

  10. Agent-Based Model with Asymmetric Trading and Herding for Complex Financial Systems

    PubMed Central

    Chen, Jun-Jie; Zheng, Bo; Tan, Lei

    2013-01-01

    Background For complex financial systems, the negative and positive return-volatility correlations, i.e., the so-called leverage and anti-leverage effects, are particularly important for understanding the price dynamics. However, the microscopic origin of the leverage and anti-leverage effects is still not understood, and how to produce these effects in agent-based modeling remains open. On the other hand, in constructing microscopic models, it is a promising approach to determine model parameters from empirical data rather than from statistical fitting of the results. Methods To study the microscopic origin of the return-volatility correlation in financial systems, we take into account the individual and collective behaviors of investors in real markets and construct an agent-based model. The agents are linked with each other and trade in groups, and, in particular, two novel microscopic mechanisms, i.e., investors’ asymmetric trading and herding in bull and bear markets, are introduced. Further, we propose effective methods to determine the key parameters in our model from historical market data. Results With the model parameters determined for six representative stock-market indices in the world, we obtain the corresponding leverage or anti-leverage effect from the simulation, and the effect is in agreement with the empirical one in amplitude and duration. At the same time, our model reproduces other features of the real markets, such as the fat-tail distribution of returns and the long-term correlation of volatilities. Conclusions We reveal that both the investors’ asymmetric trading and their herding are essential generation mechanisms for the leverage and anti-leverage effects. Among the six markets, however, the investors’ trading is approximately symmetric for the five markets which exhibit the leverage effect, thus contributing very little. These two microscopic mechanisms and the methods for the determination of the key
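    A deliberately crude caricature of the two mechanisms shows how they can generate a negative return-volatility correlation: agents trade in herding groups that buy or sell as blocks, and they trade larger volumes after a negative return (asymmetric trading). The group size, volume boost and all other parameters below are illustrative assumptions and bear no relation to the empirically calibrated values of the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        n_agents, group_size, n_steps = 500, 10, 2000
        asym_boost = 0.5                    # extra trading volume after a negative return

        returns = np.zeros(n_steps)
        for t in range(1, n_steps):
            # asymmetric trading: agents are more active after losses
            volume = 1.0 + (asym_boost if returns[t - 1] < 0 else 0.0)
            # herding: agents trade in groups, and each group buys or sells as one block
            group_dirs = rng.choice([-1.0, 1.0], size=n_agents // group_size)
            returns[t] = volume * group_size * group_dirs.sum() / n_agents

        # leverage effect: correlation between today's return and tomorrow's volatility
        lev = np.corrcoef(returns[1:-1], np.abs(returns[2:]))[0, 1]
        print("return-volatility correlation:", round(float(lev), 3))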

  11. Learning Natural Selection in 4th Grade with Multi-Agent-Based Computational Models

    NASA Astrophysics Data System (ADS)

    Dickes, Amanda Catherine; Sengupta, Pratim

    2013-06-01

    In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these agents obey simple rules assigned or manipulated by the user (e.g., speeding up, slowing down, etc.). It is the interactions between these agents, based on the rules assigned by the user, that give rise to emergent, aggregate-level behavior (e.g., the formation and movement of a traffic jam). Natural selection is such an emergent phenomenon, and it has been shown to be challenging for novices (K-16 students) to understand. Whereas prior research on learning evolutionary phenomena with MABMs has typically focused on high school students and beyond, we investigate how elementary students (4th graders) develop multi-level explanations of some introductory aspects of natural selection—species differentiation and population change—through scaffolded interactions with an MABM that simulates predator-prey dynamics in a simple birds-butterflies ecosystem. We conducted a semi-clinical interview-based study with ten participants, in which we focused on the following: a) identifying the nature of learners' initial interpretations of salient events or elements of the represented phenomena, b) identifying the roles these interpretations play in the development of their multi-level explanations, and c) examining how attending to different levels of the relevant phenomena can make different mechanisms explicit to the learners. In addition, our analysis shows that although there were differences between high- and low-performing students (in terms of being able to explain population-level behaviors) in the pre-test, these differences disappeared in the post-test.

  12. Towards a Hybrid Agent-based Model for Mosquito Borne Disease

    PubMed Central

    Mniszewski, S. M.; Manore, C. A.; Bryan, C.; Del Valle, S. Y.; Roberts, D.

    2015-01-01

    Agent-based models (ABMs) are used to simulate the spread of infectious disease through a population. Detailed human movement, demography, realistic business location networks, and in-host disease progression are available in existing ABMs, such as the Epidemic Simulation System (EpiSimS). These capabilities make possible the exploration of pharmaceutical and non-pharmaceutical mitigation strategies used to inform the public health community. There is a similar need for models of the spread of mosquito-borne pathogens, due to the re-emergence of diseases such as chikungunya and dengue fever. A network-patch model for mosquito dynamics has been coupled with EpiSimS. Mosquitoes are represented as a “patch” or “cloud” associated with a location. Each patch has an ordinary differential equation (ODE) mosquito dynamics model and mosquito-related parameters relevant to the location's characteristics. Activities at each location can have different levels of potential exposure to mosquitoes based on whether they are inside, outside, or somewhere in between. As a proof of concept, the hybrid network-patch model is used to simulate the spread of chikungunya through Washington, DC. Results are shown for a base case, followed by variations of the probability of transmission, the mosquito count, and the activity exposure. We use visualization to understand the pattern of disease spread. PMID:26618203
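    The patch side of such a hybrid can be illustrated with a single-location mosquito SI model whose force of infection is driven by the infectious human fraction that a host ABM would supply each day. The parameter values and the square-wave human prevalence below are illustrative assumptions, not values from the coupled EpiSimS study.

        from scipy.integrate import solve_ivp

        birth, death = 0.3, 0.1          # per-day mosquito emergence and mortality rates
        bite, p_mh = 0.5, 0.3            # bites per mosquito per day, infection probability per bite
        carrying_cap = 2000.0

        def mosquito_rhs(t, y, infectious_human_frac):
            """y = [susceptible, infected] mosquitoes in one patch; the infectious
            human fraction at this location would come from the host agent-based model."""
            S, I = y
            N = S + I
            emergence = birth * N * (1.0 - N / carrying_cap)
            infection = bite * p_mh * infectious_human_frac * S
            return [emergence - infection - death * S, infection - death * I]

        # one patch advanced over 1-day ABM time steps (hybrid loop, single location)
        state = [1500.0, 0.0]
        for day in range(30):
            human_prevalence = 0.02 if 5 <= day < 15 else 0.0     # stand-in for ABM output
            sol = solve_ivp(mosquito_rhs, (0.0, 1.0), state, args=(human_prevalence,))
            state = sol.y[:, -1]
        print("infected mosquitoes after 30 days:", round(float(state[1]), 1))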

  13. Agent Based Modeling of Atherosclerosis: A Concrete Help in Personalized Treatments

    NASA Astrophysics Data System (ADS)

    Pappalardo, Francesco; Cincotti, Alessandro; Motta, Alfredo; Pennisi, Marzio

    Atherosclerosis, a pathology affecting arterial blood vessels, is one of the most common diseases in developed countries. We present studies of increased atherosclerosis risk using an agent-based model of atherogenesis that has previously been validated with clinical data. It is well known that the major risk factor in atherosclerosis is a persistently high level of low-density lipoprotein (LDL). However, it is not known whether a short period of high LDL concentration can cause irreversible damage, or whether reduction of the LDL concentration (either by lifestyle or drugs) can drastically or partially reduce the risk already acquired. We simulated four different clinical situations in a large set of virtual patients (200 per clinical scenario). In the first, the patients' lifestyle maintains the LDL concentration in a no-risk range. This is the control simulation. The second case represents patients with a high LDL level and a delay in applying appropriate treatment. The third scenario is characterized by patients with high LDL levels treated with specific drugs such as statins. Finally, we simulated patients characterized by several oxidative events (smoking, a sedentary lifestyle, alcohol consumption, and so forth) that effectively increase the risk of LDL oxidation. These preliminary results obviously need to be investigated clinically. It is clear, however, that SimAthero has the power to concretely help medical doctors and clinicians in choosing personalized treatments for the prevention of atherosclerosis damage.

  14. Ising-like agent-based technology diffusion model: Adoption patterns vs. seeding strategies

    NASA Astrophysics Data System (ADS)

    Laciana, Carlos E.; Rovere, Santiago L.

    2011-03-01

    The well-known Ising model used in statistical physics was adapted to a social-dynamics context to simulate the adoption of a technological innovation. The model explicitly combines (a) an individual's perception of the advantages of an innovation and (b) social influence from members of the decision-maker's social network. The micro-level adoption dynamics are embedded in an agent-based model that allows exploration of macro-level patterns of technology diffusion throughout systems with different configurations (number and distribution of early adopters, social network topologies). In the present work we carry out many numerical simulations. We find that when the gap between the individual's perceptions of the options is large, the adoption speed increases if the dispersion of early adopters grows. Another test was based on changing the network topology by means of stochastic connections to a common opinion reference (hub), which resulted in an increase in adoption speed. Finally, we performed a simulation of competition between options for both regular and small-world networks.
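    An Ising-like adoption rule of this kind can be written as a heat-bath update in which the local field combines the individual's perceived advantage with the average opinion of the neighbours. The lattice topology, parameter values and number of early adopters below are illustrative assumptions; the published model uses its own utilities and the network topologies described above.

        import numpy as np

        rng = np.random.default_rng(3)
        n, sweeps = 50, 40            # lattice side and number of full update sweeps
        beta = 1.0                    # decision determinism ("inverse temperature")
        h = 0.5                       # individual's perceived advantage of the innovation
        J = 1.0                       # strength of social influence

        state = -np.ones((n, n))      # -1 = old technology, +1 = adopted
        seeds = rng.choice(n * n, size=50, replace=False)
        state.flat[seeds] = 1.0       # early adopters scattered over the lattice

        for _ in range(sweeps):
            for _ in range(n * n):
                i, j = rng.integers(n), rng.integers(n)
                neighbours = (state[(i + 1) % n, j] + state[i - 1, j]
                              + state[i, (j + 1) % n] + state[i, j - 1])
                local_field = h + J * neighbours / 4.0          # utility + social influence
                p_adopt = 1.0 / (1.0 + np.exp(-2.0 * beta * local_field))
                state[i, j] = 1.0 if rng.random() < p_adopt else -1.0

        print("adoption fraction:", float((state > 0).mean()))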

  15. On Complexities of Impact Simulation of Fiber Reinforced Polymer Composites: A Simplified Modeling Framework

    PubMed Central

    Alemi-Ardakani, M.; Milani, A. S.; Yannacopoulos, S.

    2014-01-01

    Impact modeling of fiber reinforced polymer composites is a complex and challenging task, in particular for practitioners with less experience in advanced coding and user-defined subroutines. Different numerical algorithms have been developed over the past decades for impact modeling of composites, yet a considerable gap often exists between predicted and experimental observations. In this paper, after a review of reported sources of complexities in impact modeling of fiber reinforced polymer composites, two simplified approaches are presented for fast simulation of out-of-plane impact response of these materials considering four main effects: (a) strain rate dependency of the mechanical properties, (b) difference between tensile and flexural bending responses, (c) delamination, and (d) the geometry of fixture (clamping conditions). In the first approach, it is shown that by applying correction factors to the quasistatic material properties, which are often readily available from material datasheets, the role of these four sources in modeling impact response of a given composite may be accounted for. As a result a rough estimation of the dynamic force response of the composite can be attained. To show the application of the approach, a twill woven polypropylene/glass reinforced thermoplastic composite laminate has been tested under 200 J impact energy and was modeled in Abaqus/Explicit via the built-in Hashin damage criteria. X-ray microtomography was used to investigate the presence of delamination inside the impacted sample. Finally, as a second and much simpler modeling approach it is shown that applying only a single correction factor over all material properties at once can still yield a reasonable prediction. Both advantages and limitations of the simplified modeling framework are addressed in the performed case study. PMID:25431787
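    The correction-factor idea amounts to scaling readily available quasi-static datasheet properties before they enter the explicit impact model, either factor by factor or, as in the second approach, through a single combined factor. The property set and the factor values in the sketch below are invented for illustration and are not the calibrated factors reported in the paper.

        # Quasi-static datasheet properties (illustrative values, in Pa).
        quasi_static = {"E1": 18.0e9, "E2": 17.5e9, "Xt": 380.0e6, "Xc": 300.0e6}

        # Assumed correction factors for the four effects discussed above.
        corrections = {
            "strain_rate": 1.25,      # rate stiffening/strengthening at impact strain rates
            "flexural": 0.85,         # tensile vs. flexural response difference
            "delamination": 0.90,     # softening due to interlaminar damage
            "fixture": 1.05,          # clamping-condition effect
        }

        combined = 1.0
        for factor in corrections.values():
            combined *= factor        # single overall factor applied to all properties at once

        dynamic = {name: value * combined for name, value in quasi_static.items()}
        print(f"combined correction factor: {combined:.3f}")
        for name, value in dynamic.items():
            print(f"{name}: {value:.3e}")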

  16. On complexities of impact simulation of fiber reinforced polymer composites: a simplified modeling framework.

    PubMed

    Alemi-Ardakani, M; Milani, A S; Yannacopoulos, S

    2014-01-01

    Impact modeling of fiber reinforced polymer composites is a complex and challenging task, in particular for practitioners with less experience in advanced coding and user-defined subroutines. Different numerical algorithms have been developed over the past decades for impact modeling of composites, yet a considerable gap often exists between predicted and experimental observations. In this paper, after a review of reported sources of complexities in impact modeling of fiber reinforced polymer composites, two simplified approaches are presented for fast simulation of out-of-plane impact response of these materials considering four main effects: (a) strain rate dependency of the mechanical properties, (b) difference between tensile and flexural bending responses, (c) delamination, and (d) the geometry of fixture (clamping conditions). In the first approach, it is shown that by applying correction factors to the quasistatic material properties, which are often readily available from material datasheets, the role of these four sources in modeling impact response of a given composite may be accounted for. As a result a rough estimation of the dynamic force response of the composite can be attained. To show the application of the approach, a twill woven polypropylene/glass reinforced thermoplastic composite laminate has been tested under 200 J impact energy and was modeled in Abaqus/Explicit via the built-in Hashin damage criteria. X-ray microtomography was used to investigate the presence of delamination inside the impacted sample. Finally, as a second and much simpler modeling approach it is shown that applying only a single correction factor over all material properties at once can still yield a reasonable prediction. Both advantages and limitations of the simplified modeling framework are addressed in the performed case study. PMID:25431787

  17. Infectio: a Generic Framework for Computational Simulation of Virus Transmission between Cells

    PubMed Central

    Yakimovich, Artur; Yakimovich, Yauhen; Schmid, Michael; Mercer, Jason; Sbalzarini, Ivo F.

    2016-01-01

    ABSTRACT Viruses spread between cells, tissues, and organisms by cell-free and cell-cell mechanisms, depending on the cell type, the nature of the virus, or the phase of the infection cycle. The mode of viral transmission has a large impact on disease development, the outcome of antiviral therapies or the efficacy of gene therapy protocols. The transmission mode of viruses can be addressed in tissue culture systems using live-cell imaging. Yet even in relatively simple cell cultures, the mechanisms of viral transmission are difficult to distinguish. Here we present a cross-platform software framework called “Infectio,” which is capable of simulating transmission phenotypes in tissue culture of virtually any virus. Infectio can estimate interdependent biological parameters, for example for vaccinia virus infection, and differentiate between cell-cell and cell-free virus spreading. Infectio assists in elucidating virus transmission mechanisms, a feature useful for designing strategies of perturbing or enhancing viral transmission. The complexity of the Infectio software is low compared to that of other software commonly used to quantitate features of cell biological images, which yields stable and relatively error-free output from Infectio. The software is open source (GPLv3 license), and operates on the major platforms (Windows, Mac, and Linux). The complete source code can be downloaded from http://infectio.github.io/index.html. IMPORTANCE Infectio presents a generalized platform to analyze virus infection spread between cells. It allows the simulation of plaque phenotypes from image-based assays. Viral plaques are the result of virus spreading from primary infected cells to neighboring cells. This is a complex process and involves neighborhood effects at cell-cell contact sites or fluid dynamics in the extracellular medium. Infectio differentiates between two major modes of virus transmission between cells, allowing in silico testing of hypotheses about
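    The distinction between the two transmission modes can be illustrated with a toy plaque-growth simulation on a grid of cells: in cell-to-cell mode an infected cell can only infect an adjacent neighbour, whereas in cell-free mode released virus can reach any cell on the grid. The grid size, spread probability and update rule are arbitrary illustrative choices and are unrelated to Infectio's actual implementation.

        import random

        random.seed(11)
        N, rounds, p_spread = 30, 12, 0.6      # monolayer side, infection rounds, transmission prob.

        def simulate(mode):
            """Toy plaque growth starting from a single primary infected cell."""
            infected = {(N // 2, N // 2)}
            for _ in range(rounds):
                newly_infected = set()
                for (i, j) in infected:
                    if random.random() < p_spread:
                        if mode == "cell_cell":
                            di, dj = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
                            target = ((i + di) % N, (j + dj) % N)                    # only adjacent cells
                        else:
                            target = (random.randrange(N), random.randrange(N))     # any cell on the grid
                        newly_infected.add(target)
                infected |= newly_infected
            return len(infected)

        print("cell-to-cell plaque size:", simulate("cell_cell"))
        print("cell-free spread size:  ", simulate("cell_free"))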

  18. Infectio: a Generic Framework for Computational Simulation of Virus Transmission between Cells.

    PubMed

    Yakimovich, Artur; Yakimovich, Yauhen; Schmid, Michael; Mercer, Jason; Sbalzarini, Ivo F; Greber, Urs F

    2016-01-01

    Viruses spread between cells, tissues, and organisms by cell-free and cell-cell mechanisms, depending on the cell type, the nature of the virus, or the phase of the infection cycle. The mode of viral transmission has a large impact on disease development, the outcome of antiviral therapies or the efficacy of gene therapy protocols. The transmission mode of viruses can be addressed in tissue culture systems using live-cell imaging. Yet even in relatively simple cell cultures, the mechanisms of viral transmission are difficult to distinguish. Here we present a cross-platform software framework called "Infectio," which is capable of simulating transmission phenotypes in tissue culture of virtually any virus. Infectio can estimate interdependent biological parameters, for example for vaccinia virus infection, and differentiate between cell-cell and cell-free virus spreading. Infectio assists in elucidating virus transmission mechanisms, a feature useful for designing strategies of perturbing or enhancing viral transmission. The complexity of the Infectio software is low compared to that of other software commonly used to quantitate features of cell biological images, which yields stable and relatively error-free output from Infectio. The software is open source (GPLv3 license), and operates on the major platforms (Windows, Mac, and Linux). The complete source code can be downloaded from http://infectio.github.io/index.html. IMPORTANCE Infectio presents a generalized platform to analyze virus infection spread between cells. It allows the simulation of plaque phenotypes from image-based assays. Viral plaques are the result of virus spreading from primary infected cells to neighboring cells. This is a complex process and involves neighborhood effects at cell-cell contact sites or fluid dynamics in the extracellular medium. Infectio differentiates between two major modes of virus transmission between cells, allowing in silico testing of hypotheses about spreading

  19. KMCLib: A general framework for lattice kinetic Monte Carlo (KMC) simulations

    NASA Astrophysics Data System (ADS)

    Leetmaa, Mikael; Skorodumova, Natalia V.

    2014-09-01

    KMCLib is a general framework for lattice kinetic Monte Carlo (KMC) simulations. The program can handle simulations of the diffusion and reaction of millions of particles in one, two, or three dimensions, and is designed to be easily extended and customized by the user to allow for the development of complex custom KMC models for specific systems without having to modify the core functionality of the program. Analysis modules and on-the-fly elementary step diffusion rate calculations can be implemented as plugins following a well-defined API. The plugin modules are loosely coupled to the core KMCLib program via the Python scripting language. KMCLib is written as a Python module with a backend C++ library. After initial compilation of the backend library KMCLib is used as a Python module; input to the program is given as a Python script executed using a standard Python interpreter. We give a detailed description of the features and implementation of the code and demonstrate its scaling behavior and parallel performance with a simple one-dimensional A-B-C lattice KMC model and a more complex three-dimensional lattice KMC model of oxygen-vacancy diffusion in a fluorite structured metal oxide. KMCLib can keep track of individual particle movements and includes tools for mean square displacement analysis, and is therefore particularly well suited for studying diffusion processes at surfaces and in solids. Catalogue identifier: AESZ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AESZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 49 064 No. of bytes in distributed program, including test data, etc.: 1 575 172 Distribution format: tar.gz Programming language: Python and C++. Computer: Any computer that can run a C++ compiler and a Python interpreter. Operating system: Tested on Ubuntu 12
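    The core of any lattice KMC code of this kind is a rejection-free loop: enumerate the currently enabled elementary processes, pick one with probability proportional to its rate, execute it, and advance time by an exponentially distributed increment. The sketch below implements that loop for a one-dimensional hopping model in plain Python; it deliberately does not use the KMCLib API, and the lattice size, particle number and rate are arbitrary illustrative values.

        import math
        import random

        random.seed(7)
        L, n_particles, hop_rate, n_events = 100, 20, 1.0, 5000

        lattice = [0] * L
        for site in random.sample(range(L), n_particles):
            lattice[site] = 1                              # 1 = occupied, 0 = vacant

        t = 0.0
        for _ in range(n_events):
            # enumerate enabled processes: hops of a particle to a vacant neighbour site
            processes = [(i, j) for i in range(L) if lattice[i] == 1
                         for j in ((i + 1) % L, (i - 1) % L) if lattice[j] == 0]
            total_rate = hop_rate * len(processes)
            if total_rate == 0.0:
                break
            # rejection-free step: all processes share the same rate here, so pick uniformly,
            # then draw the waiting time from an exponential distribution
            i, j = random.choice(processes)
            lattice[i], lattice[j] = 0, 1
            t += -math.log(1.0 - random.random()) / total_rate

        print(f"simulated time after {n_events} hops: {t:.2f}")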

  20. From Agents to Continuous Change via Aesthetics: Learning Mechanics with Visual Agent-Based Computational Modeling

    ERIC Educational Resources Information Center

    Sengupta, Pratim; Farris, Amy Voss; Wright, Mason

    2012-01-01

    Novice learners find it challenging to understand motion as a continuous process of change. In this paper, we present a pedagogical approach based on agent-based, visual programming to address this issue. Integrating agent-based programming, in particular Logo programming, with curricular science has been shown to be challenging in previous research…