Science.gov

Sample records for agent-based computer simulation

  1. Agent Based Computing Machine

    DTIC Science & Technology

    2005-12-09

    coordinates as in cellular automata systems. But using biology as a model suggests that the most general systems must provide for partial, but constrained...system called an "agent based computing" machine (ABC Machine). The ABC Machine is motivated by cellular biochemistry and it is based upon a concept

  2. Agent-based computer simulation and sirs: building a bridge between basic science and clinical trials.

    PubMed

    An, G

    2001-10-01

    The management of Systemic Inflammatory Response Syndrome (SIRS)/Multiple Organ Failure (MOF) remains the greatest challenge in the field of critical care. There has been uniform difficulty in translating the results of basic science research into effective therapeutic regimes. We propose that this is due in part to a failure to account for the complex, nonlinear nature of the inflammatory process of which SIRS/MOF represents a disordered state. Attempts to manipulate this process without an understanding of the dynamics of the system may potentially produce unintended consequences. Agent-Based Computer Simulation (ABCS) provides a means to synthesize the information acquired from the linear analysis of basic science into a model that preserves the complexity of the inflammatory system. We have constructed an abstracted version of the inflammatory process using an ABCS that is based at the cellular level. Despite its abstraction, the simulation produces non-linear behavior and reproduces the dynamic structure of the inflammatory response. Furthermore, adjustment of the simulation to model one of the unsuccessful initial anti-inflammatory trials of the 1990's demonstrates the adverse outcome that was observed in those clinical trials. It must be emphasized that the current model is extremely abstract and simplified. However, it is hoped that future ABCSs of sufficient sophistication eventually may provide an important bridging tool to translate basic science discoveries into clinical applications. Creating these simulations will require a large collaborative effort, and it is hoped that this paper will stimulate interest in this form of analysis.

  3. Agent Based Simulation Output Analysis

    DTIC Science & Technology

    2011-12-01

    over long periods of time) not to have a steady state, but apparently does. These simulation models are available free from sigmawiki.com...are used in computer animations and movies (for example, in the movie Jurassic Park) as well as to look for emergent social behavior in groups

  4. Thread Group Multithreading: Accelerating the Computation of an Agent-Based Power System Modeling and Simulation Tool -- GridLAB-D

    SciTech Connect

    Jin, Shuangshuang; Chassin, David P.

    2014-01-06

    GridLAB-D™ is an open source next generation agent-based smart-grid simulator that provides unprecedented capability to model the performance of smart grid technologies. Over the past few years, GridLAB-D has been used to conduct important analyses of smart grid concepts, but it is still quite limited by its computational performance. In order to break through the performance bottleneck and meet the need for large-scale power grid simulations, we develop a thread group mechanism to implement highly granular multithreaded computation in GridLAB-D. We achieve close to linear speedups with the multithreaded version compared against the single-threaded version of the same code running on general-purpose multi-core commodity hardware for a benchmark simple house model. The performance of the multithreaded code shows favorable scalability properties and resource utilization, and much shorter execution times for large-scale power grid simulations.
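
    As an illustration of the thread-group idea, the sketch below splits an agent population into a fixed number of groups and advances each group on its own worker thread within a timestep. This is a hedged, minimal sketch in Python rather than GridLAB-D code (GridLAB-D itself is C/C++); the House class, its update rule, and all parameter values are hypothetical, and in CPython the global interpreter lock would limit the speedup a pure-Python version like this can actually achieve.

```python
# Minimal sketch of thread-group parallelism over an agent population.
# Hypothetical House agent and update rule; not GridLAB-D's implementation.
from concurrent.futures import ThreadPoolExecutor
import random

class House:
    """Toy agent: one simulated house with a trivial thermal state."""
    def __init__(self, seed):
        self.temp = 20.0 + random.Random(seed).random()

    def update(self, dt):
        # Placeholder physics: relax toward a 20 C set point.
        self.temp += (20.0 - self.temp) * 0.1 * dt

def update_group(group, dt):
    # Each thread group advances its own slice of the agent population.
    for house in group:
        house.update(dt)

def run_timestep(houses, dt, n_groups=4):
    # Split the agents into n_groups contiguous chunks ("thread groups");
    # groups are independent within a timestep, so they can run in parallel,
    # and the executor joins them before the next timestep begins.
    chunk = (len(houses) + n_groups - 1) // n_groups
    groups = [houses[i:i + chunk] for i in range(0, len(houses), chunk)]
    with ThreadPoolExecutor(max_workers=n_groups) as pool:
        list(pool.map(lambda g: update_group(g, dt), groups))

if __name__ == "__main__":
    houses = [House(i) for i in range(10_000)]
    for _ in range(10):
        run_timestep(houses, dt=1.0)
    print(f"mean temperature: {sum(h.temp for h in houses) / len(houses):.2f}")
```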

  5. Incorporating fault tolerance in distributed agent based systems by simulating bio-computing model of stress pathways

    NASA Astrophysics Data System (ADS)

    Bansal, Arvind K.

    2006-05-01

    The bio-computing model of 'Distributed Multiple Intelligent Agents Systems' (BDMIAS) models agents as genes, a cooperating group of agents as operons (commonly regulated groups of genes), and a complex task as a set of interacting pathways, such that the pathways involve multiple cooperating operons. The agents (or groups of agents) interact with each other using message passing and pattern-based bindings that may temporarily reconfigure an agent's function. In this paper, a technique is described for incorporating fault tolerance in BDMIAS. The scheme is based upon simulating BDMIAS, exploiting the modeling of biological stress pathways, the integration of fault avoidance, and distributed fault recovery of crashed agents. Stress pathways are latent pathways in biological systems that are triggered very quickly, regulate the complex biological system by temporarily regulating or inactivating undesirable pathways, and are essential to avoid catastrophic failures. Pattern-based interaction between messages and agents allows multiple agents to react concurrently in response to a single condition change represented by a message broadcast. Fault avoidance exploits the integration of intelligent processing-rate control using message-based loop feedback and temporary reconfiguration that alters the data flow between functional modules within an agent. Fault recovery exploits the concept of semi-passive shadow agents, one on the local machine and the other on a remote machine, dynamic polling of machines, logically time-stamped messages to avoid message losses, and distributed archiving of the volatile part of agent state on distributed machines. Various algorithms are described.

  6. Simulating cancer growth with multiscale agent-based modeling.

    PubMed

    Wang, Zhihui; Butner, Joseph D; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S

    2015-02-01

    There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models.

  7. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.

  8. Agent-based simulation of a financial market

    NASA Astrophysics Data System (ADS)

    Raberto, Marco; Cincotti, Silvano; Focardi, Sergio M.; Marchesi, Michele

    2001-10-01

    This paper introduces an agent-based artificial financial market in which heterogeneous agents trade one single asset through a realistic trading mechanism for price formation. Agents are initially endowed with a finite amount of cash and a given finite portfolio of assets. There is no money-creation process; the total available cash is conserved in time. In each period, agents make random buy and sell decisions that are constrained by available resources, subject to clustering, and dependent on the volatility of previous periods. The model proposed herein is able to reproduce the leptokurtic shape of the probability density of log price returns and the clustering of volatility. Implemented using extreme programming and object-oriented technology, the simulator is a flexible computational experimental facility that can find applications in both academic and industrial research projects.
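
    A minimal sketch of a market of this kind, assuming matched unit trades and an excess-demand price update in place of the paper's realistic price-formation mechanism, is given below. Cash and shares are conserved as in the model, but the volatility-feedback ingredient is omitted, so the sketch should not be expected to reproduce the leptokurtic returns or volatility clustering reported above; all parameter values are illustrative.

```python
# Toy conserved-resource agent-based market: random, resource-constrained
# unit orders, matched trades, and an excess-demand price update.
import math
import random

random.seed(1)
N, T = 500, 2000          # number of agents, number of trading periods
price = 100.0
cash = [1000.0] * N
shares = [10] * N
log_returns = []

for _ in range(T):
    buyers, sellers = [], []
    for i in range(N):
        action = random.choice(("buy", "sell", "hold"))
        if action == "buy" and cash[i] >= price:
            buyers.append(i)
        elif action == "sell" and shares[i] > 0:
            sellers.append(i)
    random.shuffle(buyers)
    random.shuffle(sellers)
    # Matched unit trades keep total cash and total shares exactly conserved.
    for b, s in zip(buyers, sellers):
        cash[b] -= price
        shares[b] += 1
        cash[s] += price
        shares[s] -= 1
    old_price = price
    # Unmatched orders (excess demand) move the price.
    price *= math.exp(0.0005 * (len(buyers) - len(sellers)))
    log_returns.append(math.log(price / old_price))

mean = sum(log_returns) / T
var = sum((r - mean) ** 2 for r in log_returns) / T
kurtosis = sum((r - mean) ** 4 for r in log_returns) / (T * var ** 2)
print(f"excess kurtosis of log returns: {kurtosis - 3:.2f}")
```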

  9. Modeling civil violence: An agent-based computational approach

    PubMed Central

    Epstein, Joshua M.

    2002-01-01

    This article presents an agent-based computational model of civil violence. Two variants of the civil violence model are presented. In the first a central authority seeks to suppress decentralized rebellion. In the second a central authority seeks to suppress communal violence between two warring ethnic groups. PMID:11997450
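
    The activation rule reported for the first variant of the model (grievance versus net risk) can be sketched compactly. The version below is a simplified, non-spatial rendering: "vision" is a random sample of the population rather than a neighbourhood on a grid, cops are represented only by their expected presence in a sample, and the parameter values are illustrative rather than those of the published runs.

```python
# Non-spatial sketch of the civil violence activation rule:
# a citizen turns active when grievance minus net risk exceeds a threshold.
import math
import random

random.seed(0)
N_CITIZENS, N_COPS = 1000, 40
LEGITIMACY, THRESHOLD, K, JAIL_TERM, VISION = 0.8, 0.1, 2.3, 30, 25

citizens = [{"hardship": random.random(),
             "risk_aversion": random.random(),
             "active": False,
             "jail": 0} for _ in range(N_CITIZENS)]

def step():
    # Citizens revise their state: active if grievance - net risk > threshold.
    for c in citizens:
        if c["jail"] > 0:
            c["jail"] -= 1
            continue
        sample = random.sample(citizens, VISION)
        actives = 1 + sum(1 for s in sample if s["active"] and s["jail"] == 0)
        cops = round(N_COPS * VISION / N_CITIZENS)   # expected cops "in vision"
        arrest_prob = 1 - math.exp(-K * cops / actives)
        grievance = c["hardship"] * (1 - LEGITIMACY)
        c["active"] = (grievance - c["risk_aversion"] * arrest_prob) > THRESHOLD
    # Each cop arrests one randomly chosen active citizen, if any are visible.
    for _ in range(N_COPS):
        visible = [c for c in random.sample(citizens, VISION) if c["active"]]
        if visible:
            target = random.choice(visible)
            target["active"] = False
            target["jail"] = random.randint(1, JAIL_TERM)

for t in range(100):
    step()
    if t % 25 == 0:
        n_active = sum(c["active"] for c in citizens)
        n_jailed = sum(c["jail"] > 0 for c in citizens)
        print(f"t={t:3d}  active={n_active:4d}  jailed={n_jailed:4d}")
```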

  10. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    NASA Astrophysics Data System (ADS)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.

  11. Solution of partial differential equations by agent-based simulation

    NASA Astrophysics Data System (ADS)

    Szilagyi, Miklos N.

    2014-01-01

    The purpose of this short note is to demonstrate that partial differential equations can be quickly solved by agent-based simulation with high accuracy. There is no need for the solution of large systems of algebraic equations. This method is especially useful for quick determination of potential distributions and demonstration purposes in teaching electromagnetism.
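
    The note does not publish its exact scheme, so the sketch below only illustrates the general idea in an agent-based flavour: each interior grid node acts as a simple agent that repeatedly replaces its potential with the average of its four neighbours (a Jacobi-style local rule), which converges to the solution of Laplace's equation for the given boundary values without explicitly assembling or solving a large algebraic system.

```python
# Agents on a grid relax toward the average of their neighbours,
# converging to the Laplace solution for the prescribed boundary values.

def solve_laplace(n=50, iterations=2000, top=1.0, other=0.0):
    # Square domain: top edge held at `top` volts, remaining edges at `other`.
    grid = [[other] * n for _ in range(n)]
    grid[0] = [top] * n
    for _ in range(iterations):
        new = [row[:] for row in grid]
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # Each interior "agent" averages its four neighbours.
                new[i][j] = 0.25 * (grid[i - 1][j] + grid[i + 1][j] +
                                    grid[i][j - 1] + grid[i][j + 1])
        grid = new
    return grid

if __name__ == "__main__":
    phi = solve_laplace()
    # The continuum value at the centre is 0.25 by symmetry; the discrete
    # result should come out close to that.
    print(f"potential near the domain centre: {phi[25][25]:.4f}")
```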

  12. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation studies is describing real-world phenomena that have specific properties, especially those that operate at large scales and exhibit dynamic and complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Therefore, miniaturising world phenomena within the framework of a model in order to simulate the real phenomena is a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science and computer science. The emergence of ABM toolkits in GIS software libraries (e.g. ESRI's ArcGIS, OpenMap, GeoTools, etc.) for geospatial modelling is an indication of the growing interest of users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, it can be built easily and applied to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is the difficulty of validation and verification. Because of frequently emergent patterns, strong dynamics in the system and the complex nature of ABMS, it is hard to validate and verify ABMS by conventional validation methods. Therefore, the attempt to find appropriate validation techniques for ABM seems necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  13. High performance computing for three-dimensional agent-based molecular models.

    PubMed

    Pérez-Rodríguez, G; Pérez-Pérez, M; Fdez-Riverola, F; Lourenço, A

    2016-07-01

    Agent-based simulations are increasingly popular in exploring and understanding cellular systems, but the natural complexity of these systems and the desire to grasp different modelling levels demand cost-effective simulation strategies and tools. In this context, the present paper introduces novel sequential and distributed approaches for the three-dimensional agent-based simulation of individual molecules in cellular events. These approaches are able to describe the dimensions and position of the molecules with high accuracy and thus, study the critical effect of spatial distribution on cellular events. Moreover, two of the approaches allow multi-thread high performance simulations, distributing the three-dimensional model in a platform independent and computationally efficient way. Evaluation addressed the reproduction of molecular scenarios and different scalability aspects of agent creation and agent interaction. The three approaches simulate common biophysical and biochemical laws faithfully. The distributed approaches show improved performance when dealing with large agent populations while the sequential approach is better suited for small to medium size agent populations. Overall, the main new contribution of the approaches is the ability to simulate three-dimensional agent-based models at the molecular level with reduced implementation effort and moderate-level computational capacity. Since these approaches have a generic design, they have the major potential of being used in any event-driven agent-based tool.

  14. On agent-based modeling and computational social science

    PubMed Central

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed focusing on the role of generative theories, aiming at explaining phenomena by growing them. After a brief analysis of the major strengths of the field some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest for Computational Social Science (CSS) is focused upon, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM, reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of the CSS. PMID:25071642

  15. On agent-based modeling and computational social science.

    PubMed

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed focusing on the role of generative theories, aiming at explaining phenomena by growing them. After a brief analysis of the major strengths of the field some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest for Computational Social Science (CSS) is focused upon, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM, reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of the CSS.

  16. Agent-Based Computational Modeling of Cell Culture ...

    EPA Pesticide Factsheets

    Quantitative characterization of cellular dose in vitro is needed for alignment of doses in vitro and in vivo. We used the agent-based software, CompuCell3D (CC3D), to provide a stochastic description of cell growth in culture. The model was configured so that isolated cells assumed a “fried egg shape” but became increasingly cuboidal with increasing confluency. The surface area presented by each cell to the overlying medium varies from cell-to-cell and is a determinant of diffusional flux of toxicant from the medium into the cell. Thus, dose varies among cells for a given concentration of toxicant in the medium. Computer code describing diffusion of H2O2 from medium into each cell and clearance of H2O2 was calibrated against H2O2 time-course data (25, 50, or 75 uM H2O2 for 60 min) obtained with the Amplex Red assay for the medium and the H2O2-sensitive fluorescent reporter, HyPer, for cytosol. Cellular H2O2 concentrations peaked at about 5 min and were near baseline by 10 min. The model predicted a skewed distribution of surface areas, with between cell variation usually 2 fold or less. Predicted variability in cellular dose was in rough agreement with the variation in the HyPer data. These results are preliminary, as the model was not calibrated to the morphology of a specific cell type. Future work will involve morphology model calibration against human bronchial epithelial (BEAS-2B) cells. Our results show, however, the potential of agent-based modeling
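
    A hedged sketch of the dosimetry idea is given below: each cell takes up H2O2 at a rate proportional to the surface area it presents to the medium and clears it by a first-order process, while the medium concentration decays as the population consumes H2O2. All rate constants and the lognormal surface-area spread are illustrative placeholders, not the calibrated values described in the abstract.

```python
# Per-cell H2O2 dose driven by each cell's exposed surface area,
# with first-order intracellular clearance and decaying medium concentration.
import math
import random

random.seed(3)
N_CELLS = 200
DT, T_END = 0.1, 600.0         # seconds
K_MED = 0.008                  # lumped decay of medium H2O2 (1/s), illustrative
K_IN = 0.05                    # uptake coefficient per unit relative area (1/s)
K_CLEAR = 0.15                 # intracellular clearance (1/s), illustrative

# Skewed (lognormal) spread of exposed surface areas, normalised to mean 1.
areas = [random.lognormvariate(0.0, 0.4) for _ in range(N_CELLS)]
mean_area = sum(areas) / N_CELLS
rel_area = [a / mean_area for a in areas]

c_med = 50.0                   # initial medium concentration (uM)
c_cell = [0.0] * N_CELLS
peak = [0.0] * N_CELLS

t = 0.0
while t < T_END:
    c_med *= math.exp(-K_MED * DT)
    for i in range(N_CELLS):
        uptake = K_IN * rel_area[i] * (c_med - c_cell[i])
        c_cell[i] += DT * (uptake - K_CLEAR * c_cell[i])
        peak[i] = max(peak[i], c_cell[i])
    t += DT

print(f"medium at 10 min: {c_med:.1f} uM")
print(f"peak cellular dose: min {min(peak):.1f}, max {max(peak):.1f} uM")
```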

  17. Research on monocentric model of urbanization by agent-based simulation

    NASA Astrophysics Data System (ADS)

    Xue, Ling; Yang, Kaizhong

    2008-10-01

    Over the past years, GIS has been widely used for modeling urbanization from a variety of perspectives, such as digital terrain representation and overlay analysis using a cell-based data platform. Similarly, simulation of urban dynamics has been achieved with the use of cellular automata. In contrast to these approaches, agent-based simulation provides a much more powerful set of tools. It allows researchers to set up a counterpart of real environmental and urban systems in the computer for experimentation and scenario analysis. This paper reviews research on the economic mechanisms of urbanization, and an agent-based monocentric model is set up to further understand the urbanization process and its mechanisms in China. We build an endogenous growth model with dynamic interactions between spatial agglomeration and urban development using agent-based simulation. It simulates the migration decisions of two main types of agents, namely rural and urban households, between rural and urban areas. The model contains multiple economic interactions that are crucial in understanding the urbanization and industrialization process in China. These adaptive agents can adjust their supply and demand according to the market situation by a learning algorithm. The simulation results show that this agent-based urban model is able to perform the regeneration and to produce likely-to-occur projections of reality.

  18. Using Agent Based Modeling (ABM) to Develop Cultural Interaction Simulations

    NASA Technical Reports Server (NTRS)

    Drucker, Nick; Jones, Phillip N.

    2012-01-01

    Today, most cultural training is based on or built around "cultural engagements" or discrete interactions between the individual learner and one or more cultural "others". Often, success in the engagement is treated as the end objective. In reality, these interactions usually involve secondary and tertiary effects with potentially wide-ranging consequences. The concern is that learning culture within a strict engagement context might lead to "checklist" cultural thinking that will not empower learners to understand the full consequences of their actions. We propose the use of agent based modeling (ABM) to collect and store engagement effects and, by simulating the effects of social networks, propagate them over time, distance, and consequence. The ABM development allows for rapid modification to re-create any number of population types, extending the applicability of the model to any requirement for social modeling.

  19. Agent-based modeling to simulate the dengue spread

    NASA Astrophysics Data System (ADS)

    Deng, Chengbin; Tao, Haiyan; Ye, Zhiwei

    2008-10-01

    In this paper, we introduce a novel agent-based modeling (ABM) method for simulating the unique process of dengue spread. Dengue is an acute infectious disease with a history of over 200 years. Unlike diseases that can be transmitted directly from person to person, dengue spreads through an obligatory mosquito vector. There is still no specific effective medicine or vaccine for dengue. The best way to prevent dengue spread is to take precautions beforehand. Thus, it is crucial to detect and study the dynamic process of dengue spread, which closely relates to human-environment interactions, where ABM works effectively. The model attempts to simulate dengue spread in a more realistic, bottom-up way, and to overcome a limitation of ABM, namely overlooking the influence of geographic and environmental factors. By considering the influence of the environment, Aedes aegypti ecology, and other epidemiological characteristics of dengue spread, ABM can be regarded as a useful way to simulate the whole process so as to disclose the essence of the evolution of dengue spread.
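
    A minimal host-vector sketch of the mechanism described above is shown below: dengue passes between humans and mosquitoes only through bites, never directly from person to person. The spatial and environmental layers that the paper emphasises are omitted, and every parameter value is illustrative rather than epidemiologically calibrated.

```python
# Toy host-vector dengue model: humans are S/I/R, mosquitoes are
# susceptible/infectious, and transmission happens only through bites.
import random

random.seed(7)
N_HUMANS, N_MOSQ, DAYS = 2000, 6000, 120
BITES_PER_DAY = 0.5          # bites per mosquito per day
P_H2M, P_M2H = 0.3, 0.3      # transmission probabilities per infectious bite
HUMAN_RECOVERY = 7           # days infectious
MOSQ_LIFESPAN = 14           # days; dead mosquitoes are replaced by susceptibles

humans = [{"state": "S", "timer": 0} for _ in range(N_HUMANS)]
mosqs = [{"inf": False, "age": random.randrange(MOSQ_LIFESPAN)} for _ in range(N_MOSQ)]
humans[0]["state"], humans[0]["timer"] = "I", HUMAN_RECOVERY   # index case

for day in range(DAYS):
    for m in mosqs:
        # Ageing and replacement keeps the vector population constant.
        m["age"] += 1
        if m["age"] >= MOSQ_LIFESPAN:
            m["inf"], m["age"] = False, 0
        if random.random() < BITES_PER_DAY:
            h = random.choice(humans)
            if m["inf"] and h["state"] == "S" and random.random() < P_M2H:
                h["state"], h["timer"] = "I", HUMAN_RECOVERY
            elif not m["inf"] and h["state"] == "I" and random.random() < P_H2M:
                m["inf"] = True
    for h in humans:
        if h["state"] == "I":
            h["timer"] -= 1
            if h["timer"] <= 0:
                h["state"] = "R"

print("final recovered humans:", sum(h["state"] == "R" for h in humans))
```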

  20. Serious games experiment toward agent-based simulation

    USGS Publications Warehouse

    Wein, Anne; Labiosa, William

    2013-01-01

    We evaluate the potential for serious games to be used as a scientifically based decision-support product that supports the United States Geological Survey’s (USGS) mission--to provide integrated, unbiased scientific information that can make a substantial contribution to societal well-being for a wide variety of complex environmental challenges. Serious or pedagogical games are an engaging way to educate decisionmakers and stakeholders about environmental challenges that are usefully informed by natural and social scientific information and knowledge and can be designed to promote interactive learning and exploration in the face of large uncertainties, divergent values, and complex situations. We developed two serious games that use challenging environmental-planning issues to demonstrate and investigate the potential contributions of serious games to inform regional-planning decisions. Delta Skelta is a game emulating long-term integrated environmental planning in the Sacramento-San Joaquin Delta, California, that incorporates natural hazards (flooding and earthquakes) and consequences for California water supplies amidst conflicting water interests. Age of Ecology is a game that simulates interactions between economic and ecologic processes, as well as natural hazards while implementing agent-based modeling. The content of these games spans the USGS science mission areas related to water, ecosystems, natural hazards, land use, and climate change. We describe the games, reflect on design and informational aspects, and comment on their potential usefulness. During the process of developing these games, we identified various design trade-offs involving factual information, strategic thinking, game-winning criteria, elements of fun, number and type of players, time horizon, and uncertainty. We evaluate the two games in terms of accomplishments and limitations. Overall, we demonstrated the potential for these games to usefully represent scientific information

  1. Model reduction for agent-based social simulation: coarse-graining a civil violence model.

    PubMed

    Zou, Yu; Fonoberov, Vladimir A; Fonoberova, Maria; Mezic, Igor; Kevrekidis, Ioannis G

    2012-06-01

    Agent-based modeling (ABM) constitutes a powerful computational tool for the exploration of phenomena involving emergent dynamic behavior in the social sciences. This paper demonstrates a computer-assisted approach that bridges the significant gap between the single-agent microscopic level and the macroscopic (coarse-grained population) level, where fundamental questions must be rationally answered and policies guiding the emergent dynamics devised. Our approach will be illustrated through an agent-based model of civil violence. This spatiotemporally varying ABM incorporates interactions between a heterogeneous population of citizens [active (insurgent), inactive, or jailed] and a population of police officers. Detailed simulations exhibit an equilibrium punctuated by periods of social upheavals. We show how to effectively reduce the agent-based dynamics to a stochastic model with only two coarse-grained degrees of freedom: the number of jailed citizens and the number of active ones. The coarse-grained model captures the ABM dynamics while drastically reducing the computation time (by a factor of approximately 20).
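
    The two coarse-graining maps at the heart of such an analysis can be sketched directly: a restriction from a detailed agent configuration to the two coarse variables (number jailed, number active), and a lifting that builds a micro-state consistent with prescribed coarse values. The agent attribute layout below is hypothetical, not the authors' code.

```python
# Restriction and lifting between agent micro-states and the two
# coarse variables used in the civil violence coarse-graining.
import random

def restrict(agents):
    """Micro -> coarse: count jailed and active citizens."""
    n_jailed = sum(1 for a in agents if a["jail"] > 0)
    n_active = sum(1 for a in agents if a["jail"] == 0 and a["active"])
    return n_jailed, n_active

def lift(n_citizens, n_jailed, n_active, max_jail=30, rng=random):
    """Coarse -> micro: one random agent configuration with the prescribed counts.

    Many micro-states share the same coarse values, so lifting is one-to-many;
    equation-free computations average short simulation bursts over several
    independent liftings.
    """
    idx = list(range(n_citizens))
    rng.shuffle(idx)
    agents = [{"active": False, "jail": 0} for _ in range(n_citizens)]
    for i in idx[:n_jailed]:
        agents[i]["jail"] = rng.randint(1, max_jail)
    for i in idx[n_jailed:n_jailed + n_active]:
        agents[i]["active"] = True
    return agents

if __name__ == "__main__":
    random.seed(0)
    micro = lift(1000, n_jailed=120, n_active=35)
    print(restrict(micro))   # -> (120, 35)
```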

  2. Model reduction for agent-based social simulation: Coarse-graining a civil violence model

    NASA Astrophysics Data System (ADS)

    Zou, Yu; Fonoberov, Vladimir A.; Fonoberova, Maria; Mezic, Igor; Kevrekidis, Ioannis G.

    2012-06-01

    Agent-based modeling (ABM) constitutes a powerful computational tool for the exploration of phenomena involving emergent dynamic behavior in the social sciences. This paper demonstrates a computer-assisted approach that bridges the significant gap between the single-agent microscopic level and the macroscopic (coarse-grained population) level, where fundamental questions must be rationally answered and policies guiding the emergent dynamics devised. Our approach will be illustrated through an agent-based model of civil violence. This spatiotemporally varying ABM incorporates interactions between a heterogeneous population of citizens [active (insurgent), inactive, or jailed] and a population of police officers. Detailed simulations exhibit an equilibrium punctuated by periods of social upheavals. We show how to effectively reduce the agent-based dynamics to a stochastic model with only two coarse-grained degrees of freedom: the number of jailed citizens and the number of active ones. The coarse-grained model captures the ABM dynamics while drastically reducing the computation time (by a factor of approximately 20).

  3. Identifying Evacuees' Demand of Tsunami Shelters using Agent Based Simulation

    NASA Astrophysics Data System (ADS)

    Mas, E.; Adriano, B.; Koshimura, S.; Imamura, F.; Kuroiwa, J.; Yamazaki, F.; Zavala, C.; Estrada, M.

    2012-12-01

    Amongst the lessons learned from tsunami events such as the 2004 Indian Ocean and 2011 Great Tohoku Japan earthquakes is that nature sometimes exceeds structural countermeasures such as seawalls, breakwaters or tsunami gates. In such situations it is a challenging task for people in plain areas to find sheltering places. Vertical evacuation to multistory buildings is one alternative for providing sheltering areas in a complex evacuation environment. However, if the spatial distribution and available capacity of these structures are not well displayed, over-demand or under-demand by evacuees may be observed at several structures. In this study, we present the integration of tsunami numerical modeling and agent-based simulation of evacuation as a method to estimate the sheltering demand of evacuees in an emergent-behavior approach. The case study is set in the La Punta district of Peru. Here, we used a seismic source with a slip distribution model (Pulido et al., 2011; Chlieh et al., 2011) in the tsunami simulation for a possible future tsunami scenario in the central Andes. We modeled three evacuation alternatives. First, a horizontal evacuation scenario was analyzed to support the necessity of the sheltering-in-place option for the district. Second, a vertical evacuation scenario, and third, a combination of vertical and horizontal evacuation of pedestrians and vehicles were conducted. In the last two alternatives, evacuee demand was measured at each official tsunami evacuation building and compared to the sheltering capacity of the structure. Results showed that, out of twenty tsunami evacuation buildings, thirteen were over-demanded and seven still had available space. It was also confirmed that in this case horizontal evacuation might lead to a high number of casualties due to traffic congestion at the neck of the district. Finally, vertical evacuation would be a suitable solution for this area.

  4. Graceful Failure and Societal Resilience Analysis Via Agent-Based Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Schopf, P. S.; Cioffi-Revilla, C.; Rogers, J. D.; Bassett, J.; Hailegiorgis, A. B.

    2014-12-01

    Agent-based social modeling is opening up new methodologies for the study of societal response to weather and climate hazards, and providing measures of resiliency that can be studied in many contexts, particularly in coupled human and natural-technological systems (CHANTS). Since CHANTS are complex adaptive systems, societal resiliency may or may not occur, depending on dynamics that lack closed-form solutions. Agent-based modeling has been shown to provide a viable theoretical and methodological approach for analyzing and understanding disasters and societal resiliency in CHANTS. Our approach advances the science of societal resilience through computational modeling and simulation methods that complement earlier statistical and mathematical approaches. We present three case studies of social dynamics modeling that demonstrate the use of these agent-based models. In Central Asia, we examine multiple ensemble simulations with varying climate statistics to see how droughts and zuds affect populations, transmission of wealth across generations, and the overall structure of the social system. In Eastern Africa, we explore how successive episodes of drought affect the adaptive capacity of rural households. Human displacement, mainly rural-to-urban migration, and livelihood transitions, particularly from pastoralism to farming, are observed as rural households interact dynamically with the biophysical environment and continually adjust their behavior to accommodate changes in climate. In the far-north case we demonstrate one of the first successful attempts to model the complete climate-permafrost-infrastructure-societal interaction network as a complex adaptive system/CHANTS, implemented as a "federated" agent-based model using evolutionary computation. Analysis of population changes resulting from extreme weather across these and other cases provides evidence for the emergence of new steady states and shifting patterns of resilience.

  5. Agent-based Approaches to Dynamic Team Simulation

    DTIC Science & Technology

    2008-09-01

    behavior. The second section reviews agent-based models of teamwork describing work involving both teamwork approaches to design of multiagent systems...there is less direct evidence for teams. Hough (1992), for example, found that ratings on conscientiousness, emotional stability, and agreeableness...Peeters, Rutte, Tuijl, and Reymen (2006) who found agreeableness and emotional stability positively related to satisfaction with the team make

  6. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407

  7. An equation-free approach to agent-based computation: Bifurcation analysis and control of stationary states

    NASA Astrophysics Data System (ADS)

    Siettos, C. I.; Gear, C. W.; Kevrekidis, I. G.

    2012-08-01

    We show how the equation-free approach can be exploited to enable agent-based simulators to perform system-level computations such as bifurcation, stability analysis and controller design. We illustrate these tasks through an event-driven agent-based model describing the dynamic behaviour of many interacting investors in the presence of mimesis. Using short bursts of appropriately initialized runs of the detailed, agent-based simulator, we construct the coarse-grained bifurcation diagram of the (expected) density of agents and investigate the stability of its multiple solution branches. When the mimetic coupling between agents becomes strong enough, the stable stationary state loses its stability at a coarse turning point bifurcation. We also demonstrate how the framework can be used to design a wash-out dynamic controller that stabilizes open-loop unstable stationary states even under model uncertainty.
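
    The coarse time-stepper idea can be illustrated with a toy micro-model: lift a coarse value to consistent agent states, run a short burst of the microscopic simulator, restrict back to the coarse variable, and locate coarse stationary states by root-finding on Phi(u) - u without ever writing a closed macroscopic equation. The imitation micro-rule below is a stand-in toy, not the mimetic investor model of the paper, and all parameters are illustrative.

```python
# Equation-free coarse time-stepper: lift -> short micro burst -> restrict,
# with a secant iteration on Phi(u) - u to find a coarse stationary state.
import random

random.seed(2)
N = 5000            # number of agents in the toy micro-model
COUPLING = 2.5      # strength of imitation in the toy micro-rule

def micro_step(states):
    """One synchronous update: each agent switches on/off with a rate that
    depends on the current fraction of 'on' agents (imitation)."""
    frac = sum(states) / len(states)
    p_on = 0.05 + COUPLING * 0.1 * frac
    p_off = 0.15
    return [(1 if random.random() < p_on else 0) if s == 0
            else (0 if random.random() < p_off else 1)
            for s in states]

def lift(u):
    """Coarse fraction u -> a random micro-state with that expected fraction."""
    return [1 if random.random() < u else 0 for _ in range(N)]

def restrict(states):
    return sum(states) / len(states)

def coarse_timestepper(u, burst=10, copies=8):
    """Phi(u): average the restricted result of short bursts over several liftings."""
    out = 0.0
    for _ in range(copies):
        s = lift(u)
        for _ in range(burst):
            s = micro_step(s)
        out += restrict(s)
    return out / copies

# Secant iteration on psi(u) = Phi(u) - u to locate a coarse stationary state.
u0, u1 = 0.1, 0.3
f0 = coarse_timestepper(u0) - u0
for _ in range(8):
    f1 = coarse_timestepper(u1) - u1
    if abs(f1 - f0) < 1e-9:
        break
    u_next = u1 - f1 * (u1 - u0) / (f1 - f0)
    u0, f0, u1 = u1, f1, min(max(u_next, 0.0), 1.0)
print(f"coarse stationary fraction of 'on' agents: {u1:.3f}")
```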

  8. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    ERIC Educational Resources Information Center

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  9. AN AGENT-BASED SIMULATION STUDY OF A COMPLEX ADAPTIVE COLLABORATION NETWORK

    SciTech Connect

    Ozmen, Ozgur; Smith, Jeffrey; Yilmaz, Levent

    2013-01-01

    One of the most significant problems in organizational scholarship is to discern how social collectives govern, organize, and coordinate the actions of individuals to achieve collective outcomes. These collectives are usually interpreted as complex adaptive systems (CAS). An understanding of CAS is more likely to arise with the help of computer-based simulations. In this tutorial, a complex adaptive social communication network model is introduced using an agent-based modeling approach. The objective is to present the underlying dynamics of the system in the form of a computer simulation that enables analyzing the impacts of various mechanisms on network topologies and emergent behaviors. The ultimate goal is to further our understanding of the dynamics in the system and facilitate developing informed policies for decision-makers.

  10. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery

    PubMed Central

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level. PMID:26963526

  11. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery.

    PubMed

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level.

  12. Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks.

    PubMed

    Walpole, J; Chappell, J C; Cluceru, J G; Mac Gabhann, F; Bautch, V L; Peirce, S M

    2015-09-01

    Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods.

  13. An Agent-Based Epidemic Simulation of Social Behaviors Affecting HIV Transmission among Taiwanese Homosexuals

    PubMed Central

    2015-01-01

    Computational simulations are currently used to identify epidemic dynamics, to test potential prevention and intervention strategies, and to study the effects of social behaviors on HIV transmission. The author describes an agent-based epidemic simulation model of a network of individuals who participate in high-risk sexual practices, using number of partners, condom usage, and relationship length to distinguish between high- and low-risk populations. Two new concepts—free links and fixed links—are used to indicate tendencies among individuals who either have large numbers of short-term partners or stay in long-term monogamous relationships. An attempt was made to reproduce epidemic curves of reported HIV cases among male homosexuals in Taiwan prior to using the agent-based model to determine the effects of various policies on epidemic dynamics. Results suggest that when suitable adjustments are made based on available social survey statistics, the model accurately simulates real-world behaviors on a large scale. PMID:25815047

  14. Agent Based Simulation Design for Aggregation and Disaggregation

    DTIC Science & Technology

    2011-12-01

    "Development of a Generic Data-Driven Simulation." In Proceedings of the 2010 Winter Simulation Conference, edited by B. Johansson, S. Jain, J. Montoya ...DARPA, Santa Monica, CA. Davis, P. and R. Hillestad. 1993. "Families of Models that Cross Levels of Resolution: Issues for Design, Calibration and...Issues, and Principles." RAND N-3400-DARPA, Santa Monica, CA. Department of Defense. 1995. "Department of Defense Modeling and Simulation Master

  15. Automated multi-objective calibration of biological agent-based simulations.

    PubMed

    Read, Mark N; Alden, Kieran; Rose, Louis M; Timmis, Jon

    2016-09-01

    Computational agent-based simulation (ABS) is increasingly used to complement laboratory techniques in advancing our understanding of biological systems. Calibration, the identification of parameter values that align simulation with biological behaviours, becomes challenging as increasingly complex biological domains are simulated. Complex domains cannot be characterized by single metrics alone, rendering simulation calibration a fundamentally multi-metric optimization problem that typical calibration techniques cannot handle. Yet calibration is an essential activity in simulation-based science; the baseline calibration forms a control for subsequent experimentation and hence is fundamental in the interpretation of results. Here, we develop and showcase a method, built around multi-objective optimization, for calibrating ABSs against complex target behaviours requiring several metrics (termed objectives) to characterize. Multi-objective calibration (MOC) delivers those sets of parameter values representing optimal trade-offs in simulation performance against each metric, in the form of a Pareto front. We use MOC to calibrate a well-understood immunological simulation against both established a priori and previously unestablished target behaviours. Furthermore, we show that simulation-borne conclusions are broadly, but not entirely, robust to adopting baseline parameter values from different extremes of the Pareto front, highlighting the importance of MOC's identification of numerous calibration solutions. We devise a method for detecting overfitting in a multi-objective context, not previously possible, used to save computational effort by terminating MOC when no improved solutions will be found. MOC can significantly impact biological simulation, adding rigour to and speeding up an otherwise time-consuming calibration process and highlighting inappropriate biological capture by simulations that cannot be well calibrated. As such, it produces more accurate
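
    The core of the idea, Pareto-dominance filtering of candidate parameter sets scored on several error metrics, can be sketched briefly. A real MOC run evolves candidates with a multi-objective optimizer rather than sampling them at random, and scores them by running the agent-based simulation; the toy objective functions below merely stand in for those simulation-versus-data discrepancies.

```python
# Pareto-dominance filtering: keep the non-dominated trade-offs between
# two calibration error metrics over a sample of candidate parameter sets.
import random

random.seed(5)

def objectives(params):
    """Stand-in for 'run the simulation with params and measure the discrepancy
    against two target behaviours'. Lower is better for both."""
    a, b = params
    err_metric_1 = (a - 0.3) ** 2 + 0.1 * abs(b - 0.7)
    err_metric_2 = (b - 0.2) ** 2 + 0.1 * abs(a - 0.8)
    return err_metric_1, err_metric_2

def dominates(f, g):
    """f dominates g if it is no worse in every objective and better in at least one."""
    return all(fi <= gi for fi, gi in zip(f, g)) and any(fi < gi for fi, gi in zip(f, g))

# Evaluate a random sample of candidate parameter sets.
candidates = [(random.random(), random.random()) for _ in range(500)]
scored = [(p, objectives(p)) for p in candidates]

# Keep the non-dominated candidates: the approximate Pareto front.
front = [(p, f) for p, f in scored
         if not any(dominates(g, f) for _, g in scored if g != f)]

front.sort(key=lambda pf: pf[1][0])
print(f"{len(front)} non-dominated calibrations out of {len(candidates)}")
for p, f in front[:5]:
    print(f"params=({p[0]:.2f}, {p[1]:.2f})  objectives=({f[0]:.3f}, {f[1]:.3f})")
```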

  16. Learning Natural Selection in 4th Grade with Multi-Agent-Based Computational Models

    ERIC Educational Resources Information Center

    Dickes, Amanda Catherine; Sengupta, Pratim

    2013-01-01

    In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem, through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these…

  17. Learning Natural Selection in 4th Grade with Multi-Agent-Based Computational Models

    NASA Astrophysics Data System (ADS)

    Dickes, Amanda Catherine; Sengupta, Pratim

    2013-06-01

    In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem, through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these agents obey simple rules assigned or manipulated by the user (e.g., speeding up, slowing down, etc.). It is the interactions between these agents, based on the rules assigned by the user, that give rise to emergent, aggregate-level behavior (e.g., formation and movement of a traffic jam). Natural selection is such an emergent phenomenon, which has been shown to be challenging for novices (K-16 students) to understand. Whereas prior research on learning evolutionary phenomena with MABMs has typically focused on high school students and beyond, we investigate how elementary students (4th graders) develop multi-level explanations of some introductory aspects of natural selection—species differentiation and population change—through scaffolded interactions with an MABM that simulates predator-prey dynamics in a simple birds-butterflies ecosystem. We conducted a semi-clinical interview based study with ten participants, in which we focused on the following: a) identifying the nature of learners' initial interpretations of salient events or elements of the represented phenomena, b) identifying the roles these interpretations play in the development of their multi-level explanations, and c) identifying how attending to different levels of the relevant phenomena can make explicit different mechanisms to the learners. In addition, our analysis shows that although there were differences between high- and low-performing students (in terms of being able to explain population-level behaviors) in the pre-test, these differences disappeared in the post-test.

  18. An agent-based computational model for tuberculosis spreading on age-structured populations

    NASA Astrophysics Data System (ADS)

    Graciani Rodrigues, C. C.; Espíndola, Aquino L.; Penna, T. J. P.

    2015-06-01

    In this work we present an agent-based computational model to study the spreading of the tuberculosis (TB) disease in age-structured populations. The model proposed is a merger of two previous models: an agent-based computational model for the spreading of tuberculosis and a bit-string model for biological aging. The combination of TB with population aging reproduces the coexistence of health states seen in real populations. In addition, the universal exponential behavior of mortality curves is still preserved. Finally, the population distribution as a function of age shows that the prevalence of TB is mostly in the elderly, for high-efficacy treatments.
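
    A compact sketch of the bit-string (Penna-type) aging component that such a combined model builds on is given below: each agent carries a genome bit-string, bit i switching on a deleterious mutation at age i, death occurring once the number of active mutations reaches a threshold, and offspring inheriting the parent's genome with fresh random mutations. The TB infection layer of the paper is omitted, and all parameter values are illustrative.

```python
# Minimal bit-string aging model: deleterious mutations switch on with age,
# death occurs at a mutation threshold, and genomes are inherited with new flips.
import random

random.seed(11)
GENOME_BITS = 32
THRESHOLD = 3        # active deleterious mutations that kill
REPRO_AGE = 8        # minimum age for reproduction
BIRTHS = 1           # offspring per parent per step
MUTATIONS = 1        # new random bit flips per offspring
CAPACITY = 20_000    # Verhulst population cap

def new_agent(genome=0):
    return {"age": 0, "genome": genome}

def active_mutations(agent):
    # Mutations whose onset age (bit position) has already been reached.
    mask = (1 << min(agent["age"], GENOME_BITS)) - 1
    return bin(agent["genome"] & mask).count("1")

population = [new_agent() for _ in range(1000)]

for step in range(200):
    survivors, newborns = [], []
    for a in population:
        a["age"] += 1
        # Verhulst factor: random death more likely as population nears capacity.
        if active_mutations(a) >= THRESHOLD or random.random() < len(population) / CAPACITY:
            continue
        survivors.append(a)
        if a["age"] >= REPRO_AGE:
            for _ in range(BIRTHS):
                genome = a["genome"]
                for _ in range(MUTATIONS):
                    genome |= 1 << random.randrange(GENOME_BITS)  # deleterious-only flips
                newborns.append(new_agent(genome))
    population = survivors + newborns
    if not population:
        break

if population:
    ages = [a["age"] for a in population]
    print(f"population {len(population)}, mean age {sum(ages) / len(ages):.1f}")
else:
    print("population went extinct")
```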

  19. Method for distributed agent-based non-expert simulation of manufacturing process behavior

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2004-11-30

    A method for distributed agent-based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each process; and programming each agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
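
    The claim structure lends itself to a short illustration: one agent per manufacturing process, each programmed to respond to the three kinds of discrete events named above (a clock tick message, a resources received message, and a request for output production message) delivered through a message loop. The sketch below is an illustration of that idea only, with hypothetical process names and cycle times, not the patented implementation.

```python
# One agent per process; each discrete event in the message loop triggers
# a programmed response (clock tick, resources received, request for output).
from collections import deque

class ProcessAgent:
    def __init__(self, name, cycle_time):
        self.name = name
        self.cycle_time = cycle_time   # ticks needed to turn one input into one output
        self.inventory = 0             # received input units
        self.progress = 0
        self.finished = 0

    def handle(self, message, payload=None):
        # Each discrete event triggers a programmed response.
        if message == "clock_tick":
            if self.inventory > 0:
                self.progress += 1
                if self.progress >= self.cycle_time:
                    self.inventory -= 1
                    self.finished += 1
                    self.progress = 0
        elif message == "resources_received":
            self.inventory += payload
        elif message == "request_output":
            shipped, self.finished = self.finished, 0
            return shipped
        return None

if __name__ == "__main__":
    machining = ProcessAgent("machining", cycle_time=3)
    assembly = ProcessAgent("assembly", cycle_time=2)
    agents = [machining, assembly]

    loop = deque([("machining", "resources_received", 5)])
    for tick in range(20):
        loop.append(("*", "clock_tick", None))
        # Every few ticks, downstream assembly pulls whatever machining has finished.
        if tick % 3 == 2:
            loop.append(("machining", "request_output", None))
        while loop:
            target, msg, payload = loop.popleft()
            for agent in agents:
                if target in ("*", agent.name):
                    result = agent.handle(msg, payload)
                    if msg == "request_output" and result:
                        loop.append(("assembly", "resources_received", result))
    print("assembled units:", assembly.finished)
```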

  20. Deriving effective vaccine allocation strategies for pandemic influenza: Comparison of an agent-based simulation and a compartmental model

    PubMed Central

    Dalgıç, Özden O.; Özaltın, Osman Y.; Ciccotelli, William A.; Erenay, Fatih S.

    2017-01-01

    Individuals are prioritized based on their risk profiles when allocating limited vaccine stocks during an influenza pandemic. Computationally expensive but realistic agent-based simulations and fast but stylized compartmental models are typically used to derive effective vaccine allocation strategies. A detailed comparison of these two approaches, however, is often omitted. We derive age-specific vaccine allocation strategies to mitigate a pandemic influenza outbreak in Seattle by applying derivative-free optimization to an agent-based simulation and also to a compartmental model. We compare the strategies derived by these two approaches under various infection aggressiveness and vaccine coverage scenarios. We observe that both approaches primarily vaccinate school children, however they may allocate the remaining vaccines in different ways. The vaccine allocation strategies derived by using the agent-based simulation are associated with up to 70% decrease in total cost and 34% reduction in the number of infections compared to the strategies derived by using the compartmental model. Nevertheless, the latter approach may still be competitive for very low and/or very high infection aggressiveness. Our results provide insights about potential differences between the vaccine allocation strategies derived by using agent-based simulations and those derived by using compartmental models. PMID:28222123
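
    The compartmental side of such a comparison can be sketched with a two-age-group SIR model in which a fixed vaccine stock is split between school children and adults before the outbreak, and the split is chosen by a derivative-free grid search over the resulting attack rate. Contact rates, group sizes, efficacy and stock size below are illustrative, not the Seattle-calibrated values used in the study.

```python
# Two-age-group SIR model with pre-outbreak vaccination and a
# derivative-free grid search over the child/adult allocation split.

POP = [0.25, 0.75]          # fraction of population: children, adults
CONTACT = [[18.0, 6.0],     # contacts per day: rows = infectee group,
           [6.0, 10.0]]     # cols = infector group (illustrative mixing matrix)
BETA = 0.025                # transmission probability per contact
GAMMA = 1 / 4.0             # recovery rate (1/day)
EFFICACY = 0.7
STOCK = 0.30                # vaccine doses as a fraction of the whole population
DT, DAYS = 0.1, 200

def attack_rate(child_share):
    """Integrate the SIR model for one allocation and return the overall attack rate."""
    doses = [STOCK * child_share, STOCK * (1 - child_share)]
    s = [max(POP[g] - EFFICACY * min(doses[g], POP[g]), 0.0) for g in range(2)]
    i = [1e-4 * POP[g] for g in range(2)]
    r = [0.0, 0.0]
    for _ in range(int(DAYS / DT)):
        force = [BETA * sum(CONTACT[g][h] * i[h] / POP[h] for h in range(2))
                 for g in range(2)]
        for g in range(2):
            new_inf = force[g] * s[g] * DT
            new_rec = GAMMA * i[g] * DT
            s[g] -= new_inf
            i[g] += new_inf - new_rec
            r[g] += new_rec
    return sum(r)

# Derivative-free search: evaluate a grid of allocation splits and keep the best.
best = min((attack_rate(x / 20), x / 20) for x in range(21))
print(f"best child share of the stock: {best[1]:.2f}, attack rate: {best[0]:.3f}")
```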

  1. A Scaffolding Framework to Support Learning of Emergent Phenomena Using Multi-Agent-Based Simulation Environments

    NASA Astrophysics Data System (ADS)

    Basu, Satabdi; Sengupta, Pratim; Biswas, Gautam

    2015-04-01

    Students from middle school to college have difficulties in interpreting and understanding complex systems such as ecological phenomena. Researchers have suggested that students experience difficulties in reconciling the relationships between individuals, populations, and species, as well as the interactions between organisms and their environment in the ecosystem. Multi-agent-based computational models (MABMs) can explicitly capture agents and their interactions by representing individual actors as computational objects with assigned rules. As a result, the collective aggregate-level behavior of the population dynamically emerges from simulations that generate the aggregation of these interactions. Past studies have used a variety of scaffolds to help students learn ecological phenomena. Yet, there is no theoretical framework that supports the systematic design of scaffolds to aid students' learning in MABMs. Our paper addresses this issue by proposing a comprehensive framework for the design, analysis, and evaluation of scaffolding to support students' learning of ecology in an MABM. We present a study in which middle school students used an MABM to investigate and learn about a desert ecosystem. We identify the different types of scaffolds needed to support inquiry learning activities in this simulation environment and use our theoretical framework to demonstrate the effectiveness of our scaffolds in helping students develop a deep understanding of the complex ecological behaviors represented in the simulation.

  2. Agent-Based Knowledge Discovery for Modeling and Simulation

    SciTech Connect

    Haack, Jereme N.; Cowell, Andrew J.; Marshall, Eric J.; Fligg, Alan K.; Gregory, Michelle L.; McGrath, Liam R.

    2009-09-15

    This paper describes an approach to using agent technology to extend the automated discovery mechanism of the Knowledge Encapsulation Framework (KEF). KEF is a suite of tools to enable the linking of knowledge inputs (relevant, domain-specific evidence) to modeling and simulation projects, as well as other domains that require an effective collaborative workspace for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a semantic wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  3. A Conceptual Framework for Representing Human Behavior Characteristics in a System of Systems Agent-Based Survivability Simulation

    DTIC Science & Technology

    2010-11-22


  4. Efficient Allocation of Resources for Defense of Spatially Distributed Networks Using Agent-Based Simulation.

    PubMed

    Kroshl, William M; Sarkani, Shahram; Mazzuchi, Thomas A

    2015-09-01

    This article presents ongoing research that focuses on efficient allocation of defense resources to minimize the damage inflicted on a spatially distributed physical network such as a pipeline, water system, or power distribution system from an attack by an active adversary, recognizing the fundamental difference between preparing for natural disasters such as hurricanes, earthquakes, or even accidental systems failures and the problem of allocating resources to defend against an opponent who is aware of, and anticipating, the defender's efforts to mitigate the threat. Our approach is to utilize a combination of integer programming and agent-based modeling to allocate the defensive resources. We conceptualize the problem as a Stackelberg "leader follower" game where the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, which then informs an evolutionary agent-based simulation. The evolutionary agent-based simulation is used to determine the allocation of resources for attackers and defenders that results in evolutionary stable strategies, where actions by either side alone cannot increase its share of victories. We demonstrate these techniques on an example network, comparing the evolutionary agent-based results to a more traditional, probabilistic risk analysis (PRA) approach. Our results show that the agent-based approach results in a greater percentage of defender victories than does the PRA-based approach.
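
    The Stackelberg leader-follower structure can be illustrated on a toy network: the defender first hardens a few arcs, then the attacker inspects that choice and strikes the most damaging undefended arc; with a handful of arcs the defender's minimax choice is found by brute-force enumeration. The study itself uses integer programming for interdiction and an evolutionary agent-based simulation on realistic networks; the arc names and damage values below are hypothetical.

```python
# Toy Stackelberg defense: the leader hardens arcs, the follower then
# attacks the most damaging undefended arc; the leader minimises that damage.
from itertools import combinations

# Damage inflicted if each (undefended) arc is successfully attacked.
ARC_DAMAGE = {
    "pump->junction1": 9.0,
    "junction1->city": 14.0,
    "junction1->refinery": 6.0,
    "pump->junction2": 7.0,
    "junction2->city": 11.0,
}
DEFENSE_BUDGET = 2   # number of arcs the defender can harden

def attacker_best_response(defended):
    """Follower: attack the most damaging arc left undefended."""
    undefended = {a: d for a, d in ARC_DAMAGE.items() if a not in defended}
    if not undefended:
        return None, 0.0
    arc = max(undefended, key=undefended.get)
    return arc, undefended[arc]

# Leader: enumerate defenses and keep the one minimising the attacker's best damage.
best_defense, best_damage = None, float("inf")
for defended in combinations(ARC_DAMAGE, DEFENSE_BUDGET):
    _, damage = attacker_best_response(set(defended))
    if damage < best_damage:
        best_defense, best_damage = defended, damage

print("defend:", best_defense)
print("attacker's best remaining damage:", best_damage)
```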

  5. An agent-based simulation of extirpation of Ceratitis capitata applied to invasions in California

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We describe and validate an Agent-Based Simulation(ABS) of invasive insects and use it to investigate the time to extirpation of Ceratitis capitata using data from seven outbreaks that occurred in California from 2008-2010. Results are compared with the length of intervention and quarantine imposed ...

  6. UAV Swarm Tactics: An Agent-Based Simulation and Markov Process Analysis

    DTIC Science & Technology

    2013-06-01

    Naval Postgraduate School thesis, Monterey, California; author: Uwe Gaertner.

  7. Applying GIS and high performance agent-based simulation for managing an Old World Screwworm fly invasion of Australia.

    PubMed

    Welch, M C; Kwan, P W; Sajeev, A S M

    2014-10-01

    Agent-based modelling has proven to be a promising approach for developing rich simulations of complex phenomena that provide decision support functions across a broad range of areas including the biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and the dispersal of gravid females. Through this model, a highly efficient computational platform has been developed for studying the effectiveness of control and mitigation strategies and their associated economic impact on livestock industries.

  8. A Participatory Agent-Based Simulation for Indoor Evacuation Supported by Google Glass

    PubMed Central

    Sánchez, Jesús M.; Carrera, Álvaro; Iglesias, Carlos Á.; Serrano, Emilio

    2016-01-01

    Indoor evacuation systems are needed for rescue and safety management. One of the challenges is to provide users with personalized evacuation routes in real time. To this end, this project aims at exploring the possibilities of Google Glass technology for participatory multiagent indoor evacuation simulations. Participatory multiagent simulation combines scenario-guided agents and humans equipped with Google Glass that coexist in a shared virtual space and jointly perform simulations. The paper proposes an architecture for participatory multiagent simulation in order to combine devices (Google Glass and/or smartphones) with an agent-based social simulator and indoor tracking services. PMID:27563911

  9. A Participatory Agent-Based Simulation for Indoor Evacuation Supported by Google Glass.

    PubMed

    Sánchez, Jesús M; Carrera, Álvaro; Iglesias, Carlos Á; Serrano, Emilio

    2016-08-24

    Indoor evacuation systems are needed for rescue and safety management. One of the challenges is to provide users with personalized evacuation routes in real time. To this end, this project aims at exploring the possibilities of Google Glass technology for participatory multiagent indoor evacuation simulations. Participatory multiagent simulation combines scenario-guided agents and humans equipped with Google Glass that coexist in a shared virtual space and jointly perform simulations. The paper proposes an architecture for participatory multiagent simulation in order to combine devices (Google Glass and/or smartphones) with an agent-based social simulator and indoor tracking services.

  10. Comparing stochastic differential equations and agent-based modelling and simulation for early-stage cancer.

    PubMed

    Figueredo, Grazziela P; Siebers, Peer-Olaf; Owen, Markus R; Reps, Jenna; Aickelin, Uwe

    2014-01-01

    There is great potential to be explored regarding the use of agent-based modelling and simulation as an alternative paradigm to investigate early-stage cancer interactions with the immune system. It does not suffer from some limitations of ordinary differential equation models, such as the lack of stochasticity, the representation of aggregates rather than individual behaviours, and the absence of individual memory. In this paper we investigate the potential contribution of agent-based modelling and simulation when contrasted with stochastic versions of ODE models using early-stage cancer examples. We seek answers to the following questions: (1) Does this new stochastic formulation produce similar results to the agent-based version? (2) Can these methods be used interchangeably? (3) Do agent-based model outcomes reveal any benefit when compared to the Gillespie results? To answer these research questions, we investigate three well-established mathematical models describing interactions between tumour cells and immune elements. These case studies were re-conceptualised under an agent-based perspective and also converted to the Gillespie algorithm formulation. Our interest in this work, therefore, is to establish a methodological discussion regarding the usability of different simulation approaches, rather than provide further biological insights into the investigated case studies. Our results show that it is possible to obtain equivalent models that implement the same mechanisms; however, the incapacity of the Gillespie algorithm to retain individual memory of past events affects the similarity of some results. Furthermore, the emergent behaviour of ABMS produces extra patterns of behaviour in the system, which were not obtained by the Gillespie algorithm.
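    For reference, a minimal Gillespie direct-method sketch of the kind of stochastic simulation the agent-based models are compared against, applied to a generic single-species birth-death process with crowding; the propensities and rate constants are illustrative, not those of the paper's tumour-immune case studies.

```python
# Minimal Gillespie (stochastic simulation algorithm) direct method, assumed rates.
import math
import random

def gillespie(x0=50, birth=1.0, death=0.02, t_end=10.0):
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_end:
        a_birth = birth * x            # propensity of X -> X + 1
        a_death = death * x * x        # propensity of X -> X - 1 (crowding)
        a_total = a_birth + a_death
        if a_total == 0:
            break
        # Exponentially distributed waiting time to the next reaction.
        t += -math.log(1.0 - random.random()) / a_total
        # Choose which reaction fires, proportionally to its propensity.
        x += 1 if random.random() < a_birth / a_total else -1
        trajectory.append((t, x))
    return trajectory

print(gillespie()[-1])
```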

  11. Biophysically realistic filament bending dynamics in agent-based biological simulation.

    PubMed

    Alberts, Jonathan B

    2009-01-01

    An appealing tool for study of the complex biological behaviors that can emerge from networks of simple molecular interactions is an agent-based, computational simulation that explicitly tracks small-scale local interactions--following thousands to millions of states through time. For many critical cell processes (e.g. cytokinetic furrow specification, nuclear centration, cytokinesis), the flexible nature of cytoskeletal filaments is likely to be critical. Any computer model that hopes to explain the complex emergent behaviors in these processes therefore needs to encode filament flexibility in a realistic manner. Here I present a numerically convenient and biophysically realistic method for modeling cytoskeletal filament flexibility in silico. Each cytoskeletal filament is represented by a series of rigid segments linked end-to-end in series with a variable attachment point for the translational elastic element. This connection scheme allows an empirical tuning, for a wide range of segment sizes, viscosities, and time-steps, that endows any filament species with the experimentally observed (or theoretically expected) static force deflection, relaxation time-constant, and thermal writhing motions. I additionally employ a unique pair of elastic elements--one representing the axial and the other the bending rigidity--that formulates the restoring force in terms of single time-step constraint resolution. This method is highly local--adjacent rigid segments of a filament only interact with one another through constraint forces--and is thus well-suited to simulations in which arbitrary additional forces (e.g. those representing interactions of a filament with other bodies or cross-links/entanglements between filaments) may be present. Implementation in code is straightforward; Java source code is available at www.celldynamics.org.

  12. iCrowd: agent-based behavior modeling and crowd simulator

    NASA Astrophysics Data System (ADS)

    Kountouriotis, Vassilios I.; Paterakis, Manolis; Thomopoulos, Stelios C. A.

    2016-05-01

    Initially designed in the context of the TASS (Total Airport Security System) FP-7 project, the Crowd Simulation platform developed by the Integrated Systems Lab of the Institute of Informatics and Telecommunications at N.C.S.R. Demokritos has evolved into a complete domain-independent agent-based behavior simulator with an emphasis on crowd behavior and building evacuation simulation. Under continuous development, it reflects an effort to implement a modern, multithreaded, data-oriented simulation engine employing the latest state-of-the-art programming technologies and paradigms. It is based on an extensible architecture that separates core services from the individual layers of agent behavior, offering a concrete simulation kernel designed for high performance and stability. Its primary goal is to deliver an abstract platform to facilitate the implementation of several Agent-Based Simulation solutions with applicability in several domains of knowledge, such as: (i) Crowd behavior simulation during [in/out]door evacuation. (ii) Non-Player Character AI for Game-oriented applications and Gamification activities. (iii) Vessel traffic modeling and simulation for Maritime Security and Surveillance applications. (iv) Urban and Highway Traffic and Transportation Simulations. (v) Social Behavior Simulation and Modeling.

  13. Applications of agent-based simulation for human socio-cultural behavior modeling.

    PubMed

    Jiang, Hong; Karwowski, Waldemar; Ahram, Tareq Z

    2012-01-01

    Agent-based modeling and simulation (ABMS) has gained wide attention over the past few years. ABMS is a powerful simulation modeling technique that has a number of applications, including applications to real-world business problems [1]. This modeling technique has been used by scientists to analyze complex system-level behavior by simulating the system from the bottom up. The major applications of ABMS include the social, political, biological, and economic sciences. This paper provides an overview of ABMS applications with an emphasis on modeling human socio-cultural behavior (HSCB).

  14. Collaborative Multi-Agent Based Simulations: Stakeholder-Focused Innovation in Water Resources Management and Decision-Support Modeling

    NASA Astrophysics Data System (ADS)

    Kock, B. E.

    2006-12-01

    The combined use of multi-agent based simulations and collaborative modeling approaches is emerging as a highly effective tool for representing complex coupled social-biophysical water resource systems. A collaboratively-designed, multi-agent based simulation can be used both as a decision-support tool and as a didactic method for improving stakeholder understanding and engagement with water resources policymaking and management. Major technical and non-technical obstacles remain to the efficient and effective development of multi-agent models of human society, to integrating these models with GIS and other numerical models, and to building a process for engaging stakeholders with model design, implementation and use. It is proposed here to tackle some of these obstacles through a collaborative multi-agent based simulation process framework, intended for practical use in resolving disputes and environmental challenges over sustainable irrigated agriculture in the Western United States. A practical implementation of this framework will be conducted in collaboration with a diverse stakeholder group representing farmers and local, state and federal water managers. Through the use of simulation gaming, interviewing and computer-based knowledge elicitation, a multi-agent model representing local and regional social dynamics will be developed to support the acceptable and sustainable implementation of management alternatives for reducing regional problems of salinization and high selenium concentrations in soils and irrigation water. The development of a socially and scientifically credible simulation platform in this setting can make a significant contribution to ensuring the non-adversarial use of high quality science, enhance the engagement of stakeholders with policymaking, and help meet the challenges of integrating dynamic models of human society with more traditional biophysical systems models.

  15. An Agent-Based Model of New Venture Creation: Conceptual Design for Simulating Entrepreneurship

    NASA Technical Reports Server (NTRS)

    Provance, Mike; Collins, Andrew; Carayannis, Elias

    2012-01-01

    There is a growing debate over the means by which regions can foster the growth of entrepreneurial activity in order to stimulate recovery and growth of their economies. On one side, agglomeration theory suggests the regions grow because of strong clusters that foster knowledge spillover locally; on the other side, the entrepreneurial action camp argues that innovative business models are generated by entrepreneurs with unique market perspectives who draw on knowledge from more distant domains. We will show you the design for a novel agent-based model of new venture creation that will demonstrate the relationship between agglomeration and action. The primary focus of this model is information exchange as the medium for these agent interactions. Our modeling and simulation study proposes to reveal interesting relationships in these perspectives, offer a foundation on which these disparate theories from economics and sociology can find common ground, and expand the use of agent-based modeling into entrepreneurship research.

  16. Using an agent-based model to simulate children’s active travel to school

    PubMed Central

    2013-01-01

    Background Despite the multiple advantages of active travel to school, only a small percentage of US children and adolescents walk or bicycle to school. Intervention studies are in a relatively early stage and evidence of their effectiveness over long periods is limited. The purpose of this study was to illustrate the utility of agent-based models in exploring how various policies may influence children’s active travel to school. Methods An agent-based model was developed to simulate children’s school travel behavior within a hypothetical city. The model was used to explore the plausible implications of policies targeting two established barriers to active school travel: long distance to school and traffic safety. The percent of children who walk to school was compared for various scenarios. Results To maximize the percent of children who walk to school, school locations should be evenly distributed over space and children should be assigned to the closest school. In the case of interventions to improve traffic safety, targeting a smaller area around the school with greater intensity may be more effective than targeting a larger area with less intensity. Conclusions Despite the challenges they present, agent-based models are a useful complement to other analytical strategies in studying the plausible impact of various policies on active travel to school. PMID:23705953

  17. An agent based model for simulating the spread of sexually transmitted infections.

    PubMed

    Rutherford, Grant; Friesen, Marcia R; McLeod, Robert D

    2012-01-01

    This work uses agent-based modelling (ABM) to simulate sexually transmitted infection (STI) spread within a population of 1000 agents over a 10-year period, as a preliminary investigation of the suitability of ABM methodology to simulate STI spread. The work contrasts compartmentalized mathematical models, which fail to account for individual agents, with ABMs commonly applied to simulate the spread of respiratory infections. The model was developed in C++ using the Boost 1.47.0 libraries for the normal distribution and OpenGL for visualization. Sixteen agent parameters interact individually and in combination to govern agent profiles and behaviours relative to infection probabilities. The simulation results provide qualitative comparisons of STI mitigation strategies, including the impact of condom use, promiscuity, the form of the friend network, and mandatory STI testing. Individual and population-wide impacts were explored, with individual risk being impacted much more dramatically by population-level behaviour changes as compared to individual behaviour changes.
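    A tiny illustration of agent-level STI spread over a random partner network in the spirit of the model described above; the network construction, transmission probability, and condom effect are assumptions, far simpler than the paper's sixteen interacting agent parameters.

```python
# Toy agent-level infection spread over an assumed random partner network.
import random

N, PARTNERS, P_TRANSMIT, CONDOM_EFFECT = 1000, 3, 0.05, 0.2
agents = [{"infected": i < 5, "condom_use": random.random() < 0.6} for i in range(N)]
network = {i: random.sample(range(N), PARTNERS) for i in range(N)}

def weekly_step():
    newly = []
    for i, agent in enumerate(agents):
        if not agent["infected"]:
            continue
        for j in network[i]:
            # Transmission probability is reduced when the partner uses condoms.
            p = P_TRANSMIT * (CONDOM_EFFECT if agents[j]["condom_use"] else 1.0)
            if not agents[j]["infected"] and random.random() < p:
                newly.append(j)
    for j in newly:
        agents[j]["infected"] = True

for week in range(52 * 10):            # ten simulated years
    weekly_step()
print("prevalence:", sum(a["infected"] for a in agents) / N)
```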

  18. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    DOE PAGES

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-01-01

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  19. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    SciTech Connect

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-06-23

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  20. Agent-based computational model investigates muscle-specific responses to disuse-induced atrophy

    PubMed Central

    Martin, Kyle S.; Peirce, Shayn M.

    2015-01-01

    Skeletal muscle is highly responsive to use. In particular, muscle atrophy attributable to decreased activity is a common problem among the elderly and injured/immobile. However, each muscle does not respond the same way. We developed an agent-based model that generates a tissue-level skeletal muscle response to disuse/immobilization. The model incorporates tissue-specific muscle fiber architecture parameters and simulates changes in muscle fiber size as a result of disuse-induced atrophy that are consistent with published experiments. We created simulations of 49 forelimb and hindlimb muscles of the rat by incorporating eight fiber-type and size parameters to explore how these parameters, which vary widely across muscles, influence sensitivity to disuse-induced atrophy. Of the 49 muscles modeled, the soleus exhibited the greatest atrophy after 14 days of simulated immobilization (51% decrease in fiber size), whereas the extensor digitorum communis atrophied the least (32%). Analysis of these simulations revealed that both fiber-type distribution and fiber-size distribution influence the sensitivity to disuse atrophy even though no single tissue architecture parameter correlated with atrophy rate. Additionally, software agents representing fibroblasts were incorporated into the model to investigate cellular interactions during atrophy. Sensitivity analyses revealed that fibroblast agents have the potential to affect disuse-induced atrophy, albeit with a lesser effect than fiber type and size. In particular, muscle atrophy elevated slightly with increased initial fibroblast population and increased production of TNF-α. Overall, the agent-based model provides a novel framework for investigating both tissue adaptations and cellular interactions in skeletal muscle during atrophy. PMID:25722379

  1. Model for Analyzing Human Communication Network Based on Agent-Based Simulation

    NASA Astrophysics Data System (ADS)

    Matsuyama, Shinako; Terano, Takao

    This paper discusses dynamic properties of human communication networks, which appear as a result of information exchanges among people. We propose agent-based simulation (ABS) to examine implicit mechanisms behind the dynamics. The ABS enables us to reveal the characteristics and the differences of the networks regarding the specific communication groups. We perform experiments on the ABS with activity data from a questionnaire survey and with virtual data which is different from the activity data. We compare the difference between them and show the effectiveness of the ABS through the experiments.

  2. An Agent-Based Labor Market Simulation with Endogenous Skill-Demand

    NASA Astrophysics Data System (ADS)

    Gemkow, S.

    This paper considers an agent-based labor market simulation to examine the influence of skills on wages and unemployment rates. Therefore, less-skilled and highly skilled workers, as well as less productive and highly productive vacancies, are implemented. The skill distribution is exogenous, whereas the distribution of the less and highly productive vacancies is endogenous. The different opportunities of the skill groups on the labor market are established by skill requirements. This means that a highly productive vacancy can only be filled by a highly skilled unemployed worker. Different skill distributions, which can also be interpreted as skill-biased technological change, are simulated by incrementing the skill level of highly skilled persons exogenously. This simulation also provides a microeconomic foundation of the matching function often used in theoretical approaches.

  3. Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation

    SciTech Connect

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

    2014-01-01

    Information security analysis can be performed using game theory implemented in dynamic Agent Based Game Theoretic (ABGT) simulations. Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has currently documented 29 failure scenarios. The strategy for the game was developed by analyzing five electric sector representative failure scenarios contained in the AMI functional domain. We characterize these five selected scenarios into three specific threat categories affecting confidentiality, integrity and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game theoretic rules decomposed from the failure scenarios in terms of how those scenarios might impact the AMI network with respect to CIA.

  4. Modeling the Information Age Combat Model: An Agent-Based Simulation of Network Centric Operations

    NASA Technical Reports Server (NTRS)

    Deller, Sean; Rabadi, Ghaith A.; Bell, Michael I.; Bowling, Shannon R.; Tolk, Andreas

    2010-01-01

    The Information Age Combat Model (IACM) was introduced by Cares in 2005 to contribute to the development of an understanding of the influence of connectivity on force effectiveness that can eventually lead to quantitative prediction and guidelines for design and employment. The structure of the IACM makes it clear that the Perron-Frobenius Eigenvalue is a quantifiable metric with which to measure the organization of a networked force. The results of recent experiments presented in Deller et al. (2009) indicate that the value of the Perron-Frobenius Eigenvalue is a significant measurement of the performance of an Information Age combat force. This was accomplished through the innovative use of an agent-based simulation to model the IACM and represents an initial contribution towards a new generation of combat models that are net-centric instead of using the current platform-centric approach. This paper describes the intent, challenges, design, and initial results of this agent-based simulation model.
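    The quantifiable metric referred to above is the Perron-Frobenius eigenvalue of the combat network's adjacency matrix. The short power-iteration sketch below shows how it can be estimated; the four-node adjacency matrix is an assumed toy example, not a configuration from the IACM experiments.

```python
# Power iteration for the dominant (Perron-Frobenius) eigenvalue of a
# nonnegative adjacency matrix; the matrix below is an assumed example.
def perron_frobenius_eigenvalue(adj, iterations=200):
    n = len(adj)
    v = [1.0] * n
    eig = 0.0
    for _ in range(iterations):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        eig = max(abs(x) for x in w)            # dominant eigenvalue estimate
        if eig == 0:
            return 0.0
        v = [x / eig for x in w]                # renormalise the eigenvector
    return eig

adjacency = [[0, 1, 1, 0],
             [0, 0, 1, 1],
             [1, 0, 0, 1],
             [1, 1, 0, 0]]
print(perron_frobenius_eigenvalue(adjacency))
```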

  5. Simulating the elimination of sleeping sickness with an agent-based model

    PubMed Central

    Grébaut, Pascal; Girardin, Killian; Fédérico, Valentine; Bousquet, François

    2016-01-01

    Although Human African Trypanosomiasis is largely considered to be in the process of extinction today, the persistence of human and animal reservoirs, as well as the vector, necessitates a laborious elimination process. In this context, modeling could be an effective tool to evaluate the ability of different public health interventions to control the disease. Using the Cormas® system, we developed HATSim, an agent-based model capable of simulating the possible endemic evolutions of sleeping sickness and the ability of National Control Programs to eliminate the disease. This model takes into account the analysis of epidemiological, entomological, and ecological data from field studies conducted during the last decade, making it possible to predict the evolution of the disease within this area over a 5-year span. In this article, we first present HATSim according to the Overview, Design concepts, and Details (ODD) protocol that is classically used to describe agent-based models, then, in a second part, we present predictive results concerning the evolution of Human African Trypanosomiasis in the village of Lambi (Cameroon), in order to illustrate the interest of such a tool. Our results are consistent with what was observed in the field by the Cameroonian National Control Program (CNCP). Our simulations also revealed that regular screening can be sufficient, although vector control applied to all areas with human activities could be significantly more efficient. Our results indicate that the current model can already help decision-makers in planning the elimination of the disease in foci. PMID:28008825

  6. A Multi Agent-Based Framework for Simulating Household PHEV Distribution and Electric Distribution Network Impact

    SciTech Connect

    Cui, Xiaohui; Liu, Cheng; Kim, Hoe Kyoung; Kao, Shih-Chieh; Tuttle, Mark A; Bhaduri, Budhendra L

    2011-01-01

    The variation of household attributes such as income, travel distance, age, household member, and education for different residential areas may generate different market penetration rates for plug-in hybrid electric vehicle (PHEV). Residential areas with higher PHEV ownership could increase peak electric demand locally and require utilities to upgrade the electric distribution infrastructure even though the capacity of the regional power grid is under-utilized. Estimating the future PHEV ownership distribution at the residential household level can help us understand the impact of PHEV fleet on power line congestion, transformer overload and other unforeseen problems at the local residential distribution network level. It can also help utilities manage the timing of recharging demand to maximize load factors and utilization of existing distribution resources. This paper presents a multi agent-based simulation framework for 1) modeling spatial distribution of PHEV ownership at local residential household level, 2) discovering PHEV hot zones where PHEV ownership may quickly increase in the near future, and 3) estimating the impacts of the increasing PHEV ownership on the local electric distribution network with different charging strategies. In this paper, we use Knox County, TN as a case study to show the simulation results of the agent-based model (ABM) framework. However, the framework can be easily applied to other local areas in the US.

  7. Promoting Conceptual Change for Complex Systems Understanding: Outcomes of an Agent-Based Participatory Simulation

    NASA Astrophysics Data System (ADS)

    Rates, Christopher A.; Mulvey, Bridget K.; Feldon, David F.

    2016-08-01

    Components of complex systems apply across multiple subject areas, and teaching these components may help students build unifying conceptual links. Students, however, often have difficulty learning these components, and limited research exists to understand what types of interventions may best help improve understanding. We investigated 32 high school students' understandings of complex systems components and whether an agent-based simulation could improve their understandings. Pretest and posttest essays were coded for changes in six components to determine whether students showed more expert thinking about the complex system of the Chesapeake Bay watershed. Results showed significant improvement for the components Emergence ( r = .26, p = .03), Order ( r = .37, p = .002), and Tradeoffs ( r = .44, p = .001). Implications include that the experiential nature of the simulation has the potential to support conceptual change for some complex systems components, presenting a promising option for complex systems instruction.

  8. Multi-Agent-Based Simulation of a Complex Ecosystem of Mental Health Care.

    PubMed

    Kalton, Alan; Falconer, Erin; Docherty, John; Alevras, Dimitris; Brann, David; Johnson, Kyle

    2016-02-01

    This paper discusses the creation of an Agent-Based Simulation that modeled the introduction of care coordination capabilities into a complex system of care for patients with Serious and Persistent Mental Illness. The model describes the engagement between patients and the medical, social and criminal justice services they interact with in a complex ecosystem of care. We outline the challenges involved in developing the model, including process mapping and the collection and synthesis of data to support parametric estimates, and describe the controls built into the model to support analysis of potential changes to the system. We also describe the approach taken to calibrate the model to an observable level of system performance. Preliminary results from application of the simulation are provided to demonstrate how it can provide insights into potential improvements deriving from introduction of care coordination technology.

  9. From Agents to Continuous Change via Aesthetics: Learning Mechanics with Visual Agent-Based Computational Modeling

    ERIC Educational Resources Information Center

    Sengupta, Pratim; Farris, Amy Voss; Wright, Mason

    2012-01-01

    Novice learners find motion as a continuous process of change challenging to understand. In this paper, we present a pedagogical approach based on agent-based, visual programming to address this issue. Integrating agent-based programming, in particular, Logo programming, with curricular science has been shown to be challenging in previous research…

  10. The Agent-based Approach: A New Direction for Computational Models of Development.

    ERIC Educational Resources Information Center

    Schlesinger, Matthew; Parisi, Domenico

    2001-01-01

    Introduces the concepts of online and offline sampling and highlights the role of online sampling in agent-based models of learning and development. Compares the strengths of each approach for modeling particular developmental phenomena and research questions. Describes a recent agent-based model of infant causal perception. Discusses limitations…

  11. Agent-Based Crowd Simulation Considering Emotion Contagion for Emergency Evacuation Problem

    NASA Astrophysics Data System (ADS)

    Faroqi, H.; Mesgari, M.-S.

    2015-12-01

    During emergencies, emotions greatly affect human behaviour. For more realistic multi-agent systems in simulations of emergency evacuations, it is important to incorporate emotions and their effects on the agents. In a few words, emotional contagion is a process in which a person or group influences the emotions or behavior of another person or group through the conscious or unconscious induction of emotion states and behavioral attitudes. In this study, we simulate an emergency situation in an open square area with three exits, considering Adult and Child agents with different behavior. Security agents are also considered in order to guide Adults and Children to the exits and keep them calm. Six emotion levels are considered for each agent in different scenarios and situations. The agent-based model is initialized with a random scattering of the agent populations; when an alarm occurs, each agent reacts to the situation based on its own and its neighbors' current circumstances. The main goal of each agent is firstly to find the exit, and then to help other agents find their way. The numbers of exited agents, along with their emotion levels, and of damaged agents are compared across scenarios with different initializations in order to evaluate the results of the simulated model. NetLogo 5.2 is used as the multi-agent simulation framework, with R as the development language.
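    A simple emotion-contagion update in the spirit of the model described above: each agent's emotion level drifts toward the mean of its grid neighbours. The grid, the 0-5 emotion scale, and the contagion weight are illustrative assumptions, not the NetLogo model's rules.

```python
# Neighbour-averaging emotion contagion on an assumed grid of agents.
import random

SIZE, CONTAGION = 20, 0.3
emotion = [[random.randint(0, 5) for _ in range(SIZE)] for _ in range(SIZE)]  # six levels, 0..5

def neighbours(x, y):
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if (dx, dy) != (0, 0) and 0 <= x + dx < SIZE and 0 <= y + dy < SIZE:
                yield x + dx, y + dy

def contagion_step(grid):
    new = [row[:] for row in grid]
    for x in range(SIZE):
        for y in range(SIZE):
            neigh = [grid[i][j] for i, j in neighbours(x, y)]
            mean = sum(neigh) / len(neigh)
            # Move toward the local mean, clipped to the 0..5 emotion scale.
            new[x][y] = min(5, max(0, round(grid[x][y] + CONTAGION * (mean - grid[x][y]))))
    return new

for _ in range(10):
    emotion = contagion_step(emotion)
print(sum(sum(row) for row in emotion) / SIZE ** 2)
```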

  12. Changing crops in response to climate: virtual Nang Rong, Thailand in an agent based simulation.

    PubMed

    Malanson, George P; Verdery, Ashton M; Walsh, Stephen J; Sawangdee, Yothin; Heumann, Benjamin W; McDaniel, Philip M; Frizzelle, Brian G; Williams, Nathalie E; Yao, Xiaozheng; Entwisle, Barbara; Rindfuss, Ronald R

    2014-09-01

    The effects of extended climatic variability on agricultural land use were explored for the type of system found in villages of northeastern Thailand. An agent based model developed for the Nang Rong district was used to simulate land allotted to jasmine rice, heavy rice, cassava, and sugar cane. The land use choices in the model depended on likely economic outcomes, but included elements of bounded rationality in dependence on household demography. The socioeconomic dynamics are endogenous in the system, and climate changes were added as exogenous drivers. Villages changed their agricultural effort in many different ways. Most villages reduced the amount of land under cultivation, primarily with reduction in jasmine rice, but others did not. The variation in responses to climate change indicates potential sensitivity to initial conditions and path dependence for this type of system. The differences between our virtual villages and the real villages of the region indicate effects of bounded rationality and limits on model applications.

  13. Agent-based simulation as a tool for the built environment.

    PubMed

    Gaudiano, Paolo

    2013-08-01

    There is a growing need to increase the performance of the built environment through a combination of improved design, retrofitting of existing structures, and behavioral and policy change. Increased performance includes decreasing construction and operational costs, improving efficiency, reducing energy consumption and overall carbon footprint, and increasing the health, safety, and comfort of building occupants. Data collection and analysis are central to ongoing efforts in performance improvement. The growth of sensor and monitoring technologies, coupled with the proliferation of building automation systems, is quickly leading to an explosion in the amount, quality, and format of building performance data. What is needed are methodologies for extracting viable information from these data and using the results to effect meaningful change. Furthermore, occupant behavior and attitudes must be taken into account. This paper summarizes agent-based simulation and describes its potential as an approach to support analysis, design, and performance improvements in the built environment.

  14. Changing crops in response to climate: virtual Nang Rong, Thailand in an agent based simulation

    PubMed Central

    Malanson, George P.; Verdery, Ashton M.; Walsh, Stephen J.; Sawangdee, Yothin; Heumann, Benjamin W.; McDaniel, Philip M.; Frizzelle, Brian G.; Williams, Nathalie E.; Yao, Xiaozheng; Entwisle, Barbara; Rindfuss, Ronald R.

    2014-01-01

    The effects of extended climatic variability on agricultural land use were explored for the type of system found in villages of northeastern Thailand. An agent based model developed for the Nang Rong district was used to simulate land allotted to jasmine rice, heavy rice, cassava, and sugar cane. The land use choices in the model depended on likely economic outcomes, but included elements of bounded rationality in dependence on household demography. The socioeconomic dynamics are endogenous in the system, and climate changes were added as exogenous drivers. Villages changed their agricultural effort in many different ways. Most villages reduced the amount of land under cultivation, primarily with reduction in jasmine rice, but others did not. The variation in responses to climate change indicates potential sensitivity to initial conditions and path dependence for this type of system. The differences between our virtual villages and the real villages of the region indicate effects of bounded rationality and limits on model applications. PMID:25061240

  15. An agent-based simulation model to study accountable care organizations.

    PubMed

    Liu, Pai; Wu, Shinyi

    2016-03-01

    Creating accountable care organizations (ACOs) has been widely discussed as a strategy to control rapidly rising healthcare costs and improve quality of care; however, building an effective ACO is a complex process involving multiple stakeholders (payers, providers, patients) with their own interests. Also, implementation of an ACO is costly in terms of time and money. Immature design could cause safety hazards. Therefore, there is a need for analytical model-based decision-support tools that can predict the outcomes of different strategies to facilitate ACO design and implementation. In this study, an agent-based simulation model was developed to study ACOs that considers payers, healthcare providers, and patients as agents under the shared saving payment model of care for congestive heart failure (CHF), one of the most expensive causes of sometimes preventable hospitalizations. The agent-based simulation model has identified the critical determinants for the payment model design that can motivate provider behavior changes to achieve maximum financial and quality outcomes of an ACO. The results show nonlinear provider behavior change patterns corresponding to changes in payment model designs. The outcomes vary by providers with different quality or financial priorities, and are most sensitive to the cost-effectiveness of CHF interventions that an ACO implements. This study demonstrates an increasingly important method to construct a healthcare system analytics model that can help inform health policy and healthcare management decisions. The study also points out that the likely success of an ACO is interdependent with payment model design, provider characteristics, and cost and effectiveness of healthcare interventions.

  16. Prediction Markets and Beliefs about Climate: Results from Agent-Based Simulations

    NASA Astrophysics Data System (ADS)

    Gilligan, J. M.; John, N. J.; van der Linden, M.

    2015-12-01

    Climate scientists have long been frustrated by persistent doubts that a large portion of the public expresses toward the scientific consensus about anthropogenic global warming. The political and ideological polarization of this doubt led Vandenbergh, Raimi, and Gilligan [1] to propose that prediction markets for climate change might influence the opinions of those who mistrust the scientific community but do trust the power of markets. We have developed an agent-based simulation of a climate prediction market in which traders buy and sell futures contracts that will pay off at some future year with a value that depends on the global average temperature at that time. The traders form a heterogeneous population with different ideological positions, different beliefs about anthropogenic global warming, and different degrees of risk aversion. We also vary characteristics of the market, including the topology of social networks among the traders, the number of traders, and the completeness of the market. Traders adjust their beliefs about climate according to the gains and losses they and other traders in their social network experience. This model predicts that if global temperature is predominantly driven by greenhouse gas concentrations, prediction markets will cause traders' beliefs to converge toward correctly accepting anthropogenic warming as real. This convergence is largely independent of the structure of the market and the characteristics of the population of traders. However, it may take considerable time for beliefs to converge. Conversely, if temperature does not depend on greenhouse gases, the model predicts that traders' beliefs will not converge. We will discuss the policy relevance of these results and, more generally, the use of agent-based market simulations for policy analysis regarding climate change, seasonal agricultural weather forecasts, and other applications. [1] MP Vandenbergh, KT Raimi, & JM Gilligan. UCLA Law Rev. 61, 1962 (2014).
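    A toy sketch of the belief-adjustment loop described above: traders hold a belief about the warming trend, contracts settle against an assumed "true" trend plus noise, and traders drift toward beliefs that would have paid off. The trader count, learning rate, and settlement rule are all illustrative assumptions, not the paper's market mechanics.

```python
# Toy belief convergence in a prediction-market-like settlement loop.
import random

TRUE_TREND = 0.2          # assumed "real" warming per decade used for settlement
LEARNING_RATE = 0.1
traders = [{"belief": random.uniform(-0.3, 0.6)} for _ in range(200)]

def settle_round():
    observed = random.gauss(TRUE_TREND, 0.1)       # noisy settlement outcome
    for t in traders:
        # The loss grows with the distance between belief and outcome,
        # and losing traders update their belief toward the outcome.
        loss = abs(t["belief"] - observed)
        t["belief"] += LEARNING_RATE * loss * (1 if observed > t["belief"] else -1)

for year in range(50):
    settle_round()
mean_belief = sum(t["belief"] for t in traders) / len(traders)
print(f"mean belief after 50 rounds: {mean_belief:.2f} (true trend {TRUE_TREND})")
```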

  17. Design of a Mobile Agent-Based Adaptive Communication Middleware for Federations of Critical Infrastructure Simulations

    NASA Astrophysics Data System (ADS)

    Görbil, Gökçe; Gelenbe, Erol

    The simulation of critical infrastructures (CI) can involve the use of diverse domain specific simulators that run on geographically distant sites. These diverse simulators must then be coordinated to run concurrently in order to evaluate the performance of critical infrastructures which influence each other, especially in emergency or resource-critical situations. We therefore describe the design of an adaptive communication middleware that provides reliable and real-time one-to-one and group communications for federations of CI simulators over a wide-area network (WAN). The proposed middleware is composed of mobile agent-based peer-to-peer (P2P) overlays, called virtual networks (VNets), to enable resilient, adaptive and real-time communications over unreliable and dynamic physical networks (PNets). The autonomous software agents comprising the communication middleware monitor their performance and the underlying PNet, and dynamically adapt the P2P overlay and migrate over the PNet in order to optimize communications according to the requirements of the federation and the current conditions of the PNet. Reliable communications is provided via redundancy within the communication middleware and intelligent migration of agents over the PNet. The proposed middleware integrates security methods in order to protect the communication infrastructure against attacks and provide privacy and anonymity to the participants of the federation. Experiments with an initial version of the communication middleware over a real-life networking testbed show that promising improvements can be obtained for unicast and group communications via the agent migration capability of our middleware.

  18. Real-Time Agent-Based Modeling Simulation with in-situ Visualization of Complex Biological Systems

    PubMed Central

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y. K.

    2016-01-01

    We present an efficient and scalable scheme for implementing agent-based modeling (ABM) simulation with In Situ visualization of large complex systems on heterogeneous computing platforms. The scheme is designed to make optimal use of the resources available on a heterogeneous platform consisting of a multicore CPU and a GPU, resulting in minimal to no resource idle time. Furthermore, the scheme was implemented under a client-server paradigm that enables remote users to visualize and analyze simulation data as it is being generated at each time step of the model. Performance of a simulation case study of vocal fold inflammation and wound healing with 3.8 million agents shows 35× and 7× speedup in execution time over single-core and multi-core CPU respectively. Each iteration of the model took less than 200 ms to simulate, visualize and send the results to the client. This enables users to monitor the simulation in real-time and modify its course as needed. PMID:27547508

  19. Investigating the role of water in the Diffusion of Cholera using Agent-Based simulation

    NASA Astrophysics Data System (ADS)

    Augustijn, Ellen-Wien; Doldersum, Tom; Augustijn, Denie

    2014-05-01

    Traditionally, cholera was considered to be a waterborne disease. Currently, we know that many other factors can contribute to the spread of this disease, including human mobility and human behavior. However, the hydrological component in cholera diffusion is significant. The interplay between cholera and water includes bacteria (V. cholerae) that survive in the aquatic environment, the possibility that run-off water from dumpsites carries the bacteria to surface water (rivers and lakes), and the fact that when the bacteria reach streams they can be carried downstream to infect new locations. Modelling is a very important tool to build theory on the interplay between different types of transmission mechanisms that together are responsible for the spread of cholera. Agent-based simulation models are well suited to incorporating behavior at the individual level and to reproducing emergence. However, it is more difficult to incorporate the hydrological components in this type of model. In this research we present the hydrological component of an Agent-Based Cholera model developed to study a cholera epidemic in Kumasi (Ghana) in 2005. The model was calibrated on the relative contribution of each community to the distributed pattern of cholera rather than the absolute number of incidences. Analysis of the results shows that water plays an important role in the diffusion of cholera: 75% of the cholera cases were infected via river water that was contaminated by runoff from the dumpsites. To initiate infections upstream, the probability of environment-to-human transmission seemed to be overestimated compared to what may be expected from the literature. Scenario analyses show that there is a strong relation between the epidemic curve and the rainfall. Removing dumpsites that are situated close to the river resulted in a strong decrease in the number of cholera cases.

  20. Impact of urban planning on household's residential decisions: An agent-based simulation model for Vienna☆

    PubMed Central

    Gaube, Veronika; Remesch, Alexander

    2013-01-01

    Interest in assessing the sustainability of socio-ecological systems of urban areas has increased notably, with additional attention generated due to the fact that half the world's population now lives in cities. Urban areas face both a changing urban population size and increasing sustainability issues in terms of providing good socioeconomic and environmental living conditions. Urban planning has to deal with both challenges. Households play a major role by being affected by urban planning decisions on the one hand and by being responsible – among many other factors – for the environmental performance of a city (e.g. energy use). We here present an agent-based decision model referring to the city of Vienna, the capital of Austria, with a population of about 1.7 million (2.3 million within the metropolitan area, the latter being more than 25% of Austria's total population). Since the early 1990s, after decades of negative population growth, Vienna has been experiencing a steady increase in population, mainly driven by immigration. The aim of the agent-based decision model is to simulate new residential patterns of different household types based on demographic development and migration scenarios. Model results were used to assess spatial patterns of energy use caused by different household types in the four scenarios (1) conventional urban planning, (2) sustainable urban planning, (3) expensive centre and (4) no green area preference. Outcomes show that changes in preferences of households relating to the presence of nearby green areas have the most important impact on the distribution of households across the small-scaled city area. Additionally, the results demonstrate the importance of the distribution of different household types regarding spatial patterns of energy use. PMID:27667962

  1. Impact of urban planning on household's residential decisions: An agent-based simulation model for Vienna.

    PubMed

    Gaube, Veronika; Remesch, Alexander

    2013-07-01

    Interest in assessing the sustainability of socio-ecological systems of urban areas has increased notably, with additional attention generated due to the fact that half the world's population now lives in cities. Urban areas face both a changing urban population size and increasing sustainability issues in terms of providing good socioeconomic and environmental living conditions. Urban planning has to deal with both challenges. Households play a major role by being affected by urban planning decisions on the one hand and by being responsible - among many other factors - for the environmental performance of a city (e.g. energy use). We here present an agent-based decision model referring to the city of Vienna, the capital of Austria, with a population of about 1.7 million (2.3 million within the metropolitan area, the latter being more than 25% of Austria's total population). Since the early 1990s, after decades of negative population growth, Vienna has been experiencing a steady increase in population, mainly driven by immigration. The aim of the agent-based decision model is to simulate new residential patterns of different household types based on demographic development and migration scenarios. Model results were used to assess spatial patterns of energy use caused by different household types in the four scenarios (1) conventional urban planning, (2) sustainable urban planning, (3) expensive centre and (4) no green area preference. Outcomes show that changes in preferences of households relating to the presence of nearby green areas have the most important impact on the distribution of households across the small-scaled city area. Additionally, the results demonstrate the importance of the distribution of different household types regarding spatial patterns of energy use.

  2. Agent-Based Spatiotemporal Simulation of Biomolecular Systems within the Open Source MASON Framework

    PubMed Central

    Pérez-Rodríguez, Gael; Pérez-Pérez, Martín; Glez-Peña, Daniel; Azevedo, Nuno F.; Lourenço, Anália

    2015-01-01

    Agent-based modelling is being used to represent biological systems with increasing frequency and success. This paper presents the implementation of a new tool for biomolecular reaction modelling in the open source Multiagent Simulator of Neighborhoods framework. The rationale behind this new tool is the necessity to describe interactions at the molecular level to be able to grasp emergent and meaningful biological behaviour. We are particularly interested in characterising and quantifying the various effects that facilitate biocatalysis. Enzymes may display high specificity for their substrates, and this information is crucial to the engineering and optimisation of bioprocesses. Simulation results demonstrate that molecule distributions, reaction rate parameters, and structural parameters can be adjusted separately in the simulation, allowing a comprehensive study of individual effects in the context of realistic cell environments. While a higher percentage of collisions that result in a reaction increases the affinity of the enzyme for the substrate, a faster reaction (i.e., turnover number) leads to a smaller number of time steps. Slower diffusion rates and molecular crowding (physical hurdles) decrease the collision rate of reactants, hence reducing the reaction rate, as expected. Also, the random distribution of molecules affects the results significantly. PMID:25874228

  3. Evaluation of wholesale electric power market rules and financial risk management by agent-based simulations

    NASA Astrophysics Data System (ADS)

    Yu, Nanpeng

    As U.S. regional electricity markets continue to refine their market structures, designs and rules of operation in various ways, two critical issues are emerging. First, although much experience has been gained and costly and valuable lessons have been learned, there is still a lack of a systematic platform for evaluation of the impact of a new market design from both engineering and economic points of view. Second, the transition from a monopoly paradigm characterized by a guaranteed rate of return to a competitive market created various unfamiliar financial risks for various market participants, especially for the Investor Owned Utilities (IOUs) and Independent Power Producers (IPPs). This dissertation uses agent-based simulation methods to tackle the market rules evaluation and financial risk management problems. The California energy crisis in 2000-01 showed what could happen to an electricity market if it did not go through a comprehensive and rigorous testing before its implementation. Due to the complexity of the market structure, strategic interaction between the participants, and the underlying physics, it is difficult to fully evaluate the implications of potential changes to market rules. This dissertation presents a flexible and integrative method to assess market designs through agent-based simulations. Realistic simulation scenarios on a 225-bus system are constructed for evaluation of the proposed PJM-like market power mitigation rules of the California electricity market. Simulation results show that in the absence of market power mitigation, generation company (GenCo) agents facilitated by Q-learning are able to exploit the market flaws and make significantly higher profits relative to the competitive benchmark. The incorporation of PJM-like local market power mitigation rules is shown to be effective in suppressing the exercise of market power. The importance of financial risk management is exemplified by the recent financial crisis. In this

  4. An agent-based simulation of extirpation of Ceratitis capitata applied to invasions in California.

    PubMed

    Manoukis, Nicholas C; Hoffman, Kevin

    2014-01-01

    We present an agent-based simulation (ABS) of Ceratitis capitata ("Medfly") developed for estimating the time to extirpation of this pest in areas where quarantines and eradication treatments were immediately imposed. We use the ABS, implemented in the program MED-FOES, to study seven different outbreaks that occurred in Southern California from 2008 to 2010. Results are compared with the length of intervention and quarantine imposed by the State, based on a linear developmental model (thermal unit accumulation, or "degree-day"). MED-FOES is a useful tool for invasive species managers as it incorporates more information from the known biology of the Medfly, and includes the important feature of being demographically explicit, providing significant improvements over simple degree-day calculations. While there was general agreement between the length of quarantine by degree-day and the time to extirpation indicated by MED-FOES, the ABS suggests that the margin of safety varies among cases and that in two cases the quarantine may have been excessively long. We also examined changes in the number of individuals over time in MED-FOES and conducted a sensitivity analysis for one of the outbreaks to explore the role of various input parameters on simulation outcomes. While our implementation of the ABS in this work is motivated by C. capitata and takes extirpation as a postulate, the simulation is very flexible and can be used to study a variety of questions on the invasion biology of pest insects and methods proposed to manage or eradicate such species.
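    For readers unfamiliar with the degree-day baseline the ABS is compared against, the sketch below accumulates daily thermal units above a lower developmental threshold until a required total is reached. The threshold, the target total, and the synthetic temperature series are assumed placeholders, not the values California used for Medfly quarantines.

```python
# Minimal degree-day (thermal unit accumulation) calculation with assumed values.
import math

LOWER_THRESHOLD_C = 12.0      # assumed lower developmental threshold
REQUIRED_DD = 600.0           # assumed degree-day total to be accumulated

def daily_degree_days(t_min, t_max, lower=LOWER_THRESHOLD_C):
    """Simple average method: mean temperature minus the threshold, floored at zero."""
    return max(0.0, (t_min + t_max) / 2.0 - lower)

def days_to_target(temps):
    total = 0.0
    for day, (t_min, t_max) in enumerate(temps, start=1):
        total += daily_degree_days(t_min, t_max)
        if total >= REQUIRED_DD:
            return day
    return None

# Synthetic year of daily min/max temperatures oscillating over the seasons.
temps = [(15 + 8 * math.sin(2 * math.pi * d / 365) - 5,
          15 + 8 * math.sin(2 * math.pi * d / 365) + 5) for d in range(365)]
print("days until the degree-day target is reached:", days_to_target(temps))
```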

  5. Semantic Agent-Based Service Middleware and Simulation for Smart Cities.

    PubMed

    Liu, Ming; Xu, Yang; Hu, Haixiao; Mohammed, Abdul-Wahid

    2016-12-21

    With the development of Machine-to-Machine (M2M) technology, a variety of embedded and mobile devices is integrated to interact via the platform of the Internet of Things, especially in the domain of smart cities. One of the primary challenges is that selecting the appropriate services or service combination for upper layer applications is hard, which is due to the absence of a unified semantical service description pattern, as well as the service selection mechanism. In this paper, we define a semantic service representation model from four key properties: Capability (C), Deployment (D), Resource (R) and IOData (IO). Based on this model, an agent-based middleware is built to support semantic service enablement. In this middleware, we present an efficient semantic service discovery and matching approach for a service combination process, which calculates the semantic similarity between services, and a heuristic algorithm to search the service candidates for a specific service request. Based on this design, we propose a simulation of virtual urban fire fighting, and the experimental results manifest the feasibility and efficiency of our design.
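    As a toy illustration of similarity-based service matching over the four property sets named above (Capability, Deployment, Resource, IOData), the sketch below scores candidate services against a request with a weighted Jaccard overlap and picks the best; the weights, the overlap measure, and the example services are assumptions, not the paper's semantic similarity calculation or heuristic search.

```python
# Toy weighted-overlap matching of a service request against candidate services.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

WEIGHTS = {"C": 0.4, "D": 0.2, "R": 0.2, "IO": 0.2}   # assumed property weights

def similarity(request, service):
    return sum(WEIGHTS[k] * jaccard(request[k], service[k]) for k in WEIGHTS)

# Hypothetical services and request for a fire-fighting scenario.
services = {
    "hydrant_locator": {"C": ["locate"], "D": ["city"], "R": ["gis"], "IO": ["position"]},
    "traffic_feed":    {"C": ["monitor"], "D": ["city"], "R": ["camera"], "IO": ["video"]},
}
request = {"C": ["locate"], "D": ["city"], "R": ["gis"], "IO": ["position", "distance"]}

best = max(services, key=lambda name: similarity(request, services[name]))
print(best, round(similarity(request, services[best]), 2))
```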

  7. A framework for the use of agent based modeling to simulate ...

    EPA Pesticide Factsheets

    Simulation of human behavior in exposure modeling is a complex task. Traditionally, inter-individual variation in human activity has been modeled by drawing from a pool of single day time-activity diaries such as the US EPA Consolidated Human Activity Database (CHAD). Here, an agent-based model (ABM) is used to simulate population distributions of longitudinal patterns of four macro activities (sleeping, eating, working, and commuting) in populations of adults over a period of one year. In this ABM, an individual is modeled as an agent whose movement through time and space is determined by a set of decision rules. The rules are based on the agent having time-varying “needs” that are satisfied by performing actions. Needs are modeled as increasing over time, and taking an action reduces the need. Need-satisfying actions include sleeping (meeting the need for rest), eating (meeting the need for food), and commuting/working (meeting the need for income). Every time an action is completed, the model determines the next action the agent will take based on the magnitude of each of the agent’s needs at that point in time. Different activities advertise their ability to satisfy various needs of the agent (such as food to eat or sleeping in a bed or on a couch). The model then chooses the activity that satisfies the greatest of the agent’s needs. When multiple actions could address a need, the model will choose the most effective of the actions (bed over the couch).
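
    As a rough illustration of the need-based decision rule described above, the sketch below implements a single agent whose needs grow over time and who repeatedly picks the activity that satisfies its currently greatest need. The activity names, growth rates and satisfaction amounts are invented for the example and are not taken from the EPA model; commuting and the CHAD diary data are omitted.

        # Activities advertise which need they satisfy and by how much.
        # All names and rates are illustrative, not parameters of the ABM above.
        ACTIVITIES = {
            "sleep": {"satisfies": "rest",   "amount": 8.0, "duration_h": 8},
            "eat":   {"satisfies": "food",   "amount": 4.0, "duration_h": 1},
            "work":  {"satisfies": "income", "amount": 9.0, "duration_h": 9},
        }
        NEED_GROWTH_PER_HOUR = {"rest": 0.3, "food": 0.5, "income": 0.4}  # assumed

        class Agent:
            def __init__(self):
                self.needs = {"rest": 0.0, "food": 0.0, "income": 0.0}

            def advance(self, hours):
                # Needs accumulate over time.
                for need, rate in NEED_GROWTH_PER_HOUR.items():
                    self.needs[need] += rate * hours

            def choose_activity(self):
                # Pick the activity addressing the currently greatest need; among
                # candidates, take the most effective one (largest reduction).
                greatest = max(self.needs, key=self.needs.get)
                candidates = [name for name, spec in ACTIVITIES.items()
                              if spec["satisfies"] == greatest]
                return max(candidates, key=lambda n: ACTIVITIES[n]["amount"])

            def perform(self, activity):
                spec = ACTIVITIES[activity]
                need = spec["satisfies"]
                self.needs[need] = max(0.0, self.needs[need] - spec["amount"])
                return spec["duration_h"]

        agent, clock_h = Agent(), 0
        while clock_h < 24:                      # one simulated day
            activity = agent.choose_activity()
            duration = agent.perform(activity)
            agent.advance(duration)              # needs keep growing meanwhile
            clock_h += duration
            print(f"hour {clock_h:2d}: {activity:5s} needs={agent.needs}")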

  8. Agent-based evacuation simulation for spatial allocation assessment of urban shelters

    NASA Astrophysics Data System (ADS)

    Yu, Jia; Wen, Jiahong; Jiang, Yong

    2015-12-01

    The construction of urban shelters is one of the most important tasks in urban planning and disaster prevention. Spatial allocation assessment is a fundamental pre-step for the spatial location-allocation of urban shelters. This paper introduces a new method that uses agent-based technology to implement evacuation simulation and thereby conduct dynamic spatial allocation assessment of urban shelters. The method can not only accomplish traditional geospatial evaluation of urban shelters, but also simulate the evacuation process of residents to shelters. The advantage of this method lies in three aspects: (1) the evacuation time of each citizen from a residential building to the shelter can be estimated more reasonably; (2) the total evacuation time of all the residents in a region can be obtained; (3) road congestion during evacuation to shelters can be detected, so that precautionary measures can be taken to prevent potential risks. In this study, three types of agents are designed: shelter agents, government agents and resident agents. Shelter agents select specified land uses as shelter candidates for different disasters. Government agents delimit the service area of each shelter, in other words, regulate which shelter a person should take, in accordance with administrative boundaries and the road distance between the person's position and the location of the shelter. Resident agents have a series of attributes, such as age, position and walking speed. They also have several behaviors, such as reducing speed when walking in a crowd and helping old people and children. Integrating these three types of agents, which are correlated with each other, evacuation procedures can be simulated and dynamic allocation assessment of shelters can be achieved. A case study in Jing'an District, Shanghai, China, was conducted to demonstrate the feasibility of the method. A scenario of earthquake disaster which occurs in nighttime
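
    A minimal sketch of the assignment step described above: each resident agent is matched to a shelter and an individual evacuation time is estimated from distance and walking speed, with the regional total taken as the last arrival. Straight-line distance stands in for the road distances and administrative boundaries used in the paper, and all coordinates and speeds are invented for illustration; congestion effects are omitted.

        import math

        # Hypothetical shelter coordinates (km) and resident walking speeds (km/h).
        shelters = {"park_A": (0.0, 0.0), "school_B": (2.0, 1.0)}
        residents = [
            {"pos": (0.5, 0.2), "speed_kmh": 4.5},
            {"pos": (1.8, 1.1), "speed_kmh": 3.0},   # e.g. an elderly resident
            {"pos": (1.0, 0.9), "speed_kmh": 5.0},
        ]

        def assign_and_time(resident):
            # Straight-line distance is a stand-in for road distance in this sketch.
            x, y = resident["pos"]
            name, dist = min(
                ((s, math.hypot(x - sx, y - sy)) for s, (sx, sy) in shelters.items()),
                key=lambda t: t[1],
            )
            return name, dist / resident["speed_kmh"]

        times = [assign_and_time(r) for r in residents]
        for (shelter, hours), r in zip(times, residents):
            print(f"resident at {r['pos']} -> {shelter}, {hours * 60:.1f} min")
        print(f"total (last arrival): {max(h for _, h in times) * 60:.1f} min")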

  9. Simulating Brain Tumor Heterogeneity with a Multiscale Agent-Based Model: Linking Molecular Signatures, Phenotypes and Expansion Rate

    PubMed Central

    Zhang, Le; Strouthos, Costas G.; Wang, Zhihui; Deisboeck, Thomas S.

    2008-01-01

    We have extended our previously developed 3D multi-scale agent-based brain tumor model to simulate cancer heterogeneity and to analyze its impact across the scales of interest. While our algorithm continues to employ an epidermal growth factor receptor (EGFR) gene-protein interaction network to determine the cells’ phenotype, it now adds an implicit treatment of tumor cell adhesion related to the model’s biochemical microenvironment. We simulate a simplified tumor progression pathway that leads to the emergence of five distinct glioma cell clones with different EGFR density and cell ‘search precisions’. The in silico results show that microscopic tumor heterogeneity can impact the tumor system’s multicellular growth patterns. Our findings further confirm that EGFR density results in the more aggressive clonal populations switching earlier from proliferation-dominated to a more migratory phenotype. Moreover, analyzing the dynamic molecular profile that triggers the phenotypic switch between proliferation and migration, our in silico oncogenomics data display spatial and temporal diversity in documenting the regional impact of tumorigenesis, and thus support the added value of multi-site and repeated assessments in vitro and in vivo. Potential implications from this in silico work for experimental and computational studies are discussed. PMID:20047002

  10. Autonomous Agent-Based Simulation of a Model Simulating the Human Air-Threat Assessment Process

    DTIC Science & Technology

    2004-03-01

    multi-agent system (MAS) technology and is implemented in the Java programming language. This research is a portion of the Red Intent Project, whose goal is to ultimately implement a model to predict the intent of any given track in the environment. For any air track in the simulation, two sets of agents are created: one for controlling track actions and one for predicting its identity and intent based on information received from the track, the geopolitical situation and intelligence. The simulation is also capable of identifying coordinated actions between air tracks. We

  11. An extensible simulation environment and movement metrics for testing walking behavior in agent-based models

    SciTech Connect

    Paul M. Torrens; Atsushi Nara; Xun Li; Haojie Zhu; William A. Griffin; Scott B. Brown

    2012-01-01

    Human movement is a significant ingredient of many social, environmental, and technical systems, yet the importance of movement is often discounted in considering systems complexity. Movement is commonly abstracted in agent-based modeling (which is perhaps the methodological vehicle for modeling complex systems), despite the influence of movement upon information exchange and adaptation in a system. In particular, agent-based models of urban pedestrians often treat movement in proxy form at the expense of faithfully treating movement behavior with realistic agency. There exists little consensus about which method is appropriate for representing movement in agent-based schemes. In this paper, we examine popularly used methods to drive movement in agent-based models, first by introducing a methodology that can flexibly handle many representations of movement at many different scales and second by introducing a suite of tools to benchmark agent movement between models and against real-world trajectory data. We find that most popular movement schemes do a relatively poor job of representing movement, but that some schemes may well be 'good enough' for some applications. We also discuss potential avenues for improving the representation of movement in agent-based frameworks.
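
    The benchmarking idea, comparing simulated agent movement against real-world trajectory data, can be illustrated with two very simple metrics; the actual metric suite in the paper is richer, and the trajectories below are invented.

        import math

        def path_length(traj):
            return sum(math.dist(a, b) for a, b in zip(traj, traj[1:]))

        def mean_displacement_error(sim, obs):
            # Average point-wise distance between time-aligned trajectories.
            assert len(sim) == len(obs), "trajectories must share sampling times"
            return sum(math.dist(a, b) for a, b in zip(sim, obs)) / len(sim)

        # Hypothetical (x, y) samples at equal time steps, not data from the paper.
        observed  = [(0, 0), (1, 0.2), (2, 0.3), (3, 0.8), (4, 1.0)]
        simulated = [(0, 0), (0.9, 0.0), (2.1, 0.5), (2.8, 0.9), (4.2, 1.3)]

        print("mean displacement error:",
              round(mean_displacement_error(simulated, observed), 3))
        print("path-length ratio:",
              round(path_length(simulated) / path_length(observed), 3))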

  12. A Parallel Sliding Region Algorithm to Make Agent-Based Modeling Possible for a Large-Scale Simulation: Modeling Hepatitis C Epidemics in Canada.

    PubMed

    Wong, William W L; Feng, Zeny Z; Thein, Hla-Hla

    2016-11-01

    Agent-based models (ABMs) are computer simulation models that define interactions among agents and simulate emergent behaviors that arise from the ensemble of local decisions. ABMs have been increasingly used to examine trends in infectious disease epidemiology. However, the main limitation of ABMs is the high computational cost for a large-scale simulation. To improve the computational efficiency for large-scale ABM simulations, we built a parallelizable sliding region algorithm (SRA) for ABM and compared it to a nonparallelizable ABM. We developed a complex agent network and performed two simulations to model hepatitis C epidemics based on real demographic data from Saskatchewan, Canada. The first simulation used the SRA, which processed each postal code subregion in turn. The second simulation processed the entire population simultaneously. It was concluded that the parallelizable SRA showed computational time savings with comparable results in a province-wide simulation. Using the same method, the SRA can be generalized for performing a country-wide simulation. Thus, this parallel algorithm enables the possibility of using ABM for large-scale simulation with limited computational resources.
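
    The following sketch illustrates only the core idea of partitioning agents by postal-code subregion and processing the subregions in parallel; it is not the authors' SRA, and in particular it omits the "sliding" boundary handling that the real algorithm needs to capture interactions across neighbouring subregions. The agent data and the per-region update are placeholders.

        from collections import defaultdict
        from concurrent.futures import ProcessPoolExecutor

        # Toy agents: (agent_id, postal_prefix, infected_flag). Illustrative only.
        AGENTS = [(i, f"S7K{i % 4}", i % 11 == 0) for i in range(1000)]

        def simulate_subregion(agents):
            # Stand-in for one time step of within-region dynamics: here we only
            # count infected agents; the real SRA runs the full ABM rules per region.
            infected = sum(1 for _, _, inf in agents if inf)
            return len(agents), infected

        def run_parallel_sra(agents):
            by_region = defaultdict(list)
            for a in agents:
                by_region[a[1]].append(a)
            # Within a step the subregions are independent, so they can run in parallel.
            with ProcessPoolExecutor() as pool:
                results = list(pool.map(simulate_subregion, by_region.values()))
            return dict(zip(by_region, results))

        if __name__ == "__main__":
            for region, (n, infected) in run_parallel_sra(AGENTS).items():
                print(region, "agents:", n, "infected:", infected)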

  13. An Advanced Computational Approach to System of Systems Analysis & Architecting Using Agent-Based Behavioral Model

    DTIC Science & Technology

    2012-09-30

    The Wave Process Model provides a framework for an agent-based modeling methodology, which is used to abstract the non-utopian behavioral aspects... that SoS participants exhibit nominal behavior (utopian behavior), but deviation from nominal motivation leads to complications and disturbances in... characteristics. Figure 2 outlines the agent architecture components and flow of information among components: Own Process Control (OPC), Cooperation

  14. Agent based simulation on the process of human flesh search-From perspective of knowledge and emotion

    NASA Astrophysics Data System (ADS)

    Zhu, Hou; Hu, Bin

    2017-03-01

    Human flesh search, as a new form of online crowd behavior, can on the one hand help us find special information, and on the other hand may lead to privacy leaks and violations of human rights. In order to study the mechanism of human flesh search (HFS), this paper proposes a simulation model based on an agent-based model and complex networks. The computational experiments show some useful results. Discovered information quantity and the ratio of involved persons are highly correlated, and most net citizens will either take part in the human flesh search or stay out of it entirely. Knowledge quantity does not influence the ratio of involved persons, but it does influence whether HFS can find the target person. When knowledge is concentrated on hub nodes, the discovered information quantity is either perfect or almost zero. The emotion of net citizens influences both the discovered information quantity and the ratio of involved persons. Concretely, when net citizens face the search topic calmly, the target is hardly ever found; but when net citizens are agitated, the target is found easily.

  15. Agent-based modeling of the spread of influenza-like illness in an emergency department: a simulation study.

    PubMed

    Laskowski, Marek; Demianyk, Bryan C P; Witt, Julia; Mukhi, Shamir N; Friesen, Marcia R; McLeod, Robert D

    2011-11-01

    The objective of this paper was to develop an agent-based modeling framework in order to simulate the spread of influenza virus infection on a layout based on a representative hospital emergency department in Winnipeg, Canada. In doing so, the study complements mathematical modeling techniques for disease spread, as well as modeling applications focused on the spread of antibiotic-resistant nosocomial infections in hospitals. Twenty different emergency department scenarios were simulated, with further simulation of four infection control strategies. The agent-based modeling approach represents systems modeling, in which the emergency department was modeled as a collection of agents (patients and healthcare workers) and their individual characteristics, behaviors, and interactions. The framework was coded in C++ using Qt4 libraries running under the Linux operating system. A simple ordinary least squares (OLS) regression was used to analyze the data, in which the percentage of patients that became infected in one day within the simulation was the dependent variable. The results suggest that within the given instance context, patient-oriented infection control policies (alternate treatment streams, masking symptomatic patients) tend to have a larger effect than policies that target healthcare workers. The agent-based modeling framework is a flexible tool that can be made to reflect any given environment; it is also a decision support tool for practitioners and policymakers to assess the relative impact of infection control strategies. The framework illuminates scenarios worthy of further investigation, as well as counterintuitive findings.

  16. Proposal of Classification Method of Time Series Data in International Emissions Trading Market Using Agent-based Simulation

    NASA Astrophysics Data System (ADS)

    Nakada, Tomohiro; Takadama, Keiki; Watanabe, Shigeyoshi

    This paper proposes a classification method that uses a Bayesian analytical method to classify time series data from the international emissions trading market generated by agent-based simulation, and compares it with a Discrete Fourier Transform analytical method. The purpose is to demonstrate analytical methods that map time series data such as market prices. These analytical methods have revealed the following results: (1) the classification methods express the time series data as distances in a mapping, which are easier to understand and draw inferences from than the raw time series; (2) these methods can analyze uncertain time series data, including stationary and non-stationary processes, using distances derived via agent-based simulation; and (3) the Bayesian analytical method can describe a 1% difference in the emission reduction targets of agents.
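
    A compact way to see the "mapping to distances" idea is to reduce each simulated price series to a few Discrete Fourier Transform magnitudes and measure Euclidean distances between the resulting feature vectors, as sketched below. The Bayesian mapping compared in the paper is not reproduced, and the three example series are invented.

        import cmath
        import math

        def dft_magnitudes(series, k_max=4):
            # Magnitudes of the first k_max non-constant Fourier coefficients,
            # used here as a compact "map" of each price series.
            n = len(series)
            return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                            for t, x in enumerate(series))) / n
                    for k in range(1, k_max + 1)]

        def distance(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

        # Hypothetical market-price trajectories from three simulation runs.
        runs = {
            "stable":      [10 + 0.1 * t for t in range(32)],
            "oscillating": [10 + 2 * math.sin(t / 3) for t in range(32)],
            "shock":       [10] * 16 + [14] * 16,
        }

        features = {name: dft_magnitudes(s) for name, s in runs.items()}
        names = list(runs)
        for i, a in enumerate(names):
            for b in names[i + 1:]:
                print(f"d({a}, {b}) = {distance(features[a], features[b]):.3f}")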

  17. Simulating Land-Use Change using an Agent-Based Land Transaction Model

    NASA Astrophysics Data System (ADS)

    Bakker, M. M.; van Dijk, J.; Alam, S. J.

    2013-12-01

    In the densely populated cultural landscapes of Europe, the vast majority of all land is owned by private parties, be it farmers (the majority), nature organizations, property developers, or citizens. Therewith, the vast majority of all land-use change arises from land transactions between different owner types: successful farms expand at the expense of less successful farms, and meanwhile property developers, individual citizens, and nature organizations also actively purchase land. These land transactions are driven by specific properties of the land, by governmental policies, and by the (economic) motives of both buyers and sellers. Climate/global change can affect these drivers at various scales: at the local scale changes in hydrology can make certain land less or more desirable; at the global scale the agricultural markets will affect motives of farmers to buy or sell land; while at intermediate (e.g. provincial) scales property developers and nature conservationists may be encouraged or discouraged to purchase land. The cumulative result of all these transactions becomes manifest in changing land-use patterns, and consequent environmental responses. Within the project Climate Adaptation for Rural Areas an agent-based land-use model was developed that explores the future response of individual land users to climate change, within the context of wider global change (i.e. policy and market change). It simulates the exchange of land among farmers and between farmers and nature organizations and property developers, for a specific case study area in the east of the Netherlands. Results show that local impacts of climate change can result in a relative stagnation in the land market in waterlogged areas. Furthermore, the increase in dairying at the expense of arable cultivation - as has been observed in the area in the past - is slowing down as arable produce shows a favourable trend in the agricultural world market. Furthermore, budgets for nature managers are

  18. Impact of Different Policies on Unhealthy Dietary Behaviors in an Urban Adult Population: An Agent-Based Simulation Model

    PubMed Central

    Giabbanelli, Philippe J.; Arah, Onyebuchi A.; Zimmerman, Frederick J.

    2014-01-01

    Objectives. Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. Methods. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Results. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Conclusions. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems. PMID:24832414

  19. On-lattice agent-based simulation of populations of cells within the open-source Chaste framework.

    PubMed

    Figueredo, Grazziela P; Joshi, Tanvi V; Osborne, James M; Byrne, Helen M; Owen, Markus R

    2013-04-06

    Over the years, agent-based models have been developed that combine cell division and reinforced random walks of cells on a regular lattice; reaction-diffusion equations for nutrients and growth factors; and ordinary differential equations for the subcellular networks regulating the cell cycle. When linked to a vascular layer, this multiple scale model framework has been applied to tumour growth and therapy. Here, we report on the creation of an agent-based multi-scale environment amalgamating the characteristics of these models within a Virtual Physiological Human (VPH) Exemplar Project. This project enables reuse, integration, expansion and sharing of the model and relevant data. The agent-based and reaction-diffusion parts of the multi-scale model have been implemented and are available for download as part of the latest public release of Chaste (Cancer, Heart and Soft Tissue Environment; http://www.cs.ox.ac.uk/chaste/), part of the VPH Toolkit (http://toolkit.vph-noe.eu/). The environment functionalities are verified against the original models, in addition to extra validation of all aspects of the code. In this work, we present the details of the implementation of the agent-based environment, including the system description, the conceptual model, the development of the simulation model and the processes of verification and validation of the simulation results. We explore the potential use of the environment by presenting exemplar applications of the 'what if' scenarios that can easily be studied in the environment. These examples relate to tumour growth, cellular competition for resources and tumour responses to hypoxia (low oxygen levels). We conclude our work by summarizing the future steps for the expansion of the current system.

  20. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE PAGES

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. Moreover, the four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translating complex system dynamics models into agent-based alternatives that are both conceptually simpler and capable of capturing the main effects of complex local agent-agent interactions.
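
    The Theory of Planned Behavior component described above can be caricatured as follows: each agent's chance of acting healthily is a logistic function of its own attitude plus the social norm averaged over its network neighbours, and norms diffuse by local averaging. The network size, weights and update rules below are assumptions made for the sketch, not values from the DEVS models.

        import math
        import random

        random.seed(1)

        N = 30
        # Small random social network (adjacency lists); structure is assumed.
        neighbors = {i: random.sample([j for j in range(N) if j != i], 4)
                     for i in range(N)}
        attitude = {i: random.uniform(-1, 1) for i in range(N)}   # individual attitude
        norm     = {i: random.uniform(-1, 1) for i in range(N)}   # perceived norm

        W_ATTITUDE, W_NORM = 1.5, 1.0   # assumed weights, not fitted values

        def healthy_probability(i):
            # Joint function of own attitude and the norms over the social network.
            local_norm = sum(norm[j] for j in neighbors[i]) / len(neighbors[i])
            z = W_ATTITUDE * attitude[i] + W_NORM * local_norm
            return 1.0 / (1.0 + math.exp(-z))

        for step in range(5):
            # Norms diffuse: each agent drifts toward its neighbours' average norm.
            norm = {i: 0.5 * norm[i] +
                       0.5 * sum(norm[j] for j in neighbors[i]) / len(neighbors[i])
                    for i in range(N)}
            healthy = sum(random.random() < healthy_probability(i) for i in range(N))
            print(f"step {step}: {healthy}/{N} agents act healthily")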

  2. The effects of social interactions on fertility decline in nineteenth-century France: an agent-based simulation experiment.

    PubMed

    González-Bailón, Sandra; Murphy, Tommy E

    2013-07-01

    We built an agent-based simulation, incorporating geographic and demographic data from nineteenth-century France, to study the role of social interactions in fertility decisions. The simulation made experimentation possible in a context where other empirical strategies were precluded by a lack of data. We evaluated how different decision rules, with and without interdependent decision-making, caused variations in population growth and fertility levels. The analyses show that incorporating social influence into the model allows empirically observed behaviour to be mimicked, especially at a national level. These findings shed light on individual-level mechanisms through which the French demographic transition may have developed.

  3. Multi-Agent Based Simulation of Optimal Urban Land Use Allocation in the Middle Reaches of the Yangtze River, China

    NASA Astrophysics Data System (ADS)

    Zeng, Y.; Huang, W.; Jin, W.; Li, S.

    2016-06-01

    The optimization of land-use allocation is one of the important approaches to achieving regional sustainable development. This study selects the Chang-Zhu-Tan agglomeration as the study area and proposes a new land use optimization allocation model. Using a multi-agent based simulation model, future urban land use allocation was simulated for 2020 and 2030 under three different scenarios. This kind of quantitative information about optimal urban land use allocation and future urban expansion would be of great interest for urban planning, water and land resource management, and climate change research.

  4. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department

    PubMed Central

    Kittipittayakorn, Cholada

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department. PMID:27195606

  5. Diffusion dynamics and concentration of toxic materials from quantum dots-based nanotechnologies: an agent-based modeling simulation framework

    NASA Astrophysics Data System (ADS)

    Agusdinata, Datu Buyung; Amouie, Mahbod; Xu, Tao

    2015-01-01

    Due to their favorable electrical and optical properties, quantum dot (QD) nanostructures have found numerous applications, including nanomedicine and photovoltaic cells. However, increased future production, use, and disposal of engineered QD products also raise concerns about their potential environmental impacts. The objective of this work is to establish a modeling framework for predicting the diffusion dynamics and concentration of toxic materials released from trioctylphosphine oxide-capped CdSe. To this end, an agent-based model simulation with reaction kinetics and Brownian motion dynamics was developed. Reaction kinetics is used to model the stability of the surface capping agent, particularly due to the oxidation process. The diffusion of toxic Cd2+ ions in an aquatic environment was simulated using an adapted Brownian motion algorithm. A calibrated parameter to reflect sensitivity to reaction rate is proposed. The model output demonstrates the stochastic spatial distribution of toxic Cd2+ ions under different values of proxy environmental factor parameters. With oxidation as the only chemistry considered, the simulation was able to replicate Cd2+ ion release from thiol-capped QDs in aerated water. The agent-based method is the first to be developed in the QD application domain. It combines the simplicity of modeling the solubility and release rate of Cd2+ ions with the complexity of tracking individual Cd atoms at the same time.
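
    A minimal sketch of the two coupled mechanisms named in the abstract, first-order release of Cd2+ ions as the capping layer oxidises and Brownian motion of the released ions, is given below. The rate constant, diffusion coefficient and particle counts are arbitrary illustrative values, not parameters from the study.

        import math
        import random

        random.seed(42)

        D = 1.0e-9        # assumed diffusion coefficient, m^2/s (illustrative)
        K_OX = 1.0e-3     # assumed first-order oxidation/release rate, 1/s
        DT = 1.0          # time step, s
        N_QD = 200        # capped quantum dots, each holding one releasable ion

        free_ions = []                      # positions (x, y) of released ions, m
        still_capped = N_QD
        sigma = math.sqrt(2 * D * DT)       # std. dev. of a Brownian step per axis

        for step in range(600):
            # Reaction kinetics: first-order release as the capping layer oxidises.
            released = sum(random.random() < K_OX * DT for _ in range(still_capped))
            still_capped -= released
            free_ions.extend([(0.0, 0.0)] * released)   # ions start at the QD surface

            # Brownian motion of every free ion.
            free_ions = [(x + random.gauss(0, sigma), y + random.gauss(0, sigma))
                         for x, y in free_ions]

        rms = (sum(x * x + y * y for x, y in free_ions) / len(free_ions)) ** 0.5
        print(f"released ions: {len(free_ions)}, RMS distance: {rms:.2e} m")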

  6. Linking Bayesian and Agent-Based Models to Simulate Complex Social-Ecological Systems in the Sonoran Desert

    NASA Astrophysics Data System (ADS)

    Pope, A.; Gimblett, R.

    2013-12-01

    Interdependencies of ecologic, hydrologic, and social systems challenge traditional approaches to natural resource management in semi-arid regions. As a complex social-ecological system, water demands in the Sonoran Desert from agricultural and urban users often conflict with water needs for its ecologically-significant riparian corridors. To explore this system, we developed an agent-based model to simulate complex feedbacks between human decisions and environmental conditions. Cognitive mapping in conjunction with stakeholder participation produced a Bayesian model of conditional probabilities of local human decision-making processes resulting in changes in water demand. Probabilities created in the Bayesian model were incorporated into the agent-based model, so that each agent had a unique probability to make a positive decision based on its perceived environment at each point in time and space. By using a Bayesian approach, uncertainty in the human decision-making process could be incorporated. The spatially-explicit agent-based model simulated changes in depth-to-groundwater by well pumping based on an agent's water demand. Depth-to-groundwater was then used as an indicator of unique vegetation guilds within the riparian corridor. Each vegetation guild provides varying levels of ecosystem services, the changes of which, along with changes in depth-to-groundwater, feed back to influence agent behavior. Using this modeling approach allowed us to examine the resilience of semi-arid riparian corridors and agent behavior under various scenarios. The insight provided by the model contributes to understanding how specific interventions may alter the complex social-ecological system in the future.
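
    The coupling described above, Bayesian conditional probabilities feeding individual agent decisions, can be sketched as a lookup of a decision probability keyed on each agent's perceived environment. The table entries and agent classes below are invented and stand in for the stakeholder-derived Bayesian network.

        import random

        random.seed(7)

        # Conditional probabilities that a farmer agent decides to pump more
        # groundwater, keyed by its perceived state. Values are illustrative.
        P_PUMP = {
            ("shallow", "high_price"): 0.85,
            ("shallow", "low_price"):  0.55,
            ("deep",    "high_price"): 0.40,
            ("deep",    "low_price"):  0.15,
        }

        class FarmerAgent:
            def __init__(self, name, depth_class, price_class):
                self.name = name
                self.perceived = (depth_class, price_class)

            def decide(self):
                # Each agent draws against the probability for its perceived state.
                return random.random() < P_PUMP[self.perceived]

        agents = [
            FarmerAgent("farm_1", "shallow", "high_price"),
            FarmerAgent("farm_2", "deep", "high_price"),
            FarmerAgent("farm_3", "deep", "low_price"),
        ]
        for a in agents:
            print(a.name, "pumps more" if a.decide() else "holds steady")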

  7. Acceptability of an Embodied Conversational Agent-based Computer Application for Hispanic Women

    PubMed Central

    Wells, Kristen J.; Vázquez-Otero, Coralia; Bredice, Marissa; Meade, Cathy D.; Chaet, Alexis; Rivera, Maria I.; Arroyo, Gloria; Proctor, Sara K.; Barnes, Laura E.

    2015-01-01

    There are few Spanish language interactive, technology-driven health education programs. Objectives of this feasibility study were to: 1) learn more about computer and technology usage among Hispanic women living in a rural community; and 2) evaluate acceptability of the concept of using an embodied conversational agent (ECA) computer application among this population. A survey about computer usage history and interest in computers was administered to a convenience sample of 26 women. A sample video prototype of a hospital discharge ECA was shown, followed by questions to gauge opinion about the ECA. Data indicate women exhibited both a high level of computer experience and enthusiasm for the ECA. Feedback from the community is essential to ensure equity in state-of-the-art dissemination of health information. PMID:26671558

  8. Can human-like Bots control collective mood: agent-based simulations of online chats

    NASA Astrophysics Data System (ADS)

    Tadić, Bosiljka; Šuvakov, Milovan

    2013-10-01

    Using an agent-based modeling approach, in this paper, we study self-organized dynamics of interacting agents in the presence of chat Bots. Different Bots with tunable ‘human-like’ attributes, which exchange emotional messages with agents, are considered, and the collective emotional behavior of agents is quantitatively analyzed. In particular, using detrended fractal analysis we determine persistent fluctuations and temporal correlations in time series of agent activity and statistics of avalanches carrying emotional messages of agents when Bots favoring positive/negative affects are active. We determine the impact of Bots and identify parameters that can modulate that impact. Our analysis suggests that, by these measures, the emotional Bots induce collective emotion among interacting agents by suitably altering the fractal characteristics of the underlying stochastic process. Positive emotion Bots are slightly more effective than negative emotion Bots. Moreover, Bots which periodically alternate between positive and negative emotion can enhance fluctuations in the system, leading to avalanches of agent messages that are reminiscent of self-organized critical states.
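
    For readers unfamiliar with the detrended analysis mentioned above, the sketch below is a generic first-order detrended fluctuation analysis (DFA) that estimates the scaling exponent alpha of a time series (alpha near 0.5 indicates uncorrelated fluctuations, larger values indicate persistence). It is a textbook-style implementation, not the authors' analysis code, and the window sizes and test signal are arbitrary.

        import math
        import random

        def dfa_exponent(series, window_sizes=(8, 16, 32, 64)):
            """First-order DFA; returns the scaling exponent alpha."""
            mean = sum(series) / len(series)
            profile, s = [], 0.0
            for x in series:                       # integrated, mean-subtracted signal
                s += x - mean
                profile.append(s)

            log_n, log_f = [], []
            for n in window_sizes:
                residuals = []
                for start in range(0, len(profile) - n + 1, n):
                    window = profile[start:start + n]
                    t = list(range(n))
                    t_mean, w_mean = (n - 1) / 2.0, sum(window) / n
                    slope = (sum((ti - t_mean) * (wi - w_mean)
                                 for ti, wi in zip(t, window))
                             / sum((ti - t_mean) ** 2 for ti in t))
                    intercept = w_mean - slope * t_mean
                    residuals.extend((wi - (slope * ti + intercept)) ** 2
                                     for ti, wi in zip(t, window))
                log_n.append(math.log(n))
                log_f.append(math.log(math.sqrt(sum(residuals) / len(residuals))))

            # Least-squares slope of log F(n) versus log n gives alpha.
            ln_mean, lf_mean = sum(log_n) / len(log_n), sum(log_f) / len(log_f)
            return (sum((a - ln_mean) * (b - lf_mean) for a, b in zip(log_n, log_f))
                    / sum((a - ln_mean) ** 2 for a in log_n))

        random.seed(0)
        white_noise = [random.gauss(0, 1) for _ in range(1024)]
        print(f"alpha (white noise, expect about 0.5): {dfa_exponent(white_noise):.2f}")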

  9. Experimental Evaluation of Suitability of Selected Multi-Criteria Decision-Making Methods for Large-Scale Agent-Based Simulations

    PubMed Central

    2016-01-01

    Multi-criteria decision-making (MCDM) can be formally implemented by various methods. This study compares the suitability of four selected MCDM methods, namely WPM, TOPSIS, VIKOR, and PROMETHEE, for future applications in agent-based computational economic (ACE) models of larger scale (i.e., over 10 000 agents in one geographical region). These four MCDM methods were selected according to their appropriateness for computational processing in ACE applications. Tests of the selected methods were conducted on four hardware configurations. For each method, 100 tests were performed, which represented one testing iteration. With four testing iterations conducted on each hardware setting and separate testing of all configurations with the -server parameter deactivated or activated, altogether 12 800 data points were collected and subsequently analyzed. An illustrative decision-making scenario was used which allows the mutual comparison of all of the selected decision-making methods. Our test results suggest that although all methods are convenient and can be used in practice, the VIKOR method accomplished the tests with the best results and thus can be recommended as the most suitable for simulations of large-scale agent-based models. PMID:27806061
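
    As a reminder of what one of the four compared methods computes, the sketch below is a plain TOPSIS ranking, chosen here only because it is compact; it is not the benchmark code from the study, and the decision matrix, weights and criteria directions are illustrative.

        import math

        def topsis(matrix, weights, benefit):
            """TOPSIS ranking. matrix[i][j]: alternative i on criterion j;
            benefit[j]: True if larger values are better on criterion j."""
            m, n = len(matrix), len(matrix[0])
            # Vector-normalise each criterion column, then apply weights.
            norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m)))
                     for j in range(n)]
            v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
                 for i in range(m)]
            # Ideal and anti-ideal points per criterion.
            best = [max(v[i][j] for i in range(m)) if benefit[j]
                    else min(v[i][j] for i in range(m)) for j in range(n)]
            worst = [min(v[i][j] for i in range(m)) if benefit[j]
                     else max(v[i][j] for i in range(m)) for j in range(n)]
            scores = []
            for i in range(m):
                d_best = math.sqrt(sum((v[i][j] - best[j]) ** 2 for j in range(n)))
                d_worst = math.sqrt(sum((v[i][j] - worst[j]) ** 2 for j in range(n)))
                scores.append(d_worst / (d_best + d_worst))  # closeness to the ideal
            return scores

        # Hypothetical agent choice: three suppliers scored on price (lower better),
        # quality and delivery speed (higher better). Data and weights are invented.
        alternatives = [[250, 7, 5],
                        [200, 6, 8],
                        [300, 9, 6]]
        scores = topsis(alternatives, weights=[0.5, 0.3, 0.2],
                        benefit=[False, True, True])
        for name, s in zip(["supplier_A", "supplier_B", "supplier_C"], scores):
            print(name, round(s, 3))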

  10. An Economic Analysis of Strategies to Control Clostridium Difficile Transmission and Infection Using an Agent-Based Simulation Model

    PubMed Central

    Nelson, Richard E.; Jones, Makoto; Leecaster, Molly; Samore, Matthew H.; Ray, William; Huttner, Angela; Huttner, Benedikt; Khader, Karim; Stevens, Vanessa W.; Gerding, Dale; Schweizer, Marin L.; Rubin, Michael A.

    2016-01-01

    Background A number of strategies exist to reduce Clostridium difficile (C. difficile) transmission. We conducted an economic evaluation of “bundling” these strategies together. Methods We constructed an agent-based computer simulation of nosocomial C. difficile transmission and infection in a hospital setting. This model included the following components: interactions between patients and health care workers; room contamination via C. difficile shedding; C. difficile hand carriage and removal via hand hygiene; patient acquisition of C. difficile via contact with contaminated rooms or health care workers; and patient antimicrobial use. Six interventions were introduced alone and "bundled" together: (a) aggressive C. difficile testing; (b) empiric isolation and treatment of symptomatic patients; (c) improved adherence to hand hygiene and (d) contact precautions; (e) improved use of soap and water for hand hygiene; and (f) improved environmental cleaning. Our analysis compared these interventions using values representing 3 different scenarios: (1) base-case (BASE) values that reflect typical hospital practice, (2) intervention (INT) values that represent implementation of hospital-wide efforts to reduce C. diff transmission, and (3) optimal (OPT) values representing the highest expected results from strong adherence to the interventions. Cost parameters for each intervention were obtained from published literature. We performed our analyses assuming low, normal, and high C. difficile importation prevalence and transmissibility of C. difficile. Results INT levels of the “bundled” intervention were cost-effective at a willingness-to-pay threshold of $100,000/quality-adjusted life-year in all importation prevalence and transmissibility scenarios. OPT levels of intervention were cost-effective for normal and high importation prevalence and transmissibility scenarios. When analyzed separately, hand hygiene compliance, environmental decontamination, and empiric

  11. [Research on multi-agent based modeling and simulation of hospital system].

    PubMed

    Zhao, Junping; Yang, Hongqiao; Guo, Huayuan; Li, Yi; Zhang, Zhenjiang; Li, Shuzhang

    2010-12-01

    In this paper, the theory of complex adaptive systems (CAS) and its modeling method are introduced, and the complex characteristics of the hospital system are analyzed. Agile manufacturing and cell reconstruction technologies are used to reconstruct the hospital system. We then set forth an approach for simulating the hospital system based on Multi-Agent technology and the High Level Architecture (HLA). Finally, an HLA-based simulation framework for the hospital system is presented.

  12. Geometry of behavioral spaces: A computational approach to analysis and understanding of agent based models and agent behaviors

    NASA Astrophysics Data System (ADS)

    Cenek, Martin; Dahl, Spencer K.

    2016-11-01

    Systems with non-linear dynamics frequently exhibit emergent system behavior, which is important to find and specify rigorously to understand the nature of the modeled phenomena. Through this analysis, it is possible to characterize phenomena such as how systems assemble or dissipate and what behaviors lead to specific final system configurations. Agent Based Modeling (ABM) is one of the modeling techniques used to study the interaction dynamics between a system's agents and its environment. Although the methodology of ABM construction is well understood and practiced, there are no computational, statistically rigorous, comprehensive tools to evaluate an ABM's execution. Often, a human has to observe an ABM's execution in order to analyze how the ABM functions, identify the emergent processes in the agent's behavior, or study a parameter's effect on the system-wide behavior. This paper introduces a new statistically based framework to automatically analyze agents' behavior, identify common system-wide patterns, and record the probability of agents changing their behavior from one pattern of behavior to another. We use network based techniques to analyze the landscape of common behaviors in an ABM's execution. Finally, we test the proposed framework with a series of experiments featuring increasingly emergent behavior. The proposed framework will allow computational comparison of ABM executions, exploration of a model's parameter configuration space, and identification of the behavioral building blocks in a model's dynamics.

  13. Agent Based Modeling and Simulation Framework for Supply Chain Risk Management

    DTIC Science & Technology

    2012-03-01

    timed Petri net based simulation (Tuncel and Alpan 2010), and Monte Carlo (White 1995, Wu and Olson 2008, and Schmitt and Singh 2009). More detail... benefit costs (Li and Li 2008). Chen, Zhou, and Hu propose an agent-oriented Petri net model for an inventory-scheduling model, with focus on the... problems of analysis and modeling of multi-agent systems. The Petri net aims at researching the organization structure and dynamic behavior of a system

  14. Agent-Based Simulations of Malaria Transmissions with Applications to a Study Site in Thailand

    NASA Technical Reports Server (NTRS)

    Kiang, Richard K.; Adimi, Farida; Zollner, Gabriela E.; Coleman, Russell E.

    2006-01-01

    The dynamics of malaria transmission are driven by environmental, biotic and socioeconomic factors. Because of the geographic dependency of these factors and the complex interactions among them, it is difficult to generalize the key factors that perpetuate or intensify malaria transmission. Methods: Discrete event simulations were used for modeling the detailed interactions among the vector life cycle, sporogonic cycle and human infection cycle, under the explicit influences of selected extrinsic and intrinsic factors. Meteorological and environmental parameters may be derived from satellite data. The output of the model includes the individual infection status and the quantities normally observed in field studies, such as mosquito biting rates, sporozoite infection rates, gametocyte prevalence and incidence. Results were compared with mosquito vector and human malaria data acquired over 4.5 years (June 1999 - January 2004) in Kong Mong Tha, a remote village in Kanchanaburi Province, western Thailand. Results: Three years of transmissions of vivax and falciparum malaria were simulated for a hypothetical hamlet with approximately 1/7 of the study site population. The model generated results for a number of scenarios, including applications of larvicide and insecticide, asymptomatic cases receiving or not receiving treatment, blocking malaria transmission in mosquito vectors, and increasing the density of farm (host) animals in the hamlet. Transmission characteristics and trends in the simulated results are comparable to actual data collected at the study site.
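
    The phrase "discrete event simulations" above refers to the standard future-event-list pattern. The sketch below shows that pattern with two invented event types (infectious bites and the end of incubation) and arbitrary rates; it omits the vector life cycle, sporogonic cycle, treatment and environmental drivers that the actual model tracks.

        import heapq
        import random

        random.seed(5)

        # Minimal discrete-event loop: events are (time_days, kind, person_id) tuples
        # held in a priority queue. All rates and probabilities are assumed values.
        BITE_INTERVAL = 3.0    # mean days between infectious bites per person
        INCUBATION = 12.0      # days from infectious bite to becoming infectious
        N_PEOPLE = 50

        events = []            # the future-event list
        infected = set()

        for person in range(N_PEOPLE):
            heapq.heappush(events,
                           (random.expovariate(1 / BITE_INTERVAL), "bite", person))

        while events:
            t, kind, person = heapq.heappop(events)
            if t > 90:                             # simulate roughly three months
                break
            if kind == "bite":
                if person not in infected and random.random() < 0.3:  # assumed prob.
                    heapq.heappush(events, (t + INCUBATION, "becomes_infectious",
                                            person))
                # Schedule the person's next infectious bite.
                heapq.heappush(events, (t + random.expovariate(1 / BITE_INTERVAL),
                                        "bite", person))
            elif kind == "becomes_infectious":
                infected.add(person)

        print(f"infectious people after 90 days: {len(infected)}/{N_PEOPLE}")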

  15. Bee++: An Object-Oriented, Agent-Based Simulator for Honey Bee Colonies.

    PubMed

    Betti, Matthew; LeClair, Josh; Wahl, Lindi M; Zamir, Mair

    2017-03-10

    We present a model and associated simulation package (www.beeplusplus.ca) to capture the natural dynamics of a honey bee colony in a spatially-explicit landscape, with temporally-variable, weather-dependent parameters. The simulation tracks bees of different ages and castes, food stores within the colony, pollen and nectar sources and the spatial position of individual foragers outside the hive. We track explicitly the intake of pesticides in individual bees and their ability to metabolize these toxins, such that the impact of sub-lethal doses of pesticides can be explored. Moreover, pathogen populations (in particular, Nosema apis, Nosema ceranae and Varroa mites) have been included in the model and may be introduced at any time or location. The ability to study interactions among pesticides, climate, biodiversity and pathogens in this predictive framework should prove useful to a wide range of researchers studying honey bee populations. To this end, the simulation package is written in open source, object-oriented code (C++) and can be easily modified by the user. Here, we demonstrate the use of the model by exploring the effects of sub-lethal pesticide exposure on the flight behaviour of foragers.

  17. An Empirical Agent-Based Model to Simulate the Adoption of Water Reuse Using the Social Amplification of Risk Framework.

    PubMed

    Kandiah, Venu; Binder, Andrew R; Berglund, Emily Z

    2017-01-11

    Water reuse can serve as a sustainable alternative water source for urban areas. However, the successful implementation of large-scale water reuse projects depends on community acceptance. Because of the negative perceptions that are traditionally associated with reclaimed water, water reuse is often not considered in the development of urban water management plans. This study develops a simulation model for understanding community opinion dynamics surrounding the issue of water reuse, and how individual perceptions evolve within that context, which can help in the planning and decision-making process. Based on the social amplification of risk framework, our agent-based model simulates consumer perceptions, discussion patterns, and their adoption or rejection of water reuse. The model is based on the "risk publics" model, an empirical approach that uses the concept of belief clusters to explain the adoption of new technology. Each household is represented as an agent, and parameters that define their behavior and attributes are defined from survey data. Community-level parameters, including social groups, relationships, and communication variables (also from survey data), are encoded to simulate the social processes that influence community opinion. The model demonstrates its capabilities to simulate opinion dynamics and consumer adoption of water reuse. In addition, based on empirical data, the model is applied to investigate water reuse behavior in different regions of the United States. Importantly, our results reveal that public opinion dynamics emerge differently based on membership in opinion clusters, frequency of discussion, and the structure of social networks.
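
    The household-level opinion dynamics described above can be caricatured as follows: households start in belief clusters, occasionally discuss water reuse with social contacts, shift their opinion toward the discussion partner, and count as adopters once their opinion crosses a threshold. The cluster means, discussion probability, influence weight, threshold and network below are assumptions for the sketch, not the survey-derived parameters of the model.

        import random

        random.seed(3)

        N_HOUSEHOLDS = 100
        DISCUSSION_PROB = 0.3    # chance a household discusses reuse each step (assumed)
        INFLUENCE = 0.2          # shift toward a discussion partner's view (assumed)
        ADOPT_THRESHOLD = 0.6    # opinion level counted as adoption (assumed)

        # Illustrative belief clusters with different starting attitudes.
        CLUSTER_MEAN = {"supportive": 0.7, "undecided": 0.5, "sceptical": 0.25}

        households = []
        for i in range(N_HOUSEHOLDS):
            cluster = random.choice(list(CLUSTER_MEAN))
            opinion = min(1.0, max(0.0, random.gauss(CLUSTER_MEAN[cluster], 0.1)))
            households.append({"cluster": cluster, "opinion": opinion})

        # Simple random social network: each household knows a handful of others.
        contacts = {i: random.sample([j for j in range(N_HOUSEHOLDS) if j != i], 5)
                    for i in range(N_HOUSEHOLDS)}

        for step in range(50):
            for i, h in enumerate(households):
                if random.random() < DISCUSSION_PROB:
                    partner = households[random.choice(contacts[i])]
                    # Discussion pulls the household's opinion toward its partner's.
                    h["opinion"] += INFLUENCE * (partner["opinion"] - h["opinion"])

        adopters = sum(h["opinion"] > ADOPT_THRESHOLD for h in households)
        print(f"adopting households after 50 steps: {adopters}/{N_HOUSEHOLDS}")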

  18. Real-Time Agent-Based Modeling Simulation with in-situ Visualization of Complex Biological Systems: A Case Study on Vocal Fold Inflammation and Healing.

    PubMed

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y K

    2016-05-01

    We present an efficient and scalable scheme for implementing agent-based modeling (ABM) simulation with In Situ visualization of large complex systems on heterogeneous computing platforms. The scheme is designed to make optimal use of the resources available on a heterogeneous platform consisting of a multicore CPU and a GPU, resulting in minimal to no resource idle time. Furthermore, the scheme was implemented under a client-server paradigm that enables remote users to visualize and analyze simulation data as it is being generated at each time step of the model. Performance of a simulation case study of vocal fold inflammation and wound healing with 3.8 million agents shows 35× and 7× speedup in execution time over single-core and multi-core CPU respectively. Each iteration of the model took less than 200 ms to simulate, visualize and send the results to the client. This enables users to monitor the simulation in real-time and modify its course as needed.

  19. The contribution of agent-based simulations to conservation management on a Natura 2000 site.

    PubMed

    Dupont, Hélène; Gourmelon, Françoise; Rouan, Mathias; Le Viol, Isabelle; Kerbiriou, Christian

    2016-03-01

    The conservation of biodiversity today must include the participation and support of local stakeholders. Natura 2000 can be considered a conservation system that, in its application in most EU countries, relies on the participation of local stakeholders. Our study proposes a scientific method for participatory modelling, with the aim of contributing to the conservation management of habitats and species at a Natura 2000 site (Crozon Peninsula, Bretagne, France) that is representative of land-use changes in coastal areas. We make use of companion modelling and its associated tools (scenario planning, GIS, multi-agent modelling and simulations) to consider possible futures through the co-construction of management scenarios and the understanding of their consequences on different indicators of biodiversity status (habitats, avifauna, flora). Maintaining human activities as they have been carried out since the creation of the Natura 2000 zone allows the biodiversity values to remain stable. Extensive agricultural activities have been shown to be essential to this maintenance, whereas management sustained by the multiplication of conservation actions brings about variable results according to the indicators. None of the scenarios has a positive incidence on the full set of indicators. However, an understanding of the modelling system and the results of the simulations allows the selection of conservation actions to be refined in relation to the species to be preserved.

  20. Evolutionary Agent-Based Simulation of the Introduction of New Technologies in Air Traffic Management

    NASA Technical Reports Server (NTRS)

    Yliniemi, Logan; Agogino, Adrian K.; Tumer, Kagan

    2014-01-01

    Accurate simulation of the effects of integrating new technologies into a complex system is critical to the modernization of our antiquated air traffic system, where there exist many layers of interacting procedures, controls, and automation all designed to cooperate with human operators. Additions of even simple new technologies may result in unexpected emergent behavior due to complex human/machine interactions. One approach is to create high-fidelity human models coming from the field of human factors that can simulate a rich set of behaviors. However, such models are difficult to produce, especially to show unexpected emergent behavior coming from many human operators interacting simultaneously within a complex system. Instead of engineering complex human models, we directly model the emergent behavior by evolving goal-directed agents, representing human users. Using evolution we can predict how the agent representing the human user reacts given his/her goals. In this paradigm, each autonomous agent in a system pursues individual goals, and the behavior of the system emerges from the interactions, foreseen or unforeseen, between the agents/actors. We show that this method reflects the integration of new technologies in a historical case, and apply the same methodology for a possible future technology.

  1. ActivitySim: large-scale agent based activity generation for infrastructure simulation

    SciTech Connect

    Gali, Emmanuel; Eidenbenz, Stephan; Mniszewski, Sue; Cuellar, Leticia; Teuscher, Christof

    2008-01-01

    The United States' Department of Homeland Security aims to model, simulate, and analyze critical infrastructure and their interdependencies across multiple sectors such as electric power, telecommunications, water distribution, transportation, etc. We introduce ActivitySim, an activity simulator for a population of millions of individual agents, each characterized by a set of demographic attributes that is based on US census data. ActivitySim generates a daily schedule for each agent that consists of a sequence of activities, such as sleeping, shopping, working, etc., each being scheduled at a geographic location, such as a business or private residence, that is appropriate for the activity type and for the personal situation of the agent. ActivitySim has been developed as part of a larger effort to understand the interdependencies among national infrastructure networks and their demand profiles that emerge from the different activities of individuals in baseline scenarios as well as emergency scenarios, such as hurricane evacuations. We present the scalable software engineering principles underlying ActivitySim, the socio-technical modeling paradigms that drive the activity generation, and proof-of-principle results for a scenario of 2.6 M agents in the Twin Cities, MN area.

  2. Exploring Tradeoffs in Demand-side and Supply-side Management of Urban Water Resources using Agent-based Modeling and Evolutionary Computation

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Berglund, E. Z.

    2015-12-01

    Urban water supply systems may be managed through supply-side and demand-side strategies, which focus on water source expansion and demand reductions, respectively. Supply-side strategies bear infrastructure and energy costs, while demand-side strategies bear costs of implementation and inconvenience to consumers. To evaluate the performance of demand-side strategies, the participation and water use adaptations of consumers should be simulated. In this study, a Complex Adaptive Systems (CAS) framework is developed to simulate consumer agents that change their consumption to affect the withdrawal from the water supply system, which, in turn, influences operational policies and long-term resource planning. Agent-based models are encoded to represent consumers and a policy maker agent and are coupled with water resources system simulation models. The CAS framework is coupled with an evolutionary computation-based multi-objective methodology to explore tradeoffs in cost, inconvenience to consumers, and environmental impacts for both supply-side and demand-side strategies. Decisions are identified to specify storage levels in a reservoir that trigger (1) increases in the volume of water pumped through inter-basin transfers from an external reservoir and (2) drought stages, which restrict the volume of water that is allowed for residential outdoor uses. The proposed methodology is demonstrated for the Arlington, Texas, water supply system to identify non-dominated strategies for an historic drought decade. Results demonstrate that pumping costs associated with maximizing environmental reliability exceed pumping costs associated with minimizing restrictions on consumer water use.
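
    The decision variables described above, storage levels that trigger extra inter-basin transfers and staged outdoor-use restrictions, amount to a simple operating rule. The sketch below shows such a rule with invented trigger values; in the study, these trigger levels are what the evolutionary search optimises rather than fixed constants.

        # Illustrative trigger levels (fractions of reservoir capacity); the real
        # values are the decision variables tuned by the multi-objective search.
        TRANSFER_TRIGGER = 0.70        # below this, increase inter-basin pumping
        DROUGHT_STAGES = [             # (storage trigger, outdoor-use reduction)
            (0.60, 0.10),
            (0.45, 0.30),
            (0.30, 0.60),
        ]

        def operating_policy(storage_fraction):
            pump_extra = storage_fraction < TRANSFER_TRIGGER
            reduction = 0.0
            for trigger, cut in DROUGHT_STAGES:
                if storage_fraction < trigger:
                    reduction = cut            # stages ordered from mild to severe
            return pump_extra, reduction

        for storage in (0.80, 0.65, 0.50, 0.25):
            pump, cut = operating_policy(storage)
            print(f"storage {storage:.0%}: extra transfer pumping={pump}, "
                  f"outdoor-use restriction={cut:.0%}")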

  3. Fast revelation of the motif mode for a yeast protein interaction network through intelligent agent-based distributed computing.

    PubMed

    Lee, Wei-Po; Tzou, Wen-Shyong

    2010-09-01

    In the yeast protein-protein interaction network, motif mode, a collection of motifs of special combinations of protein nodes annotated by the molecular function terms of the Gene Ontology, has revealed differences in the conservation constraints within the same topology. In this study, by employing an intelligent agent-based distributed computing method, we are able to discover motif modes in a fast and adaptive manner. Moreover, by focusing on the highly evolutionarily conserved motif modes belonging to the same biological function, we find a large downshift in the distance between nodes belonging to the same motif mode compared with the whole, suggesting that nodes with the same motif mode tend to congregate in a network. Several motif modes with a high conservation of the motif constituents were revealed, but from a new perspective, including that with a three-node motif mode engaged in the protein fate and that with three four-node motif modes involved in the genome maintenance, cellular organization, and transcription. The network motif modes discovered from this method can be linked to the wealth of biological data which require further elucidation with regard to biological functions.

  4. An Agent-Based Simulation for Investigating the Impact of Stereotypes on Task-Oriented Group Formation

    NASA Astrophysics Data System (ADS)

    Maghami, Mahsa; Sukthankar, Gita

    In this paper, we introduce an agent-based simulation for investigating the impact of social factors on the formation and evolution of task-oriented groups. Task-oriented groups are created explicitly to perform a task, and all members derive benefits from task completion. However, even in cases when all group members act in a way that is locally optimal for task completion, social forces that have mild effects on choice of associates can have a measurable impact on task completion performance. In this paper, we show how our simulation can be used to model the impact of stereotypes on group formation. In our simulation, stereotypes are based on observable features, learned from prior experience, and only affect an agent's link formation preferences. Even without assuming stereotypes affect the agents' willingness or ability to complete tasks, the long-term modifications that stereotypes have on the agents' social network impair the agents' ability to form groups with sufficient diversity of skills, as compared to agents who form links randomly. An interesting finding is that this effect holds even in cases where stereotype preference and skill existence are completely uncorrelated.
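
    A hedged Python sketch of the mechanism described above (not the authors' simulation): stereotypes are keyed on an observable, task-irrelevant feature, learned from prior task outcomes, and bias only the choice of partners; skills, which actually determine task success, are uncorrelated with the feature. All features, payoffs, and learning rates are illustrative assumptions.

    import random
    from collections import defaultdict

    FEATURES = ["red", "green", "blue"]       # observable, task-irrelevant features
    SKILLS = ["a", "b", "c"]                  # task-relevant skills, uncorrelated with features

    class Agent:
        def __init__(self, rng):
            self.rng = rng
            self.feature = rng.choice(FEATURES)
            self.skill = rng.choice(SKILLS)
            self.stereotype = defaultdict(lambda: 0.5)   # feature -> expected payoff

        def pick_partner(self, others):
            # prefer partners whose observable feature has the highest learned score
            return max(others, key=lambda o: self.stereotype[o.feature] + self.rng.uniform(0, 0.05))

        def learn(self, partner, success):
            old = self.stereotype[partner.feature]
            self.stereotype[partner.feature] = 0.9 * old + 0.1 * (1.0 if success else 0.0)

    def run(steps=3000, n_agents=30, seed=0):
        rng = random.Random(seed)
        agents = [Agent(rng) for _ in range(n_agents)]
        for _ in range(steps):
            a = rng.choice(agents)
            partner = a.pick_partner([o for o in agents if o is not a])
            success = a.skill != partner.skill     # the pair needs two distinct skills
            a.learn(partner, success)
        # crude diagnostic: which feature each agent has come to prefer
        preferred = [max(a.stereotype, key=a.stereotype.get) for a in agents if a.stereotype]
        return {f: preferred.count(f) for f in FEATURES}

    if __name__ == "__main__":
        print(run())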

  5. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach

    PubMed Central

    2016-01-01

    Background Computer networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices, ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling such problems. Purpose It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of Agent-based Modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. Method We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. Results The conducted experiments demonstrated two important results: first, a CABC-based modeling approach such as Agent-based Modeling can be an effective approach to modeling complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach. PMID:26812235

  6. An Agent-Based Model of a Hepatic Inflammatory Response to Salmonella: A Computational Study under a Large Set of Experimental Data

    PubMed Central

    Chapes, Stephen K.; Ben-Arieh, David; Wu, Chih-Hang

    2016-01-01

    We present an agent-based model (ABM) to simulate a hepatic inflammatory response (HIR) in a mouse infected by Salmonella that sometimes progressed to problematic proportions, known as “sepsis”. Based on over 200 published studies, this ABM describes interactions among 21 cells or cytokines and incorporates 226 experimental data sets and/or data estimates from those reports to simulate a mouse HIR in silico. Our simulated results reproduced dynamic patterns of HIR reported in the literature. As shown in vivo, our model also demonstrated that sepsis was highly related to the initial Salmonella dose and the presence of components of the adaptive immune system. We determined that high mobility group box-1, C-reactive protein, the interleukin-10:tumor necrosis factor-α ratio, and the CD4+ T cell:CD8+ T cell ratio, all recognized as biomarkers during HIR, significantly correlated with outcomes of HIR. During therapy-directed in silico simulations, our results demonstrated that anti-agent intervention impacted the survival rates of septic individuals in a time-dependent manner. By specifying the infected species, source of infection, and site of infection, this ABM enabled us to reproduce the kinetics of several essential indicators during a HIR, observe distinct dynamic patterns that are manifested during HIR, and test proposed therapy-directed treatments. Although limitations still exist, this ABM is a step forward because it links underlying biological processes to computational simulation and was validated through a series of comparisons between the simulated results and experimental studies. PMID:27556404

  7. An agent-based simulation model of patient choice of health care providers in accountable care organizations.

    PubMed

    Alibrahim, Abdullah; Wu, Shinyi

    2016-10-04

    Accountable care organizations (ACO) in the United States show promise in controlling health care costs while preserving patients' choice of providers. Understanding the effects of patient choice is critical in novel payment and delivery models like ACO that depend on continuity of care and accountability. The financial, utilization, and behavioral implications associated with a patient's decision to forego local health care providers for more distant ones to access higher quality care remain unknown. To study this question, we used an agent-based simulation model of a health care market composed of providers able to form ACOs serving patients, and combined it with a conditional logit decision model to examine patients capable of choosing their care providers. This simulation focuses on Medicare beneficiaries and their congestive heart failure (CHF) outcomes. We place the patient agents in an ACO delivery system model in which provider agents decide whether they remain in an ACO and perform a quality-improving CHF disease management intervention. Illustrative results show that allowing patients to choose their providers reduces the yearly payment per CHF patient by $320, reduces mortality rates by 0.12 percentage points and hospitalization rates by 0.44 percentage points, and marginally increases provider participation in ACO. This study demonstrates a model capable of quantifying the effects of patient choice in a theoretical ACO system and provides a potential tool for policymakers to understand implications of patient choice and assess potential policy controls.
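
    A minimal sketch of a conditional logit choice rule of the kind the abstract refers to (not the study's estimated model): each patient agent chooses a provider with probability proportional to exp(utility), where utility trades off travel distance against a quality score. The provider list and coefficients are hypothetical.

    import math
    import random

    def choice_probabilities(providers, beta_distance=-0.08, beta_quality=1.5):
        utilities = [beta_distance * p["distance_km"] + beta_quality * p["quality"]
                     for p in providers]
        peak = max(utilities)                              # stabilise the softmax
        weights = [math.exp(u - peak) for u in utilities]
        total = sum(weights)
        return [w / total for w in weights]

    def choose(providers, rng):
        draw, cumulative = rng.random(), 0.0
        for provider, prob in zip(providers, choice_probabilities(providers)):
            cumulative += prob
            if draw <= cumulative:
                return provider["name"]
        return providers[-1]["name"]

    if __name__ == "__main__":
        providers = [
            {"name": "local clinic", "distance_km": 3, "quality": 0.55},
            {"name": "regional ACO", "distance_km": 25, "quality": 0.80},
            {"name": "academic center", "distance_km": 60, "quality": 0.90},
        ]
        print([round(p, 3) for p in choice_probabilities(providers)])
        print(choose(providers, random.Random(0)))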

  8. Agent-based modeling to simulate contamination events and evaluate threat management strategies in water distribution systems.

    PubMed

    Zechman, Emily M

    2011-05-01

    In the event of contamination of a water distribution system, decisions must be made to mitigate the impact of the contamination and to protect public health. Making threat management decisions while a contaminant spreads through the network is a dynamic and interactive process. Response actions taken by the utility managers and water consumption choices made by the consumers will affect the hydraulics, and thus the spread of the contaminant plume, in the network. A modeling framework that allows the simulation of a contamination event under the effects of actions taken by utility managers and consumers will be a useful tool for the analysis of alternative threat mitigation and management strategies. This article presents a multiagent modeling framework that combines agent-based, mechanistic, and dynamic methods. Agents select actions based on a set of rules that represent an individual's autonomy, goal-based desires, and reaction to the environment and the actions of other agents. Consumer behaviors, including ingestion, mobility, reduction of water demands, and word-of-mouth communication, are simulated. Management strategies, including opening hydrants to flush the contaminant and broadcasting warnings, are evaluated. As actions taken by consumer agents and utility operators affect demands and flows in the system, the mechanistic model is updated. Management strategies are evaluated based on the exposure of the population to the contaminant. The framework is designed to consider the typical issues involved in water distribution threat management and provides valuable analysis of threat containment strategies for water distribution system contamination events.
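
    A hedged sketch of the consumer-agent rules described above (word-of-mouth communication, demand reduction after a warning), decoupled from any hydraulic solver: the network is reduced to a single aggregate demand figure that a full framework would pass to a mechanistic model such as EPANET at each time step. All probabilities and demand values are illustrative assumptions.

    import random

    class Consumer:
        def __init__(self, rng):
            self.rng = rng
            self.warned = False
            self.base_demand = rng.uniform(0.5, 1.5)   # hypothetical demand units
            self.contacts = []                         # social contacts, filled in after creation

        def step(self, broadcast_on):
            if broadcast_on and self.rng.random() < 0.3:    # hears the utility broadcast
                self.warned = True
            if self.warned:
                for other in self.contacts:                 # word-of-mouth communication
                    if self.rng.random() < 0.2:
                        other.warned = True
            return self.base_demand * (0.2 if self.warned else 1.0)   # demand reduction

    def run(n_consumers=500, hours=48, broadcast_hour=6, seed=0):
        rng = random.Random(seed)
        consumers = [Consumer(rng) for _ in range(n_consumers)]
        for c in consumers:
            c.contacts = rng.sample([o for o in consumers if o is not c], 5)
        demands = []
        for t in range(hours):
            total = sum(c.step(broadcast_on=t >= broadcast_hour) for c in consumers)
            demands.append(round(total, 1))     # would be fed to the hydraulic model here
        return demands[0], demands[-1], sum(c.warned for c in consumers)

    if __name__ == "__main__":
        print(run())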

  9. Agent-based Modeling to Simulate the Diffusion of Water-Efficient Innovations and the Emergence of Urban Water Sustainability

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Giacomoni, M.; Shafiee, M. E.; Berglund, E.

    2014-12-01

    The sustainability of water resources is threatened by urbanization, as increasing demands deplete water availability, and changes to the landscape alter runoff and the flow regime of receiving water bodies. Utility managers typically manage urban water resources through the use of centralized solutions, such as large reservoirs, which may be limited in their ability to balance the needs of urbanization and ecological systems. Decentralized technologies, on the other hand, may improve the health of the water resources system and deliver urban water services. For example, low impact development technologies, such as rainwater harvesting, and water-efficient technologies, such as low-flow faucets and toilets, may be adopted by households to retain rainwater and reduce demands, offsetting the need for new centralized infrastructure. Decentralized technologies may create new complexities in infrastructure and water management, as decentralization depends on community behavior and participation beyond traditional water resources planning. Messages about water shortages and water quality from peers and the water utility managers can influence the adoption of new technologies. As a result, feedbacks between consumers and water resources emerge, creating a complex system. This research develops a framework to simulate the diffusion of water-efficient innovations and the sustainability of urban water resources, by coupling models of households in a community, hydrologic models of a water resources system, and a cellular automata model of land use change. Agent-based models are developed to simulate the land use and water demand decisions of individual households, and behavioral rules are encoded to simulate communication with other agents and adoption of decentralized technologies, using a model of the diffusion of innovation. The framework is applied to an illustrative case study to simulate water resources sustainability over a long-term planning horizon.

  10. Towards a conceptual multi-agent-based framework to simulate the spatial group decision-making process

    NASA Astrophysics Data System (ADS)

    Ghavami, Seyed Morsal; Taleai, Mohammad

    2017-04-01

    Most spatial problems are multi-actor, multi-issue and multi-phase in nature. In addition to their intrinsic complexity, spatial problems usually involve groups of actors from different organizational and cognitive backgrounds, all of whom participate in a social structure to resolve or reduce the complexity of a given problem. Hence, it is important to study and evaluate what aspects influence the spatial problem resolution process. Recently, multi-agent systems consisting of groups of separate agent entities all interacting with each other have been put forward as appropriate tools to use to study and resolve such problems. In this study, in order to generate a better understanding of the spatial problem group decision-making process, a conceptual multi-agent-based framework is used that represents and specifies all the necessary concepts and entities needed to aid group decision making, based on a simulation of the group decision-making process as well as the relationships that exist among the different concepts involved. The study uses five main influencing entities as concepts in the simulation process: spatial influence, individual-level influence, group-level influence, negotiation influence and group performance measures. Further, it explains the relationship among different concepts in a descriptive rather than explanatory manner. To illustrate the proposed framework, the approval process for an urban land use master plan in Zanjan, a provincial capital in Iran, is simulated using MAS, the results highlighting the effectiveness of applying an MAS-based framework when studying the group decision-making process used to resolve spatial problems.

  11. Towards a conceptual multi-agent-based framework to simulate the spatial group decision-making process

    NASA Astrophysics Data System (ADS)

    Ghavami, Seyed Morsal; Taleai, Mohammad

    2016-11-01

    Most spatial problems are multi-actor, multi-issue and multi-phase in nature. In addition to their intrinsic complexity, spatial problems usually involve groups of actors from different organizational and cognitive backgrounds, all of whom participate in a social structure to resolve or reduce the complexity of a given problem. Hence, it is important to study and evaluate what aspects influence the spatial problem resolution process. Recently, multi-agent systems consisting of groups of separate agent entities all interacting with each other have been put forward as appropriate tools to use to study and resolve such problems. In this study, in order to generate a better understanding of the spatial problem group decision-making process, a conceptual multi-agent-based framework is used that represents and specifies all the necessary concepts and entities needed to aid group decision making, based on a simulation of the group decision-making process as well as the relationships that exist among the different concepts involved. The study uses five main influencing entities as concepts in the simulation process: spatial influence, individual-level influence, group-level influence, negotiation influence and group performance measures. Further, it explains the relationship among different concepts in a descriptive rather than explanatory manner. To illustrate the proposed framework, the approval process for an urban land use master plan in Zanjan, a provincial capital in Iran, is simulated using MAS, the results highlighting the effectiveness of applying an MAS-based framework when studying the group decision-making process used to resolve spatial problems.

  12. An agent-based model to simulate tsetse fly distribution and control techniques: a case study in Nguruman, Kenya

    PubMed Central

    Lin, Shengpan; DeVisser, Mark H.; Messina, Joseph P.

    2015-01-01

    Background African trypanosomiasis, also known as “sleeping sickness” in humans and “nagana” in livestock, is an important vector-borne disease in Sub-Saharan Africa. Control of trypanosomiasis has focused on eliminating the vector, the tsetse fly (Glossina spp.). Effective tsetse fly control planning requires models to predict tsetse population and distribution changes over time and space. Traditional planning models have used statistical tools to predict tsetse distributions and have been hindered by limited field survey data. Methodology/Results We developed an Agent-Based Model (ABM) to provide timing and location information for tsetse fly control without presence/absence training data. The model is driven by daily remotely sensed environmental data. The model provides a flexible tool linking environmental changes with individual biology to analyze tsetse control methods such as aerial insecticide spraying, wild animal control, releasing irradiated sterile tsetse males, and land use and cover modification. Significance This is a bottom-up, process-based model with freely available data as inputs that can be easily transferred to a new area. The tsetse population simulation more closely approximates real conditions than those of traditional statistical models, making it a useful tool in tsetse fly control planning. PMID:26309347

  13. Multiobjective Decision Making Policies and Coordination Mechanisms in Hierarchical Organizations: Results of an Agent-Based Simulation

    PubMed Central

    2014-01-01

    This paper analyses how different coordination modes and different multiobjective decision making approaches interfere with each other in hierarchical organizations. The investigation is based on an agent-based simulation. We apply a modified NK-model in which we map multiobjective decision making as an adaptive walk on multiple performance landscapes, whereby each landscape represents one objective. We find that the impact of the coordination mode on the performance and the speed of performance improvement is critically affected by the selected multiobjective decision making approach. In certain setups, the performances achieved with the more complex multiobjective decision making approaches turn out to be less sensitive to the coordination mode than the performances achieved with the less complex multiobjective decision making approaches. Furthermore, we present results on the impact of the nature of interactions among decisions on the achieved performance in multiobjective setups. Our results give guidance on how to control the performance contribution of objectives to overall performance and answer the question of how effectively certain multiobjective decision making approaches perform under certain circumstances (coordination mode and interdependencies among decisions). PMID:25152926
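
    A hedged Python sketch of the kind of setup the abstract describes: decision makers perform an adaptive walk on several NK performance landscapes (one per objective) and here accept a one-bit change whenever it improves a simple weighted sum of the objectives, one of several possible multiobjective decision rules. N, K, the weights, and the step count are illustrative, not the paper's configuration.

    import itertools
    import random

    def make_nk_landscape(N, K, rng):
        # the fitness contribution of bit i depends on bit i and its K successors (circular)
        tables = [{bits: rng.random() for bits in itertools.product((0, 1), repeat=K + 1)}
                  for _ in range(N)]

        def fitness(x):
            return sum(tables[i][tuple(x[(i + j) % N] for j in range(K + 1))]
                       for i in range(N)) / N
        return fitness

    def adaptive_walk(landscapes, weights, N, steps=200, rng=None):
        rng = rng or random.Random(0)
        x = [rng.randint(0, 1) for _ in range(N)]
        score = lambda v: sum(w * f(v) for w, f in zip(weights, landscapes))
        for _ in range(steps):
            y = x[:]
            y[rng.randrange(N)] ^= 1          # flip one decision
            if score(y) > score(x):           # hill-climb on the weighted sum
                x = y
        return x, [round(f(x), 3) for f in landscapes]

    if __name__ == "__main__":
        rng = random.Random(42)
        N, K = 10, 2
        landscapes = [make_nk_landscape(N, K, rng) for _ in range(2)]   # two objectives
        print(adaptive_walk(landscapes, weights=[0.5, 0.5], N=N, rng=rng))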

  14. Modeling the 2014 Ebola Virus Epidemic - Agent-Based Simulations, Temporal Analysis and Future Predictions for Liberia and Sierra Leone.

    PubMed

    Siettos, Constantinos; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2015-03-09

    We developed an agent-based model to investigate the epidemic dynamics of Ebola virus disease (EVD) in Liberia and Sierra Leone from May 27 to December 21, 2014. The dynamics of the agent-based simulator evolve on small-world transmission networks of sizes equal to the population of each country, with adjustable densities to account for the effects of public health intervention policies and individual behavioral responses to the evolving epidemic. Based on time series of the official case counts from the World Health Organization (WHO), we provide estimates for key epidemiological variables by employing the so-called Equation-Free approach. The underlying transmission networks were characterized by rather random structures in the two countries with densities decreasing by ~19% from the early (May 27-early August) to the last period (mid October-December 21). Our estimates for the values of key epidemiological variables, such as the mean time to death, recovery and the case fatality rate, are very close to the ones reported by the WHO Ebola response team during the early period of the epidemic (until September 14) that were calculated based on clinical data. Specifically, regarding the effective reproductive number Re, our analysis suggests that until mid October, Re was above 2.3 in both countries; from mid October to December 21, Re dropped well below unity in Liberia, indicating a saturation of the epidemic, while in Sierra Leone it was around 1.9, indicating an ongoing epidemic. Accordingly, a ten-week projection from December 21 estimated that the epidemic will fade out in Liberia in early March; in contrast, our results flashed a note of caution for Sierra Leone since the cumulative number of cases could reach as high as 18,000, and the number of deaths might exceed 5,000, by early March 2015. However, by processing the reported data of the very last period (December 21, 2014-January 18, 2015), we obtained more optimistic estimates indicative of a remission of
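
    A simplified, illustrative Python sketch in the spirit of the model described above (not the authors' code): a small-world contact network built as a rewired ring lattice, a stochastic SIR-style update on it, and a crude effective reproductive number computed as the mean number of offspring infections per removed case. All parameter values are hypothetical.

    import random

    def small_world(n, k, p, rng):
        """Ring lattice with k neighbours on each side; each edge rewired with probability p."""
        adjacency = {i: set() for i in range(n)}
        for i in range(n):
            for j in range(1, k + 1):
                a, b = i, (i + j) % n
                if rng.random() < p:
                    b = rng.randrange(n)       # rewire to a random node
                if a != b:
                    adjacency[a].add(b)
                    adjacency[b].add(a)
        return adjacency

    def run(n=2000, k=3, p=0.1, beta=0.06, gamma=0.10, days=120, seed=1):
        rng = random.Random(seed)
        adjacency = small_world(n, k, p, rng)
        state = ["S"] * n                      # Susceptible / Infectious / Removed
        offspring = [0] * n
        state[rng.randrange(n)] = "I"          # index case
        for _ in range(days):
            infectious = [i for i in range(n) if state[i] == "I"]
            if not infectious:
                break
            for i in infectious:
                for j in adjacency[i]:
                    if state[j] == "S" and rng.random() < beta:
                        state[j] = "I"
                        offspring[i] += 1
                if rng.random() < gamma:
                    state[i] = "R"
        removed = [i for i in range(n) if state[i] == "R"]
        re_hat = sum(offspring[i] for i in removed) / max(len(removed), 1)
        return state.count("R"), round(re_hat, 2)

    if __name__ == "__main__":
        print(run())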

  15. Understanding coupled natural and human systems on fire prone landscapes: integrating wildfire simulation into an agent based planning system.

    NASA Astrophysics Data System (ADS)

    Barros, Ana; Ager, Alan; Preisler, Haiganoush; Day, Michelle; Spies, Tom; Bolte, John

    2015-04-01

    Agent-based models (ABM) allow users to examine the long-term effects of agent decisions in complex systems where multiple agents and processes interact. This framework has potential application to the study of coupled natural and human systems where multiple stimuli determine trajectories over both space and time. We used Envision, a landscape-based ABM, to analyze long-term wildfire dynamics in a heterogeneous, multi-owner landscape in Oregon, USA. Landscape dynamics are affected by land management policies, actors' decisions, and autonomous processes such as vegetation succession, wildfire, or, at a broader scale, climate change. Key questions include: 1) How are landscape dynamics influenced by policies and institutions, and 2) How do land management policies and actor decisions interact to produce intended and unintended consequences with respect to wildfire on fire-prone landscapes? Applying Envision to address these questions required the development of a wildfire module that could accurately simulate wildfires on the heterogeneous landscapes within the study area, in terms of replicating the historical fire size distribution, spatial distribution and fire intensity. In this paper we describe the development and testing of a mechanistic fire simulation system within Envision and the application of the model on a 3.2 million fire-prone landscape in central Oregon, USA. The core fire spread equations use the Minimum Travel Time algorithm developed by M. Finney. The model operates on a daily time step and uses a fire prediction system based on the relationship between energy release component and historical fires. Specifically, daily wildfire probabilities and sizes are generated from statistical analyses of historical fires in relation to daily ERC values. The MTT was coupled with the vegetation dynamics module in Envision to allow communication between the respective subsystems and effectively model fire effects and vegetation dynamics after a wildfire. Canopy and

  16. Development Approaches Coupled with Verification and Validation Methodologies for Agent-Based Mission-Level Analytical Combat Simulations

    DTIC Science & Technology

    2004-03-01


  17. Modelling Temporal Schedule of Urban Trains Using Agent-Based Simulation and NSGA2-BASED Multiobjective Optimization Approaches

    NASA Astrophysics Data System (ADS)

    Sahelgozin, M.; Alimohammadi, A.

    2015-12-01

    Increasing distances between locations of residence and services lead to a large number of daily commutes in urban areas. Developing subway systems has been considered by transportation managers as a response to this huge travel demand. In the development of subway infrastructure, producing a temporal schedule for trains is an important task, because an appropriately designed timetable decreases total passenger travel time, total operation costs, and the energy consumption of trains. Since these variables are not positively correlated, subway scheduling is considered a multi-criteria optimization problem. Therefore, proposing a proper solution for subway scheduling has always been a controversial issue. On the other hand, research on a phenomenon requires a summarized representation of the real world, known as a model. In this study, we attempt to model the temporal schedule of urban trains for application to Multi-Criteria Subway Schedule Optimization (MCSSO) problems. At first, a conceptual framework is presented for MCSSO. Then, an agent-based simulation environment is implemented to perform Sensitivity Analysis (SA), which is used to extract the interrelations between the framework components. These interrelations are then taken into account in order to construct the proposed model. In order to evaluate the performance of the model in MCSSO problems, Tehran subway line no. 1 is considered as the case study. Results show that the model was able to generate an acceptable distribution of Pareto-optimal solutions that are applicable to real situations when solving an MCSSO problem is the goal. Also, the accuracy of the model in representing the operation of subway systems was significant.
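
    An illustrative Python sketch of the multi-criteria trade-off described above (not the authors' model, which uses an agent-based simulator and NSGA-II search): each candidate train headway is scored on passenger waiting time, operation cost, and energy use with simple hypothetical stand-in cost functions, and the non-dominated (Pareto) set is extracted by brute force.

    def objectives(headway_min, demand_per_min=50.0):
        waiting = demand_per_min * headway_min * (headway_min / 2.0)   # person-minutes per headway cycle
        trains_per_hour = 60.0 / headway_min
        operation_cost = 1000.0 * trains_per_hour                      # hypothetical cost units
        energy = 800.0 * trains_per_hour                               # hypothetical kWh
        return (waiting, operation_cost, energy)

    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(headways):
        scored = [(h, objectives(h)) for h in headways]
        return [(h, s) for h, s in scored
                if not any(dominates(other, s) for _, other in scored if other != s)]

    if __name__ == "__main__":
        for headway, (wait, cost, energy) in pareto_front([2, 3, 4, 5, 6, 8, 10]):
            print(f"headway {headway:>2} min  waiting {wait:7.0f}  cost {cost:7.0f}  energy {energy:7.0f}")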

  18. Simulating the time series of a selected gene expression profile in an agent-based tumor model

    NASA Astrophysics Data System (ADS)

    Mansury, Yuri; Deisboeck, Thomas S.

    2004-09-01

    To elucidate the role of environmental conditions in molecular-level dynamics and to study their impact on macroscopic brain tumor growth patterns, the expression of the genes Tenascin C and PCNA in a 2D agent-based model for the migratory trait is calibrated using experimental data from the literature, while the expression of these genes for the proliferative trait is obtained as the model output. Numerical results confirm that the gene expression of Tenascin C is indeed consistently higher in the migratory glioma cell phenotype and show that the expression of PCNA is consistently higher among proliferating tumor cells. Intriguingly, the time series of the tumor cells’ gene expression exhibit a sudden change in behavior during the invasion of the tumor into a nutrient-abundant region, showing a robust positive correlation between the expression of Tenascin C and the tumor’s diameter, yet a strong negative correlation between the expression of PCNA and the diameter. These molecular-level dynamics correspond to the emergence of a structural asymmetry in the form of a bulging tumor rim in the nutrient-abundant region. The simulated time series thus supports the critical role of the migratory cell phenotype during both the tumor system’s overall macroscopic expansion and the evolvement of regional growth patterns, particularly in the later stages. Furthermore, detrended fluctuation analysis (DFA) suggests that for prediction purposes, the simulated gene expression profiles of Tenascin C and PCNA that were determined separately for the migrating and proliferating phenotypes exhibit lesser predictability than those of the phenotypic mixture combining all viable tumor cells typically found in clinical biopsies. Finally, partitioning the tumor into distinct geographic regions of interest (ROI) reveals that the gene expression profile of tumor cells in the quadrant close to the nutrient-abundant region is representative for the entire tumor whereas the expression

  19. Computationally efficient multibody simulations

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Jayant; Kumar, Manoj

    1994-01-01

    Computationally efficient approaches to the solution of the dynamics of multibody systems are presented in this work. The computational efficiency is derived from both the algorithmic and implementational standpoint. Order(n) approaches provide a new formulation of the equations of motion eliminating the assembly and numerical inversion of a system mass matrix as required by conventional algorithms. Computational efficiency is also gained in the implementation phase by the symbolic processing and parallel implementation of these equations. Comparison of this algorithm with existing multibody simulation programs illustrates the increased computational efficiency.

  20. Computer Modeling and Simulation

    SciTech Connect

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inference) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  1. Accelerator simulation using computers

    SciTech Connect

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a "multi-track" simulation and analysis code can be used for these applications.

  2. Accelerator simulation using computers

    SciTech Connect

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a "multi-track" simulation and analysis code can be used for these applications.

  3. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    NASA Astrophysics Data System (ADS)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics we have developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network as well as the per-contact transmission probabilities by age group we exploited the so-called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data of key epidemiological variables, such as the mean time to death, recovery and the case fatality rate.

  4. Computer-simulated phacoemulsification

    NASA Astrophysics Data System (ADS)

    Laurell, Carl-Gustaf; Nordh, Leif; Skarman, Eva; Andersson, Mats; Nordqvist, Per

    2001-06-01

    Phacoemulsification makes the cataract operation easier for the patient but involves a demanding technique for the surgeon. It is therefore important to increase the quality of surgical training in order to shorten the learning period for the beginner, which should diminish the risks to the patient. We are developing a computer-based simulator for training in phacoemulsification. The simulator is built on a platform that can be used as a basis for several different training simulators. A prototype has been made and partly tested by experienced surgeons.

  5. Agent-based modelling in synthetic biology

    PubMed Central

    2016-01-01

    Biological systems exhibit complex behaviours that emerge at many different levels of organization. These span the regulation of gene expression within single cells to the use of quorum sensing to co-ordinate the action of entire bacterial colonies. Synthetic biology aims to make the engineering of biology easier, offering an opportunity to control natural systems and develop new synthetic systems with useful prescribed behaviours. However, in many cases, it is not understood how individual cells should be programmed to ensure the emergence of a required collective behaviour. Agent-based modelling aims to tackle this problem, offering a framework in which to simulate such systems and explore cellular design rules. In this article, I review the use of agent-based models in synthetic biology, outline the available computational tools, and provide details on recently engineered biological systems that are amenable to this approach. I further highlight the challenges facing this methodology and some of the potential future directions. PMID:27903820

  6. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long term durability and reliability. There are several types of fatigue that must be considered in the design, including low cycle, high cycle, and combined fatigue for different cyclic loading conditions, for example, mechanical, thermal, erosion, etc. The traditional approach to evaluating fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design. However, it is time consuming, costly and in general needs to be repeated for designs in different operating conditions. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. The main features of this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring and progressive structural fracture, encompassed with probabilistic simulation. The generic features of this approach are to probabilistically telescope scale local material point damage all the way up to the structural component and to probabilistically scale decompose structural loads and boundary conditions all the way down to the material point. Additional features include a multifactor interaction model that probabilistically describes material properties evolution and any changes due to various cyclic loads and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, the advantages, versatility and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the

  7. Agent Based Modeling Applications for Geosciences

    NASA Astrophysics Data System (ADS)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently it has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Some of the challenges associated with this modeling method include significant computational requirements in order to keep track of thousands to millions of agents, and a lack of methods and strategies for model validation, as well as of a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications. A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in

  8. Rural-urban migration including formal and informal workers in the urban sector: an agent-based numerical simulation study

    NASA Astrophysics Data System (ADS)

    Branco, Nilton; Oliveira, Tharnier; Silveira, Jaylson

    2012-02-01

    The goal of this work is to study rural-urban migration in the early stages of industrialization. We use an agent-based model and take into account the existence of informal and formal workers in the urban sector and possible migration movements, dependent on the agents' social and private utilities. Our agents are placed on the vertices of a square lattice, such that each vertex holds only one agent. Rural, urban informal and urban formal workers are represented by different states of a three-state Ising model. At every step, a fraction a of the agents may change sectors or migrate. The total utility of a given agent is then calculated and compared to a random utility, in order to check whether this agent turns into an actual migrant or changes sector. The dynamics is carried out until an equilibrium state is reached, and equilibrium variables are then calculated and compared to available data. We find that a generalized Harris-Todaro condition is satisfied [1] in these equilibrium regimes, i.e., the ratio of expected wages between any pair of sectors reaches a constant value. [1] J. J. Silveira, A. L. Espíndola and T. J. Penna, Physica A, 364, 445 (2006).
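
    A minimal, illustrative Python sketch of the lattice dynamics described above (not the authors' code): each site of a periodic square lattice holds one worker in one of three states (rural, urban informal, urban formal); at every step a fraction of agents evaluate a utility combining a private wage term and a social (neighbourhood) term and switch state if it exceeds a random reservation utility. Wages, weights, and thresholds are hypothetical.

    import random

    RURAL, INFORMAL, FORMAL = 0, 1, 2
    WAGES = {RURAL: 1.0, INFORMAL: 0.8, FORMAL: 1.6}      # hypothetical sector wages

    def neighbours(i, j, L):
        return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

    def step(grid, rng, frac=0.1, social_weight=0.5):
        L = len(grid)
        for _ in range(int(frac * L * L)):
            i, j = rng.randrange(L), rng.randrange(L)
            current = grid[i][j]
            candidate = rng.choice([s for s in (RURAL, INFORMAL, FORMAL) if s != current])
            same = sum(grid[a][b] == candidate for a, b in neighbours(i, j, L))
            utility = WAGES[candidate] + social_weight * same / 4.0      # private + social utility
            if utility > rng.uniform(0.0, 2.5):                          # random reservation utility
                grid[i][j] = candidate

    def shares(grid):
        flat = [s for row in grid for s in row]
        return [round(flat.count(s) / len(flat), 3) for s in (RURAL, INFORMAL, FORMAL)]

    if __name__ == "__main__":
        rng = random.Random(0)
        L = 50
        grid = [[RURAL] * L for _ in range(L)]        # everyone starts in the rural sector
        for _ in range(500):
            step(grid, rng)
        print("rural / informal / formal shares:", shares(grid))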

  9. Computational Scientific Inquiry with Virtual Worlds and Agent-Based Models: New Ways of Doing Science to Learn Science

    ERIC Educational Resources Information Center

    Jacobson, Michael J.; Taylor, Charlotte E.; Richards, Deborah

    2016-01-01

    In this paper, we propose computational scientific inquiry (CSI) as an innovative model for learning important scientific knowledge and new practices for "doing" science. This approach involves the use of a "game-like" virtual world for students to experience virtual biological fieldwork in conjunction with using an agent-based…

  10. Autonomous Agent-Based Simulation of an AEGIS Cruiser Combat Information Center Performing Battle Group Air Defense Commander Operations

    DTIC Science & Technology

    2003-03-01


  11. Adding ecosystem function to agent-based land use models

    PubMed Central

    Yadav, V.; Del Grosso, S.J.; Parton, W.J.; Malanson, G.P.

    2015-01-01

    The objective of this paper is to examine issues in the inclusion of simulations of ecosystem functions in agent-based models of land use decision-making. The reasons for incorporating these simulations include local interests in land fertility and global interests in carbon sequestration. Biogeochemical models are needed in order to calculate such fluxes. The Century model is described with particular attention to the land use choices that it can encompass. When Century is applied to a land use problem the combinatorial choices lead to a potentially unmanageable number of simulation runs. Century is also parameter-intensive. Three ways of including Century output in agent-based models, ranging from separately calculated look-up tables to agents running Century within the simulation, are presented. The latter may be most efficient, but it moves the computing costs to where they are most problematic. Concern for computing costs should not be a roadblock. PMID:26191077

  12. Ideal free distribution or dynamic game? An agent-based simulation study of trawling strategies with varying information

    NASA Astrophysics Data System (ADS)

    Beecham, J. A.; Engelhard, G. H.

    2007-10-01

    An ecological economic model of trawling is presented to demonstrate the effect of trawling location choice strategy on net input (rate of economic gain from fish caught per time spent, less costs). Fishing location choice is considered to be a dynamic process whereby trawlers choose from among a repertoire of plastic strategies that they modify if their gains fall below a fixed proportion of the mean gains of the fleet as a whole. The distribution of fishing across different areas of a fishery follows an approximate ideal free distribution (IFD) with varying noise due to uncertainty. The least-productive areas are not utilised because initial net input never reaches the mean yield of better areas subject to competitive exploitation. In cases where there is a weak temporal autocorrelation between fish stocks in a specific location, a plastic strategy of local translocation between trawls mixed with longer-range translocation increases realised input. The trawler can change its translocation strategy in the light of information about recent trawling success compared to its long-term average but, in contrast to predictions of the Marginal Value Theorem (MVT) model, does not know for certain what it will find by moving, so may need to sample new patches. The combination of the two types of translocation mirrored beam-trawling strategies used by the Dutch fleet, and the resultant distribution of trawling effort is confirmed by analysis of the historical effort distribution of British otter trawling fleets in the North Sea. Fisheries exploitation represents an area where dynamic agent-based adaptive models may be a better representation of the economic dynamics of a fleet than classically inspired optimisation models.
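
    A simplified Python sketch of the strategy-switching idea described above (not the authors' model): patch quality is depleted by effort and slowly regenerates, boats on the same patch share its yield, and a trawler whose running gain falls below a fixed fraction of the fleet mean switches between local and long-range translocation. All rates and thresholds are illustrative assumptions.

    import random

    def run(n_patches=20, n_boats=10, steps=500, threshold=0.8, seed=0):
        rng = random.Random(seed)
        quality = [rng.uniform(0.5, 1.5) for _ in range(n_patches)]    # fish density per patch
        location = [rng.randrange(n_patches) for _ in range(n_boats)]
        strategy = ["local"] * n_boats
        gains = [0.0] * n_boats
        for _ in range(steps):
            counts = [location.count(p) for p in range(n_patches)]
            step_gain = [quality[location[b]] / counts[location[b]] for b in range(n_boats)]
            gains = [0.9 * g + 0.1 * s for g, s in zip(gains, step_gain)]   # running-average gain
            for p in range(n_patches):                                      # depletion plus regrowth
                quality[p] = max(0.1, quality[p] - 0.05 * counts[p]) + 0.02
            fleet_mean = sum(gains) / n_boats
            for b in range(n_boats):
                if gains[b] < threshold * fleet_mean:          # under-performing: switch strategy
                    strategy[b] = "long" if strategy[b] == "local" else "local"
                if strategy[b] == "local":
                    hop = rng.choice([-1, 1])                  # translocate to an adjacent patch
                else:
                    hop = rng.randrange(n_patches)             # long-range translocation
                location[b] = (location[b] + hop) % n_patches
        return strategy, [round(q, 2) for q in quality]

    if __name__ == "__main__":
        print(run())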

  13. Computer simulation of microstructure

    NASA Astrophysics Data System (ADS)

    Xu, Ping; Morris, J. W.

    1992-11-01

    The microstructure that results from a martensitic transformation is largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, it is useful to have computer simulation models that mimic the process. One such model is a finite-element model in which the transforming body is divided into elementary cells that transform when it is energetically favorable to do so. Using the linear elastic theory, the elastic energy of an arbitrary distribution of transformed cells can be calculated, and the elastic strain field can be monitored as the transformation proceeds. In the present article, a model of this type is developed and evaluated by testing its ability to generate the preferred configurations of isolated martensite particles, which can be predicted analytically from the linear elastic theory. Both two- and three-dimensional versions of the model are used. The computer model is in good agreement with analytic theory when the latter predicts single-variant martensite particles. The three-dimensional model also generates twinned martensite in reasonable agreement with the analytic predictions when the fractions of the two variants in the particle are near 0.5. It is less successful in reproducing twinned martensites when one variant is dominant; however, in this case, it does produce unusual morphologies, such as “butterfly martensite,” that have been observed experimentally. Neither the analytic theory nor the computer simulation predicts twinned martensites in the two-dimensional transformations considered here, revealing an inherent limitation of studies that are restricted to two dimensions.

  14. [The assessment of the action of pharmacological agents based on a new computer program for analysing animal operant behavior].

    PubMed

    Garibova, T L; Voronina, T A; Stefankov, D V; Kalinina, T S

    1990-01-01

    A new programme has been developed for the automation and analysis of animal operant behavior experiments in the Skinner box (Lafayette Instrument Co., USA) by means of an Apple IIe computer (USA). The basis of the programme is the division of the training procedure into different functional intervals. Operant behavior of rats is determined by diverse schedules of food and water reinforcement and electric shock. Rats were trained to respond on FR 20 and FI 1 schedules and in a drug discrimination task. Phenazepam (2 mg/kg) markedly decreases the number of responses on the FR 20 schedule, and phenazepam is a discriminable stimulus. The experimental results make it possible to use the programme for modelling various forms of operant behavior and analysing the pharmacological properties of well-known and new drugs.

  15. A hybrid agent-based approach for modeling microbiological systems.

    PubMed

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt differential equation or agent-based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the multi-agent approach often use directly translated, and quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach where biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2x10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
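
    A hedged Python sketch of the hybrid idea described above (cells as discrete agents, molecules as continuous quantities), not the authors' model: a one-dimensional attractant field evolves by a discretised diffusion/consumption rule, while cell agents follow a simple rule that biases their movement up the local gradient. Grid size, rates, and bias strength are illustrative.

    import random

    def diffuse(field, consumption, D=0.2):
        n = len(field)
        new = field[:]
        for i in range(n):
            left, right = field[(i - 1) % n], field[(i + 1) % n]
            new[i] = field[i] + D * (left + right - 2.0 * field[i])   # discrete diffusion
            new[i] = max(0.0, new[i] - consumption[i])                # local depletion by cells
        return new

    def run(n_sites=100, n_cells=50, steps=300, seed=0):
        rng = random.Random(seed)
        field = [1.0 if i > n_sites // 2 else 0.1 for i in range(n_sites)]   # attractant gradient
        cells = [rng.randrange(n_sites // 4) for _ in range(n_cells)]        # start on the low side
        for _ in range(steps):
            consumption = [0.0] * n_sites
            moved = []
            for x in cells:
                right, left = field[(x + 1) % n_sites], field[(x - 1) % n_sites]
                p_right = 0.5 + 0.4 * (right - left) / (right + left + 1e-9)  # gradient-biased step
                x = (x + 1) % n_sites if rng.random() < p_right else (x - 1) % n_sites
                consumption[x] += 0.01                                        # agent feeds back on the field
                moved.append(x)
            cells = moved
            field = diffuse(field, consumption)
        return round(sum(cells) / len(cells), 1)    # mean position drifts toward high attractant

    if __name__ == "__main__":
        print("mean final cell position:", run())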

  16. Learning to Measure Biodiversity: Two Agent-Based Models that Simulate Sampling Methods & Provide Data for Calculating Diversity Indices

    ERIC Educational Resources Information Center

    Jones, Thomas; Laughlin, Thomas

    2009-01-01

    Nothing could be more effective than a wilderness experience to demonstrate the importance of conserving biodiversity. When that is not possible, though, there are computer models with several features that are helpful in understanding how biodiversity is measured. These models are easily used when natural resources, transportation, and time…

  17. Autonomous-Agent Based Simulation of Anti- Submarine Warfare Operations with the Goal of Protecting a High Value Unit

    DTIC Science & Technology

    2004-03-01

    The simulation is based on the Multi Agent System (MAS) technique. The simulation interface is a Horizontal Display Center (HDC) which is very similar to a MEKO 200 class frigate Combat Information Center's (CIC) HDC. The program uses Extensible Markup Language (XML) files for reading data for program scenarios; parameters are initialized before each run time begins. The simulation also provides all the output data at the end of run time for analysis purposes. The program user's goal, and the purpose of the program, is to decrease the number of successful attacks against surface

  18. A framework for the use of agent based modeling to simulate inter- and intraindividual variation in human behaviors

    EPA Science Inventory

    Simulation of human behavior in exposure modeling is a complex task. Traditionally, inter-individual variation in human activity has been modeled by drawing from a pool of single day time-activity diaries such as the US EPA Consolidated Human Activity Database (CHAD). Here, an ag...

  19. EPOCHS: A Platform for Agent-Based Electric Power and Communication Simulation Built from Commercial Off-The-Shelf Components

    DTIC Science & Technology

    2005-04-01

    PSCAD, produced by the Manitoba HVDC Research Centre, is used to simplify the development of EMTDC scenarios; EMTDC simulates the power scenarios.

  20. Integrating the simulation of domestic water demand behaviour to an urban water model using agent based modelling

    NASA Astrophysics Data System (ADS)

    Koutiva, Ifigeneia; Makropoulos, Christos

    2015-04-01

    The urban water system's sustainable evolution requires tools that can analyse and simulate the complete cycle, including both the physical and the cultural environment. One of the main challenges, in this regard, is the design and development of tools that are able to simulate society's water demand behaviour and the way policy measures affect it. The effects of these policy measures are a function of personal opinions that subsequently lead to the formation of people's attitudes, and these attitudes eventually form behaviours. This work presents the design of an ABM tool for addressing the social dimension of the urban water system. The created tool, called the Urban Water Agents' Behaviour (UWAB) model, was implemented using the NetLogo agent programming language. The main aim of the UWAB model is to capture the effects of policies and environmental pressures on the water conservation behaviour of urban households. The model consists of agents representing urban households that are linked to each other, creating a social network that influences the water conservation behaviour of its members. Household agents are also influenced by policies and environmental pressures, such as drought. The UWAB model simulates behaviour resulting in the evolution of water conservation within an urban population. The final outcome of the model is the evolution of the distribution of different conservation levels (no, low, high) across the selected urban population. In addition, UWAB is implemented in combination with an existing urban water management simulation tool, the Urban Water Optioneering Tool (UWOT), in order to create a modelling platform aiming to facilitate an adaptive approach to water resources management. For the purposes of this proposed modelling platform, UWOT is used in a twofold manner: (1) to simulate domestic water demand evolution and (2) to simulate the response of the water system to the domestic water demand evolution. The main advantage of the UWAB - UWOT model

  1. Understanding Islamist political violence through computational social simulation

    SciTech Connect

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G; Eberhardt, Ariane; Stradling, Seth G

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  2. Who's your neighbor? neighbor identification for agent-based modeling.

    SciTech Connect

    Macal, C. M.; Howe, T. R.; Decision and Information Sciences; Univ. of Chicago

    2006-01-01

    Agent-based modeling and simulation, based on the cellular automata paradigm, is an approach to modeling complex systems comprised of interacting autonomous agents. Open questions in agent-based simulation focus on scale-up issues encountered in simulating large numbers of agents. Specifically, how many agents can be included in a workable agent-based simulation? One of the basic tenets of agent-based modeling and simulation is that agents only interact and exchange locally available information with other agents located in their immediate proximity or neighborhood of the space in which the agents are situated. Generally, an agent's set of neighbors changes rapidly as a simulation proceeds through time and as the agents move through space. Depending on the topology defined for agent interactions, proximity may be defined by spatial distance for continuous space, adjacency for grid cells (as in cellular automata), or by connectivity in social networks. Identifying an agent's neighbors is a particularly time-consuming computational task and can dominate the computational effort in a simulation. Two challenges in agent simulation are (1) efficiently representing an agent's neighborhood and the neighbors in it and (2) efficiently identifying an agent's neighbors at any time in the simulation. These problems are addressed differently for different agent interaction topologies. While efficient approaches have been identified for agent neighborhood representation and neighbor identification for agents on a lattice with general neighborhood configurations, other techniques must be used when agents are able to move freely in space. Techniques for the analysis and representation of spatial data are applicable to the agent neighbor identification problem. This paper extends agent neighborhood simulation techniques from the lattice topology to continuous space, specifically R2. Algorithms based on hierarchical (quad trees) or non-hierarchical data structures (grid cells) are
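
    An illustrative Python sketch of the non-hierarchical (uniform grid cell) approach to neighbour identification for agents moving freely in R2, as discussed above: each agent is binned into a cell whose side equals the interaction radius, so a neighbourhood query only inspects the 3x3 block of surrounding cells instead of all agents.

    import math
    import random
    from collections import defaultdict

    def build_grid(positions, radius):
        grid = defaultdict(list)
        for idx, (x, y) in enumerate(positions):
            grid[(int(x // radius), int(y // radius))].append(idx)
        return grid

    def neighbours(idx, positions, grid, radius):
        x, y = positions[idx]
        cx, cy = int(x // radius), int(y // radius)
        found = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in grid.get((cx + dx, cy + dy), []):
                    if j != idx and math.dist(positions[idx], positions[j]) <= radius:
                        found.append(j)
        return found

    if __name__ == "__main__":
        rng = random.Random(1)
        positions = [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(5000)]
        grid = build_grid(positions, radius=2.0)
        print("agent 0 has", len(neighbours(0, positions, grid, radius=2.0)), "neighbours")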

  3. Grid computing and biomolecular simulation.

    PubMed

    Woods, Christopher J; Ng, Muan Hong; Johnston, Steven; Murdock, Stuart E; Wu, Bing; Tai, Kaihsu; Fangohr, Hans; Jeffreys, Paul; Cox, Simon; Frey, Jeremy G; Sansom, Mark S P; Essex, Jonathan W

    2005-08-15

    Biomolecular computer simulations are now widely used not only in an academic setting to understand the fundamental role of molecular dynamics on biological function, but also in the industrial context to assist in drug design. In this paper, two applications of Grid computing to this area will be outlined. The first, involving the coupling of distributed computing resources to dedicated Beowulf clusters, is targeted at simulating protein conformational change using the Replica Exchange methodology. In the second, the rationale and design of a database of biomolecular simulation trajectories is described. Both applications illustrate the increasingly important role modern computational methods are playing in the life sciences.

  4. Use of an agent-based simulation model to evaluate a mobile-based system for supporting emergency evacuation decision making.

    PubMed

    Tian, Yu; Zhou, Tian-Shu; Yao, Qin; Zhang, Mao; Li, Jing-Song

    2014-12-01

    Recently, mass casualty incidents (MCIs) have been occurring frequently and have gained international attention. There is an urgent need for scientifically proven and effective emergency responses to MCIs, particularly as the severity of incidents is continuously increasing. The emergency response to MCIs is a multi-dimensional and multi-participant dynamic process that changes in real-time. The evacuation decisions that assign casualties to different hospitals in a region are very important and impact both the results of emergency treatment and the efficiency of medical resource utilization. Previously, decisions related to casualty evacuation were made by an incident commander with emergency experience and in accordance with macro emergency guidelines. There are few decision-supporting tools available to reduce the difficulty and psychological pressure associated with the evacuation decisions an incident commander must make. In this study, we have designed a mobile-based system to collect medical and temporal data produced during an emergency response to an MCI. Using this information, our system's decision-making model can provide personal evacuation suggestions that improve the overall outcome of an emergency response. The effectiveness of our system in reducing overall mortality has been validated by an agent-based simulation model established to simulate an emergency response to an MCI.

  5. Development of a three-dimensional multiscale agent-based tumor model: simulating gene-protein interaction profiles, cell phenotypes and multicellular patterns in brain cancer.

    PubMed

    Zhang, Le; Athale, Chaitanya A; Deisboeck, Thomas S

    2007-01-07

    Experimental evidence suggests that epidermal growth factor receptor (EGFR)-mediated activation of the signaling protein phospholipase Cgamma plays a critical role in a cancer cell's phenotypic decision to either proliferate or to migrate at a given point in time. Here, we present a novel three-dimensional multiscale agent-based model to simulate this cellular decision process in the context of a virtual brain tumor. Each tumor cell is equipped with an EGFR gene-protein interaction network module that also connects to a simplified cell cycle description. The simulation results show that over time proliferative and migratory cell populations not only oscillate but also directly impact the spatio-temporal expansion patterns of the entire cancer system. The percentage change in the concentration of the sub-cellular interaction network's molecular components fluctuates, and, for the 'proliferation-to-migration' switch we find that the phenotype triggering molecular profile to some degree varies as the tumor system grows and the microenvironment changes. We discuss potential implications of these findings for experimental and clinical cancer research.

  6. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  7. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  8. An agent based model of genotype editing

    SciTech Connect

    Rocha, L. M.; Huang, C. F.

    2004-01-01

    This paper presents our investigation of an agent-based model of Genotype Editing. This model is based on several characteristics that are gleaned from the RNA editing system as observed in several organisms. The incorporation of editing mechanisms in an evolutionary agent-based model provides a means for evolving agents with heterogeneous post-transcriptional processes. The study of this agent-based genotype-editing model has shed some light on the evolutionary implications of RNA editing and has also established an advantageous evolutionary computation algorithm for machine learning. We expect that our proposed model may both facilitate determining the evolutionary role of RNA editing in biology and advance the current state of research in agent-based optimization.

  9. Agent-based models in translational systems biology

    PubMed Central

    An, Gary; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram

    2013-01-01

    Effective translational methodologies for knowledge representation are needed in order to make strides against the constellation of diseases that affect the world today. These diseases are defined by their mechanistic complexity, redundancy, and nonlinearity. Translational systems biology aims to harness the power of computational simulation to streamline drug/device design, simulate clinical trials, and eventually to predict the effects of drugs on individuals. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for translational systems biology. This review describes agent-based modeling and gives examples of its translational applications in the context of acute inflammation and wound healing. PMID:20835989

  10. Computational Modeling and Simulation of Genital Tubercle ...

    EPA Pesticide Factsheets

    Hypospadias is a developmental defect of urethral tube closure that has a complex etiology. Here, we describe a multicellular agent-based model of genital tubercle development that simulates urethrogenesis from the urethral plate stage to urethral tube closure in differentiating male embryos. The model, constructed in CompuCell3D, implemented spatially dynamic signals from SHH, FGF10, and androgen signaling pathways. These signals modulated stochastic cell behaviors, such as differential adhesion, cell motility, proliferation, and apoptosis. Urethral tube closure was an emergent property of the model that was quantitatively dependent on SHH and FGF10 induced effects on mesenchymal proliferation and endodermal apoptosis, ultimately linked to androgen signaling. In the absence of androgenization, simulated genital tubercle development defaulted to the female condition. Intermediate phenotypes associated with partial androgen deficiency resulted in incomplete closure. Using this computer model, complex relationships between urethral tube closure defects and disruption of underlying signaling pathways could be probed theoretically in multiplex disturbance scenarios and modeled into probabilistic predictions for individual risk for hypospadias and potentially other developmental defects of the male genital tubercle. We identify the minimal molecular network that determines the outcome of male genital tubercle development in mice.

  11. Computer simulations of the mouse spermatogenic cycle.

    PubMed

    Ray, Debjit; Pitts, Philip B; Hogarth, Cathryn A; Whitmore, Leanne S; Griswold, Michael D; Ye, Ping

    2014-12-12

    The spermatogenic cycle describes the periodic development of germ cells in the testicular tissue. The temporal-spatial dynamics of the cycle highlight the unique, complex, and interdependent interaction between germ and somatic cells, and are the key to continual sperm production. Although understanding the spermatogenic cycle has important clinical relevance for male fertility and contraception, there are a number of experimental obstacles. For example, the lengthy process cannot be visualized through dynamic imaging, and the precise action of germ cells that leads to the emergence of testicular morphology remains uncharacterized. Here, we report an agent-based model that simulates the mouse spermatogenic cycle on a cross-section of the seminiferous tubule over a time scale of hours to years, while considering feedback regulation, mitotic and meiotic division, differentiation, apoptosis, and movement. The computer model is able to elaborate the germ cell dynamics in a time-lapse movie format, allowing us to trace individual cells as they change state and location. More importantly, the model provides mechanistic understanding of the fundamentals of male fertility, namely how testicular morphology and sperm production are achieved. By manipulating cellular behaviors either individually or collectively in silico, the model predicts causal events for the altered arrangement of germ cells upon genetic or environmental perturbations. This in silico platform can serve as an interactive tool to perform long-term simulation and to identify optimal approaches for infertility treatment and contraceptive development.

  12. Computer Simulation of Mutagenesis.

    ERIC Educational Resources Information Center

    North, J. C.; Dent, M. T.

    1978-01-01

    A FORTRAN program is described which simulates point-substitution mutations in the DNA strands of typical organisms. Its objective is to help students to understand the significance and structure of the genetic code, and the mechanisms and effect of mutagenesis. (Author/BB)
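
    The kind of exercise such a program supports can be sketched in a few lines of Python: apply a random point substitution to a coding strand and translate before and after to see whether the mutation is silent, missense, or nonsense. This is only a hedged illustration of the idea, not the FORTRAN program described; the tiny codon table and the helper names are placeholders.

      import random

      # Tiny codon table for illustration only; a full standard genetic code
      # would be needed for real sequences.
      CODON_TABLE = {
          "ATG": "Met", "TGG": "Trp", "TTT": "Phe", "TTC": "Phe",
          "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
          "GGA": "Gly", "GGC": "Gly", "GAA": "Glu", "GCA": "Ala",
      }

      def translate(dna):
          """Translate a coding strand codon by codon (unknown codons -> 'Xxx')."""
          return [CODON_TABLE.get(dna[i:i + 3], "Xxx") for i in range(0, len(dna) - 2, 3)]

      def point_mutation(dna, rng):
          """Substitute one randomly chosen base with a different base."""
          pos = rng.randrange(len(dna))
          new_base = rng.choice([b for b in "ACGT" if b != dna[pos]])
          return dna[:pos] + new_base + dna[pos + 1:], pos

      rng = random.Random(42)
      original = "ATGGGAGAAGCATGG"
      mutant, pos = point_mutation(original, rng)
      print("original:", original, translate(original))
      print("mutant:  ", mutant, translate(mutant), "(substitution at position %d)" % pos)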

  13. Computer Simulations: An Integrating Tool.

    ERIC Educational Resources Information Center

    Bilan, Bohdan J.

    This introduction to computer simulations as an integrated learning experience reports on their use with students in grades 5 through 10 using commercial software packages such as SimCity, SimAnt, SimEarth, and Civilization. Students spent an average of 60 hours with the simulation games and reported their experiences each week in a personal log.…

  14. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    NASA Technical Reports Server (NTRS)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent Base Models (ABM) to optimize large-scale network handling capabilities for large system inventories and to implement strategies for the purpose of reducing capital expenses. The models used in this paper rely either on computational algorithms or on procedure implementations developed in Matlab to simulate agent-based models; the simulations are run on clusters, which supply the high-performance computing needed to execute the programs in parallel. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.

  15. Composite Erosion by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    Composite degradation is evaluated by computational simulation when the erosion degradation occurs on a ply-by-ply basis and the degrading medium (device) is normal to the ply. The computational simulation is performed by a multi factor interaction model and by a multi scale and multi physics available computer code. The erosion process degrades both the fiber and the matrix simultaneously in the same slice (ply). Both the fiber volume ratio and the matrix volume ratio approach zero while the void volume ratio increases as the ply degrades. The multi factor interaction model simulates the erosion degradation, provided that the exponents and factor ratios are selected judiciously. Results obtained by the computational composite mechanics show that most composite characterization properties degrade monotonically and approach "zero" as the ply degrades completely.

  16. Agent-Based Computational Modeling of Cell Culture: Understanding Dosimetry In Vitro as Part of In Vitro to In Vivo Extrapolation

    EPA Science Inventory

    Quantitative characterization of cellular dose in vitro is needed for alignment of doses in vitro and in vivo. We used the agent-based software, CompuCell3D (CC3D), to provide a stochastic description of cell growth in culture. The model was configured so that isolated cells assu...

  17. Investigating biocomplexity through the agent-based paradigm

    PubMed Central

    Kaul, Himanshu

    2015-01-01

    Capturing the dynamism that pervades biological systems requires a computational approach that can accommodate both the continuous features of the system environment as well as the flexible and heterogeneous nature of component interactions. This presents a serious challenge for the more traditional mathematical approaches that assume component homogeneity to relate system observables using mathematical equations. While the homogeneity condition does not lead to loss of accuracy while simulating various continua, it fails to offer detailed solutions when applied to systems with dynamically interacting heterogeneous components. As the functionality and architecture of most biological systems is a product of multi-faceted individual interactions at the sub-system level, continuum models rarely offer much beyond qualitative similarity. Agent-based modelling is a class of algorithmic computational approaches that rely on interactions between Turing-complete finite-state machines—or agents—to simulate, from the bottom-up, macroscopic properties of a system. In recognizing the heterogeneity condition, they offer suitable ontologies to the system components being modelled, thereby succeeding where their continuum counterparts tend to struggle. Furthermore, being inherently hierarchical, they are quite amenable to coupling with other computational paradigms. The integration of any agent-based framework with continuum models is arguably the most elegant and precise way of representing biological systems. Although in its nascence, agent-based modelling has been utilized to model biological complexity across a broad range of biological scales (from cells to societies). In this article, we explore the reasons that make agent-based modelling the most precise approach to model biological systems that tend to be non-linear and complex. PMID:24227161

  18. Computer simulation of astrophysical plasmas

    NASA Technical Reports Server (NTRS)

    Max, Claire E.

    1991-01-01

    The role of sophisticated numerical models and simulations in the field of plasma astrophysics is discussed. The need for an iteration between microphysics and macrophysics in order for astrophysical plasma physics to produce quantitative results that can be related to astronomical data is stressed. A discussion on computational requirements for simulations of astrophysical plasmas contrasts microscopic plasma simulations with macroscopic system models. An overview of particle-in-cell simulations (PICS) is given and two examples of PICS of astrophysical plasma are discussed including particle acceleration by collisionless shocks in relativistic plasmas and magnetic field reconnection in astrophysical plasmas.

  19. Simulating chemistry using quantum computers.

    PubMed

    Kassal, Ivan; Whitfield, James D; Perdomo-Ortiz, Alejandro; Yung, Man-Hong; Aspuru-Guzik, Alán

    2011-01-01

    The difficulty of simulating quantum systems, well known to quantum chemists, prompted the idea of quantum computation. One can avoid the steep scaling associated with the exact simulation of increasingly large quantum systems on conventional computers, by mapping the quantum system to another, more controllable one. In this review, we discuss to what extent the ideas in quantum computation, now a well-established field, have been applied to chemical problems. We describe algorithms that achieve significant advantages for the electronic-structure problem, the simulation of chemical dynamics, protein folding, and other tasks. Although theory is still ahead of experiment, we outline recent advances that have led to the first chemical calculations on small quantum information processors.

  20. Computer Simulation Of Cyclic Oxidation

    NASA Technical Reports Server (NTRS)

    Probst, H. B.; Lowell, C. E.

    1990-01-01

    Computer model developed to simulate cyclic oxidation of metals. With relatively few input parameters, kinetics of cyclic oxidation simulated for wide variety of temperatures, durations of cycles, and total numbers of cycles. Program written in BASICA and run on any IBM-compatible microcomputer. Used in variety of ways to aid experimental research. In minutes, effects of duration of cycle and/or number of cycles on oxidation kinetics of material surveyed.
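
    A hedged sketch of the kind of cyclic-oxidation kinetics such a program computes is given below: each hot cycle grows the oxide parabolically and each cooldown spalls a fixed fraction of the retained scale, so the specimen first gains weight and later loses it. The parabolic constant, spall fraction, and oxygen mass fraction are invented placeholders, and the spall rule is a simplification, not the model used in the NASA code.

      # All constants below are illustrative placeholders, not values from the
      # NASA program: kp is a parabolic rate constant, spall_fraction the share
      # of retained oxide lost on each cooldown, oxygen_fraction the oxygen mass
      # fraction of the oxide.
      def cyclic_oxidation(kp=0.01, spall_fraction=0.05, oxygen_fraction=0.3,
                           cycle_hours=1.0, n_cycles=200):
          """Return the specimen specific weight change (mg/cm^2) after each cycle."""
          retained_oxide = 0.0    # oxide mass per area still attached
          spalled_total = 0.0     # oxide mass per area lost over all cycles
          history = []
          for _ in range(n_cycles):
              # hot portion of the cycle: grow the retained scale parabolically
              grown = (retained_oxide ** 2 + kp * cycle_hours) ** 0.5
              # cooldown: a fixed fraction of the scale spalls off
              spalled = spall_fraction * grown
              retained_oxide = grown - spalled
              spalled_total += spalled
              # weight change = oxygen held in retained oxide
              #                 minus metal carried away in spalled oxide
              history.append(oxygen_fraction * retained_oxide
                             - (1.0 - oxygen_fraction) * spalled_total)
          return history

      curve = cyclic_oxidation()
      print("cycle  10: %+.4f mg/cm^2" % curve[9])
      print("cycle 200: %+.4f mg/cm^2" % curve[-1])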

  1. Computer Simulation of Diffraction Patterns.

    ERIC Educational Resources Information Center

    Dodd, N. A.

    1983-01-01

    Describes an Apple computer program (listing available from author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows user to experiment with different shaped multiple apertures. Graphics output include vector resultants, phase difference, diffraction patterns, and the Cornu spiral…

  2. Evolutionary game theory using agent-based methods.

    PubMed

    Adami, Christoph; Schossau, Jory; Hintze, Arend

    2016-12-01

    Evolutionary game theory is a successful mathematical framework geared towards understanding the selective pressures that affect the evolution of the strategies of agents engaged in interactions with potential conflicts. While a mathematical treatment of the costs and benefits of decisions can predict the optimal strategy in simple settings, more realistic settings such as finite populations, non-vanishing mutation rates, stochastic decisions, communication between agents, and spatial interactions require agent-based methods where each agent is modeled as an individual, carries its own genes that determine its decisions, and where the evolutionary outcome can only be ascertained by evolving the population of agents forward in time. While highlighting standard mathematical results, we compare those to agent-based methods that can go beyond the limitations of equations and simulate the complexity of heterogeneous populations and an ever-changing set of interactors. We conclude that agent-based methods can predict evolutionary outcomes where purely mathematical treatments cannot tread (for example in the weak selection-strong mutation limit), but that mathematics is crucial to validate the computational simulations.
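
    A minimal agent-based sketch of this approach is shown below, assuming a Prisoner's Dilemma payoff matrix, a finite well-mixed population, a Moran-style birth-death update, and a non-vanishing mutation rate. All payoff values and parameters are illustrative; the point is only that the long-run strategy mix is obtained by evolving the agent population forward in time rather than by solving equations.

      import random

      # Prisoner's Dilemma payoffs: PAYOFF[my_strategy][opponent_strategy],
      # strategies 0 = cooperate, 1 = defect (values are illustrative).
      PAYOFF = [[3.0, 0.0],
                [5.0, 1.0]]

      def step(population, mutation_rate, rng):
          """One Moran-style birth-death update with mutation."""
          n = len(population)
          # fitness = average payoff against everyone else in the population
          fitness = []
          for i, s in enumerate(population):
              fitness.append(sum(PAYOFF[s][population[j]]
                                 for j in range(n) if j != i) / (n - 1))
          # pick a parent proportionally to fitness, replace a random agent
          parent = rng.choices(range(n), weights=fitness, k=1)[0]
          offspring = population[parent]
          if rng.random() < mutation_rate:
              offspring = 1 - offspring          # mutation flips the strategy
          population[rng.randrange(n)] = offspring

      def run(n=50, mutation_rate=0.05, steps=5000, seed=1):
          rng = random.Random(seed)
          population = [rng.randrange(2) for _ in range(n)]
          for _ in range(steps):
              step(population, mutation_rate, rng)
          return sum(population) / n             # final fraction of defectors

      print("final defector fraction:", run())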

  3. Parallel Computing for Brain Simulation.

    PubMed

    Pastur-Romay, L A; Porto-Pazos, A B; Cedrón, F; Pazos, A

    2016-11-04

    The human brain is the most complex system in the known universe, yet it remains the least understood. It endows human beings with extraordinary capacities, but we do not yet understand how and why most of these capacities arise. For decades, researchers have tried to make computers reproduce these capacities, both to help understand the nervous system and to process data more efficiently than before; the aim is to make computers process information the way the brain does. Important technological developments and large multidisciplinary projects have made it possible to create the first simulations with a number of neurons comparable to that of the human brain. This paper presents an updated review of the main research projects that are trying to simulate and/or emulate the human brain. These projects employ different types of computational models using parallel computing: digital, analog, and hybrid models. The review covers the current applications of these works as well as future trends. We have reviewed works that seek a step forward in Neuroscience and others that pursue breakthroughs in Computer Science (neuromorphic hardware, machine learning techniques). We summarize their most outstanding characteristics and present the latest advances and future plans. In addition, this review highlights the importance of considering more than neurons: computational models of the brain should include glial cells, given the proven importance of astrocytes in information processing.

  4. Computer simulation of martensitic transformations

    SciTech Connect

    Xu, Ping

    1993-11-01

    The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.
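
    The athermal, greedy update rule described above can be sketched as follows: at every step the untransformed cell whose transformation lowers the total free energy the most is transformed, and the simulation stops when no transformation is favorable. The energy terms below (a constant chemical driving force and a per-neighbor misfit penalty) are placeholders for the linear-elasticity calculation of the actual model.

      import random

      SIZE = 12                  # small 2-D grid of transformable cells
      DRIVING_FORCE = 1.0        # energy gained by transforming one cell
      NEIGHBOR_PENALTY = 0.3     # cost per untransformed neighbor (misfit proxy)

      def energy_change(grid, i, j):
          """Free-energy change if cell (i, j) transforms (negative = favorable)."""
          untransformed = 0
          for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
              if not grid[(i + di) % SIZE][(j + dj) % SIZE]:
                  untransformed += 1
          return -DRIVING_FORCE + NEIGHBOR_PENALTY * untransformed

      def simulate(seed=0):
          rng = random.Random(seed)
          grid = [[False] * SIZE for _ in range(SIZE)]
          grid[rng.randrange(SIZE)][rng.randrange(SIZE)] = True   # nucleation site
          while True:
              candidates = [(energy_change(grid, i, j), i, j)
                            for i in range(SIZE) for j in range(SIZE) if not grid[i][j]]
              if not candidates:
                  break
              best_de, i, j = min(candidates)
              if best_de >= 0:       # athermal rule: stop when nothing lowers the energy
                  break
              grid[i][j] = True      # transform the most favorable cell
          return sum(row.count(True) for row in grid)

      print("transformed cells:", simulate(), "of", SIZE * SIZE)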

  5. Biomes computed from simulated climatologies

    NASA Astrophysics Data System (ADS)

    Claussen, Martin; Esch, Monika

    1994-01-01

    The biome model of Prentice et al. (1992a) is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fur Meteorologie. This study is undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes.

  6. Biomes computed from simulated climatologies

    SciTech Connect

    Claussen, M.; Esch, M.

    1994-01-01

    The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study was undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to failures in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO2 concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes. 15 refs., 8 figs., 2 tabs.

  7. Inversion based on computational simulations

    SciTech Connect

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-09-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal.
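
    The gradient-based inversion loop can be sketched with a toy linear forward model, for which the gradient of the least-squares objective is simply A^T times the residual; this plays the role that adjoint differentiation plays for a full time-dependent simulation, giving the gradient with respect to all parameters at roughly the cost of one extra forward-like pass. Everything below (the random forward operator, the noise level, the fixed step size) is an invented stand-in, not the optical-tomography application of the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      n_data, n_params = 40, 5
      A = rng.normal(size=(n_data, n_params))          # stand-in forward simulation
      p_true = np.array([1.0, -2.0, 0.5, 3.0, -1.0])
      data = A @ p_true + 0.01 * rng.normal(size=n_data)

      def objective_and_gradient(p):
          """Least-squares misfit and its gradient A^T (A p - d)."""
          residual = A @ p - data
          return 0.5 * residual @ residual, A.T @ residual

      # plain gradient descent stands in for the advanced optimizers of the paper
      p = np.zeros(n_params)
      step = 0.01
      for _ in range(2000):
          loss, grad = objective_and_gradient(p)
          p -= step * grad

      print("final misfit:", round(float(loss), 6))
      print("recovered parameters:", np.round(p, 3))
      print("true parameters:     ", p_true)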

  8. Computer simulation in mechanical spectroscopy

    NASA Astrophysics Data System (ADS)

    Blanter, M. S.

    2012-09-01

    Several examples are given of the use of computer simulation in mechanical spectroscopy. On the one hand, simulation makes it possible to study relaxation mechanisms; on the other, it allows the colossal accumulation of experimental material to be used to study metals and alloys. The following examples are considered: the effect of Al atom ordering on the Snoek carbon peak in alloys of the system Fe - Al - C; the effect of plastic strain on Finkel'shtein - Rozin relaxation in Fe - Ni - C austenitic steel; and checking the adequacy of the interaction energies of interstitial atoms, calculated on the basis of a first-principles model, by simulating the concentration dependence of Snoek relaxation parameters in Nb - O.

  9. Agent-Based Modeling of Cancer Stem Cell Driven Solid Tumor Growth.

    PubMed

    Poleszczuk, Jan; Macklin, Paul; Enderling, Heiko

    2016-01-01

    Computational modeling of tumor growth has become an invaluable tool to simulate complex cell-cell interactions and emerging population-level dynamics. Agent-based models are commonly used to describe the behavior and interaction of individual cells in different environments. Behavioral rules can be informed and calibrated by in vitro assays, and emerging population-level dynamics may be validated with both in vitro and in vivo experiments. Here, we describe the design and implementation of a lattice-based agent-based model of cancer stem cell driven tumor growth.
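
    A minimal lattice-based sketch of stem-cell-driven growth is given below, assuming two cell classes: cancer stem cells that divide without limit and occasionally divide symmetrically, and progenitor cells with a finite proliferation capacity; division requires a free neighboring site. The rates, capacities, and update order are generic illustrations, not the calibrated rules of the chapter.

      import random

      SIZE = 60
      P_SYMMETRIC = 0.1              # chance a stem-cell division yields two stem cells
      MAX_PROGENY_DIVISIONS = 8      # proliferation capacity of non-stem cells
      NEIGHBORS = [(-1, 0), (1, 0), (0, -1), (0, 1)]

      def step(cells, rng):
          """Every cell tries to divide into a randomly chosen free neighboring site."""
          for (x, y), cell in list(cells.items()):
              free = [(x + dx, y + dy) for dx, dy in NEIGHBORS
                      if 0 <= x + dx < SIZE and 0 <= y + dy < SIZE
                      and (x + dx, y + dy) not in cells]
              if not free:
                  continue                       # contact-inhibited: no room to divide
              target = rng.choice(free)
              if cell["stem"]:
                  daughter_is_stem = rng.random() < P_SYMMETRIC
                  cells[target] = {"stem": daughter_is_stem,
                                   "divisions_left": MAX_PROGENY_DIVISIONS}
              else:
                  cell["divisions_left"] -= 1
                  if cell["divisions_left"] <= 0:
                      del cells[(x, y)]          # proliferation capacity exhausted
                      continue
                  cells[target] = {"stem": False,
                                   "divisions_left": cell["divisions_left"]}

      rng = random.Random(7)
      cells = {(SIZE // 2, SIZE // 2): {"stem": True, "divisions_left": 0}}
      for _ in range(60):
          step(cells, rng)
      print("cells after 60 steps:", len(cells),
            "| stem cells:", sum(1 for c in cells.values() if c["stem"]))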

  10. An Agent-Based Cockpit Task Management System

    NASA Technical Reports Server (NTRS)

    Funk, Ken

    1997-01-01

    An agent-based program to facilitate Cockpit Task Management (CTM) in commercial transport aircraft is developed and evaluated. The agent-based program called the AgendaManager (AMgr) is described and evaluated in a part-task simulator study using airline pilots.

  11. Displaying Computer Simulations Of Physical Phenomena

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1991-01-01

    Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.

  12. SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling.

    PubMed

    Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi

    2010-01-01

    Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster.

  13. Agent-Based Modeling in Systems Pharmacology.

    PubMed

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling.

  14. Computer Simulation of Martensitic Transformations.

    NASA Astrophysics Data System (ADS)

    Rifkin, Jonathan A.

    This investigation attempted to determine the mechanism of martensitic nucleation by employing computer molecular dynamics; simulations were conducted of various lattice defects to see if they can serve as nucleation sites. As a prerequisite to the simulations, the relation between transformation properties and interatomic potential was studied. It was found that the interatomic potential must have specific properties to successfully simulate solid-solid transformations; in particular, it needs a long-range oscillating tail. We also studied homogeneous transformations between BCC and FCC structures and concluded it is unlikely that any has a lower energy barrier than the Bain transformation. A two-dimensional solid was modelled first to gain experience on a relatively simple system; the transformation was from a square lattice to a triangular one. Next, a three-dimensional system was studied whose interatomic potential was chosen to mimic sodium. Because of the low transition temperature (18 K), the transformation from the low temperature phase to the high temperature phase was studied (FCC to BCC). The two-dimensional system displayed many phenomena characteristic of real martensitic systems: defects promoted nucleation, the martensite grew in plates, some plates served to nucleate new plates (autocatalytic nucleation), and some defects gave rise to multiple plates (butterfly martensite). The three-dimensional system did not undergo a permanent martensitic transformation, but it did show signs of temporary transformations where some martensite formed and then dissipated. This happened following the dissociation of a screw dislocation into two partial dislocations.

  15. Agent-based computational model of the prevalence of gonococcal infections after the implementation of HIV pre-exposure prophylaxis guidelines

    PubMed Central

    Escobar, Erik; Durgham, Ryan; Dammann, Olaf; Stopka, Thomas J.

    2015-01-01

    Recently, the first comprehensive guidelines were published for pre-exposure prophylaxis (PrEP) for the prevention of HIV infection in populations with substantial risk of infection. Guidelines include a daily regimen of emtricitabine/tenofovir disoproxil fumarate (TDF/FTC) as well as condom usage during sexual activity. The relationship between the TDF/FTC intake regimen and condom usage is not yet fully understood. If men who have sex with men (MSM) engage in high-risk sexual activities without using condoms when prescribed TDF/FTC, they might be at an increased risk for other sexually transmitted diseases (STD). Our study focuses on the possible occurrence of behavioral changes among MSM in the United States over time with regard to condom usage. In particular, we were interested in creating a model of how increased uptake of TDF/FTC might cause a decline in condom usage, causing significant increases in non-HIV STD incidence, using gonococcal infection incidence as a biological endpoint. We used the agent-based modeling software NetLogo, building upon an existing model of HIV infection. We found no significant evidence for increased gonorrhea prevalence due to increased PrEP usage at any level of sample-wide usage, with a range of 0-90% PrEP usage. However, we did find significant evidence for decreased prevalence of HIV, with a maximal effect being reached when 5% to 10% of the MSM population used PrEP. Our findings appear to indicate that attitudes of aversion, within the medical community, toward the promotion of PrEP due to the potential risk of increased STD transmission are unfounded. PMID:26834937

  16. Priority Queues for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in new priority queue data structures for event list management of computer simulations, and includes a new priority queue data structure and an improved event horizon applied to priority queue data structures. The new priority queue data structure is a Qheap and is made out of linked lists for robust, fast, reliable, and stable event list management and uses a temporary unsorted list to store all items until one of the items is needed. Then the list is sorted; next, the highest priority item is removed, and then the rest of the list is inserted in the Qheap. Also, an event horizon is applied to binary tree and splay tree priority queue data structures to form the improved event horizon for event management.
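
    The "temporary unsorted list" strategy described above can be illustrated with a small Python sketch: insertions are held unsorted and only folded into the sorted event list when the next event is actually requested. This is an illustration of the idea only, not the patented Qheap (which is built from linked lists); the class name LazyEventQueue and its methods are invented for the example.

      class LazyEventQueue:
          """Priority queue that batches insertions in an unsorted holding list.

          New events go into an unsorted pending list; only when the next event is
          actually needed is the batch sorted and folded into the main sorted list.
          """

          def __init__(self):
              self._sorted = []       # kept sorted by time, latest first (so pop() is O(1))
              self._pending = []      # unsorted recent insertions

          def push(self, time, event):
              self._pending.append((time, event))

          def pop(self):
              """Remove and return the (time, event) pair with the smallest time."""
              if self._pending:
                  # sort only now that an item is needed, then merge into the main list
                  self._sorted = sorted(self._sorted + self._pending, reverse=True)
                  self._pending = []
              return self._sorted.pop()

          def __len__(self):
              return len(self._sorted) + len(self._pending)

      # usage in a toy event loop
      q = LazyEventQueue()
      for t, name in [(5.0, "B"), (1.0, "A"), (9.0, "C")]:
          q.push(t, name)
      while len(q):
          print(q.pop())        # (1.0, 'A') then (5.0, 'B') then (9.0, 'C')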

  17. Computer Simulation of Radial Immunodiffusion

    PubMed Central

    Trautman, Rodes

    1972-01-01

    Theories of diffusion with chemical reaction are reviewed as to their contributions toward developing an algorithm needed for computer simulation of immunodiffusion. The Spiers-Augustin moving sink and the Engelberg stationary sink theories show how the antibody-antigen reaction can be incorporated into boundary conditions of the free diffusion differential equations. For this, a stoichiometric precipitate was assumed and the location of precipitin lines could be predicted. The Hill simultaneous linear adsorption theory provides a mathematical device for including another special type of antibody-antigen reaction in antigen excess regions of the gel. It permits an explanation for the lowered antigen diffusion coefficient, observed in the Oudin arrangement of single linear diffusion, but does not enable prediction of the location of precipitin lines. The most promising mathematical approach for a general solution is implied in the Augustin alternating cycle theory. This assumes the immunodiffusion process can be evaluated by alternating computation cycles: free diffusion without chemical reaction and chemical reaction without diffusion. The algorithm for the free diffusion update cycle, extended to both linear and radial geometries, is given in detail since it was based on gross flow rather than more conventional expressions in terms of net flow. Limitations on the numerical integration process using this algorithm are illustrated for free diffusion from a cylindrical well. PMID:4629869
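
    The alternating-cycle idea can be sketched with operator splitting in one dimension (an Oudin-type linear arrangement): each step performs free diffusion with no reaction, then reaction with no diffusion, here idealized as instantaneous 1:1 stoichiometric precipitation. This is a hedged illustration only; the grid size, diffusion coefficients, and initial loads below are invented, and the paper's actual algorithm is formulated in terms of gross flow and covers radial geometry as well.

      import numpy as np

      nx, dx, dt = 200, 1.0, 0.2
      D_ag, D_ab = 1.0, 0.5                          # diffusion coefficients
      antigen = np.zeros(nx)
      antigen[:10] = 50.0                            # antigen loaded in a well at x = 0
      antibody = np.full(nx, 1.0)                    # antibody dispersed through the gel
      precipitate = np.zeros(nx)

      def diffuse(c, D):
          """One explicit finite-difference step of free diffusion (no-flux ends)."""
          lap = np.zeros_like(c)
          lap[1:-1] = c[2:] - 2.0 * c[1:-1] + c[:-2]
          lap[0] = c[1] - c[0]
          lap[-1] = c[-2] - c[-1]
          return c + D * dt / dx ** 2 * lap

      for _ in range(3000):
          antigen = diffuse(antigen, D_ag)           # cycle 1: diffusion, no reaction
          antibody = diffuse(antibody, D_ab)
          reacted = np.minimum(antigen, antibody)    # cycle 2: reaction, no diffusion
          antigen -= reacted                         # (instant 1:1 precipitation)
          antibody -= reacted
          precipitate += reacted

      front = int(np.nonzero(precipitate > 0.05)[0].max())
      print("precipitin band extends to grid cell", front)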

  18. Computer simulation studies of minerals

    NASA Astrophysics Data System (ADS)

    Oganov, Artem Romaevich

    Applications of state-of-the-art computer simulations to important Earth- and rock-forming minerals (Al2SiO5 polymorphs, albite (NaAlSi3O8), and MgSiO3 perovskite) are described. Detailed introductions to equations of state and elasticity, phase transitions, computer simulations, and geophysical background are given. A new general classification of phase transitions is proposed, providing a natural framework for discussion of structural, thermodynamic, and kinetic aspects of phase transitions. The concept of critical bond distances is introduced. For Si-O bonds this critical distance is 2.25 A. Using atomistic simulations, anomalous Al-Si antiordering in albite is explained. A first-order isosymmetric transition associated with a change in the ordering scheme is predicted at high pressures. A quantum-mechanical study is presented for the Al2SiO5 polymorphs: kyanite, andalusite, sillimanite, and hypothetical pseudobrookite-like and V3O5-like phases (the latter phase was believed to be the main Al mineral of the lower mantle). It is shown that above 11 GPa all the Al2SiO5 phases break down into the mixture of oxides: corundum (Al2O3) and stishovite (SiO2). Atomisation energies, crystal structures and equations of state of all the Al2SiO5 polymorphs, corundum, stishovite, quartz (SiO2) have been determined. Metastable pressure-induced transitions in sillimanite and andalusite are predicted at ~30-50 GPa and analysed in terms of structural changes and lattice dynamics. Sillimanite (Pbnm) transforms into incommensurate and isosymmetric (Pbnm) phases; andalusite undergoes pressure-induced amorphisation. Accurate quantum-mechanical thermal equation of state is obtained for MgSiO3 perovskite, the main Earth-forming mineral. Results imply that a pure-perovskite mantle is unlikely. I show that MgSiO3 perovskite is not a Debye-like solid, contrary to a common assumption. First ever ab initio molecular dynamics calculations of elastic constants at finite temperatures are

  19. Computer-Graphical Simulation Of Robotic Welding

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken; Cook, George

    1988-01-01

    Computer program ROBOSIM, developed to simulate operations of robots, applied to preliminary design of robotic arc-welding operation. Limitations on equipment investigated in advance to prevent expensive mistakes. Computer makes drawing of robotic welder and workpiece on positioning table. Such numerical simulation used to perform rapid, safe experiments in computer-aided design or manufacturing.

  20. Agent-based model to rural urban migration analysis

    NASA Astrophysics Data System (ADS)

    Silveira, Jaylson J.; Espíndola, Aquino L.; Penna, T. J. P.

    2006-05-01

    In this paper, we analyze the rural-urban migration phenomenon as it is usually observed in economies which are in the early stages of industrialization. The analysis is conducted by means of a statistical mechanics approach which builds a computational agent-based model. Agents are placed on a lattice and the connections among them are described via an Ising-like model. Simulations on this computational model show some emergent properties that are common in developing economies, such as a transitional dynamics characterized by continuous growth of urban population, followed by the equalization of expected wages between rural and urban sectors (Harris-Todaro equilibrium condition), urban concentration and increasing of per capita income.
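
    An Ising-like sketch of the mechanism is given below: each lattice agent is rural (-1) or urban (+1) and reconsiders its sector under a "field" made of an expected wage differential, which shrinks as the urban sector fills, plus imitation of its neighbors, with decision noise. The functional forms, coupling, and temperature are illustrative assumptions, not the specification of the paper; the qualitative outcome is a continuous growth of the urban share toward an equilibrium level.

      import math
      import random

      L = 40                 # agents live on an L x L lattice
      J = 0.5                # strength of neighbor imitation
      BETA = 2.0             # inverse "temperature" (decision noise)
      WAGE_GAP0 = 1.0        # urban-rural wage gap when nobody has migrated

      def wage_gap(urban_share):
          # expected gap falls as urban employment rises (crowding/unemployment)
          return WAGE_GAP0 * (1.0 - 2.0 * urban_share)

      def simulate(sweeps=300, seed=3):
          rng = random.Random(seed)
          spin = [[-1] * L for _ in range(L)]        # -1 = rural, +1 = urban; all start rural
          urban = 0
          for _ in range(sweeps):
              for _ in range(L * L):
                  i, j = rng.randrange(L), rng.randrange(L)
                  neigh = (spin[(i + 1) % L][j] + spin[(i - 1) % L][j]
                           + spin[i][(j + 1) % L] + spin[i][(j - 1) % L])
                  # local "field": wage differential plus imitation of neighbors
                  field = wage_gap(urban / (L * L)) + J * neigh
                  p_urban = 1.0 / (1.0 + math.exp(-2.0 * BETA * field))
                  new = 1 if rng.random() < p_urban else -1
                  if new != spin[i][j]:
                      urban += (new - spin[i][j]) // 2
                      spin[i][j] = new
          return urban / (L * L)

      print("long-run urban share:", round(simulate(), 3))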

  1. Proceedings of the 1991 summer computer simulation conference

    SciTech Connect

    Pace, D.

    1991-01-01

    This book covers the following topics in computer simulation: validation, languages, algorithms, computer performance and advanced processing, intelligent simulation, simulations in power and propulsion systems, and biomedical simulations.

  2. A Large Scale, High Resolution Agent-Based Insurgency Model

    DTIC Science & Technology

    2013-09-30

    for understanding and analyzing human behavior in a civil violence paradigm. This model employed two types of agents: an agent that can become...cognitions and behaviors. Unlike previous agent-based models of civil violence, this work includes the use of a hidden Markov process for simulating...these models can portray real insurgent environments. Keywords simulation · agent based model · insurgency · civil violence · graphics processing

  3. An agent-based microsimulation of critical infrastructure systems

    SciTech Connect

    BARTON,DIANNE C.; STAMBER,KEVIN L.

    2000-03-29

    US infrastructures provide essential services that support the economic prosperity and quality of life. Today, the latest threat to these infrastructures is the increasing complexity and interconnectedness of the system. On balance, added connectivity will improve economic efficiency; however, increased coupling could also result in situations where a disturbance in an isolated infrastructure unexpectedly cascades across diverse infrastructures. An understanding of the behavior of complex systems can be critical to understanding and predicting infrastructure responses to unexpected perturbation. Sandia National Laboratories has developed an agent-based model of critical US infrastructures using time-dependent Monte Carlo methods and a genetic algorithm learning classifier system to control decision making. The model is currently under development and contains agents that represent several areas within the interconnected infrastructures, including electric power and fuel supply. Previous work shows that agent-based simulation models have the potential to improve the accuracy of complex system forecasting and to provide new insights into the factors that are the primary drivers of emergent behaviors in interdependent systems. Simulation results can be examined both computationally and analytically, offering new ways of theorizing about the impact of perturbations to an infrastructure network.

  4. The Shuttle Mission Simulator computer generated imagery

    NASA Technical Reports Server (NTRS)

    Henderson, T. H.

    1984-01-01

    Equipment available in the primary training facility for the Space Transportation System (STS) flight crews includes the Fixed Base Simulator, the Motion Base Simulator, the Spacelab Simulator, and the Guidance and Navigation Simulator. The Shuttle Mission Simulator (SMS) consists of the Fixed Base Simulator and the Motion Base Simulator. The SMS utilizes four visual Computer Generated Image (CGI) systems. The Motion Base Simulator has a forward crew station with six-degrees of freedom motion simulation. Operation of the Spacelab Simulator is planned for the spring of 1983. The Guidance and Navigation Simulator went into operation in 1982. Aspects of orbital visual simulation are discussed, taking into account the earth scene, payload simulation, the generation and display of 1079 stars, the simulation of sun glare, and Reaction Control System jet firing plumes. Attention is also given to landing site visual simulation, and night launch and landing simulation.

  5. Development of simulation computer complex specification

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks which included, (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.

  6. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.

  7. Simulating Drosophila Genetics with the Computer.

    ERIC Educational Resources Information Center

    Small, James W., Jr.; Edwards, Kathryn L.

    1979-01-01

    Presents some techniques developed to help improve student understanding of Mendelian principles through the use of a computer simulation model by the genetic system of the fruit fly. Includes discussion and evaluation of this computer assisted program. (MA)

  8. Agent based simulations in disease modeling Comment on "Towards a unified approach in the modeling of fibrosis: A review with research perspectives" by Martine Ben Amar and Carlo Bianca

    NASA Astrophysics Data System (ADS)

    Pappalardo, Francesco; Pennisi, Marzio

    2016-07-01

    Fibrosis represents a process where an excessive tissue formation in an organ follows the failure of a physiological reparative or reactive process. Mathematical and computational techniques may be used to improve the understanding of the mechanisms that lead to the disease and to test potential new treatments that may directly or indirectly have positive effects against fibrosis [1]. In this scenario, Ben Amar and Bianca [2] give us a broad picture of the existing mathematical and computational tools that have been used to model fibrotic processes at the molecular, cellular, and tissue levels. Among such techniques, agent based models (ABM) can give a valuable contribution in the understanding and better management of fibrotic diseases.

  9. Monte Carlo Computer Simulation of a Rainbow.

    ERIC Educational Resources Information Center

    Olson, Donald; And Others

    1990-01-01

    Discusses making a computer-simulated rainbow using principles of physics, such as reflection and refraction. Provides BASIC program for the simulation. Appends a program illustrating the effects of dispersion of the colors. (YP)

  10. Using Model Replication to Improve the Reliability of Agent-Based Models

    NASA Astrophysics Data System (ADS)

    Zhong, Wei; Kim, Yushim

    The basic presupposition of model replication activities for a computational model such as an agent-based model (ABM) is that, as a robust and reliable tool, it must be replicable in other computing settings. This assumption has recently gained attention in the community of artificial society and simulation due to the challenges of model verification and validation. Illustrating the replication, by a different author and in NetLogo, of an ABM representing fraudulent behavior in a public service delivery system that was originally developed in the Java-based MASON toolkit, this paper exemplifies how model replication exercises provide unique opportunities for the model verification and validation process. At the same time, replication helps accumulate best practices and patterns of model replication and contributes to the agenda of developing a standard methodological protocol for agent-based social simulation.

  11. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  12. Study of photo-oxidative reactivity of sunscreening agents based on photo-oxidation of uric acid by kinetic Monte Carlo simulation.

    PubMed

    Moradmand Jalali, Hamed; Bashiri, Hadis; Rasa, Hossein

    2015-05-01

    In the present study, the mechanism of free radical production by light-reflective agents in sunscreens (TiO2, ZnO and ZrO2) was obtained by applying kinetic Monte Carlo simulation. The values of the rate constants for each step of the suggested mechanism have been obtained by simulation. The effect of the initial concentration of mineral oxides and uric acid on the rate of uric acid photo-oxidation by irradiation of some sun care agents has been studied. The kinetic Monte Carlo simulation results agree qualitatively with the existing experimental data for the production of free radicals by sun care agents.
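
    The kinetic Monte Carlo procedure itself can be sketched with a Gillespie-type loop over a toy reaction scheme (photo-excited oxide generates radicals, radicals oxidize uric acid, radicals decay). The species counts, the three reactions, and the rate constants below are invented placeholders, not the constants fitted in the study; the sketch only shows how waiting times and reaction choices are sampled from the propensities.

      import math
      import random

      # Toy reaction scheme (all species counts and rate constants are invented):
      #   (0) oxide --light--> oxide + radical     radical generation
      #   (1) radical + uric acid -> oxidized      photo-oxidation
      #   (2) radical -> lost                      radical recombination/decay

      def propensities(state, k):
          return [k[0] * state["oxide"],
                  k[1] * state["radical"] * state["uric"],
                  k[2] * state["radical"]]

      def apply_reaction(state, r):
          if r == 0:
              state["radical"] += 1
          elif r == 1:
              state["radical"] -= 1
              state["uric"] -= 1
              state["oxidized"] += 1
          else:
              state["radical"] -= 1

      def gillespie(state, k, t_end, rng):
          t = 0.0
          while t < t_end and state["uric"] > 0:
              a = propensities(state, k)
              a_total = sum(a)
              if a_total == 0.0:
                  break
              t += -math.log(1.0 - rng.random()) / a_total     # exponential waiting time
              pick, running = rng.random() * a_total, 0.0
              for r, a_r in enumerate(a):                      # pick reaction r with prob a_r / a_total
                  running += a_r
                  if pick < running:
                      apply_reaction(state, r)
                      break
          return t

      rng = random.Random(11)
      state = {"oxide": 500, "radical": 0, "uric": 1000, "oxidized": 0}
      k = [0.002, 0.0005, 0.05]                                # illustrative rate constants
      t = gillespie(state, k, t_end=50.0, rng=rng)
      print("time reached:", round(t, 2), "| uric acid remaining:", state["uric"])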

  13. Agent-based forward analysis

    SciTech Connect

    Kerekes, Ryan A.; Jiao, Yu; Shankar, Mallikarjun; Potok, Thomas E.; Lusk, Rick M.

    2008-01-01

    We propose software agent-based "forward analysis" for efficient information retrieval in a network of sensing devices. In our approach, processing is pushed to the data at the edge of the network via intelligent software agents rather than pulling data to a central facility for processing. The agents are deployed with a specific query and perform varying levels of analysis of the data, communicating with each other and sending only relevant information back across the network. We demonstrate our concept in the context of face recognition using a wireless test bed comprised of PDA cell phones and laptops. We show that agent-based forward analysis can provide a significant increase in retrieval speed while decreasing bandwidth usage and information overload at the central facility.

  14. Computer Simulation of Colliding Galaxies

    NASA Video Gallery

    Simulation of the formation of the galaxy known as "The Mice." The simulation depicts the merger of two spiral galaxies, pausing and rotating at the stage resembling the Hubble Space Telescope Adva...

  15. Computer Simulation in Undergraduate Instruction: A Symposium.

    ERIC Educational Resources Information Center

    Street, Warren R.; And Others

    These symposium papers discuss the instructional use of computers in psychology, with emphasis on computer-produced simulations. The first, by Rich Edwards, briefly outlines LABSIM, a general purpose system of FORTRAN programs which simulate data collection in more than a dozen experimental models in psychology and are designed to train students…

  16. Computationally Lightweight Air-Traffic-Control Simulation

    NASA Technical Reports Server (NTRS)

    Knight, Russell

    2005-01-01

    An algorithm for computationally lightweight simulation of automated air traffic control (ATC) at a busy airport has been derived. The algorithm is expected to serve as the basis for development of software that would be incorporated into flight-simulator software, the ATC component of which is not yet capable of handling realistic airport loads. Software based on this algorithm could also be incorporated into other computer programs that simulate a variety of scenarios for purposes of training or amusement.

  17. Computer Simulation of the Neuronal Action Potential.

    ERIC Educational Resources Information Center

    Solomon, Paul R.; And Others

    1988-01-01

    A series of computer simulations of the neuronal resting and action potentials are described. Discusses the use of simulations to overcome the difficulties of traditional instruction, such as blackboard illustration, which can only illustrate these events at one point in time. Describes systems requirements necessary to run the simulations.…

  18. Computer Clinical Simulations in Health Sciences.

    ERIC Educational Resources Information Center

    Jones, Gary L; Keith, Kenneth D.

    1983-01-01

    Discusses the key characteristics of clinical simulation, some developmental foundations, two current research studies, and some implications for the future of health science education. Investigations of the effects of computer-based simulation indicate that acquisition of decision-making skills is greater than with noncomputerized simulations.…

  19. FEL Simulation Using Distributed Computing

    SciTech Connect

    Einstein, Joshua; Bernabeu Altayo, Gerard; Biedron, Sandra; Freund, Henry; Milton, Stephen; van der Slot, Peter

    2016-06-01

    While simulation tools are available and have been used regularly for simulating light sources, the increasing availability and lower cost of GPU-based processing open up new opportunities. This poster highlights a method of accelerating and parallelizing code processing through the use of COTS software interfaces.

  20. Filtration theory using computer simulations

    SciTech Connect

    Bergman, W.; Corey, I.

    1997-08-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one 2-D and two 3-D fiber matrix elements. 5 refs., 11 figs.
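
    As a rough illustration of the workflow described above (velocity field first, then Langevin particle tracking superimposed on it), the sketch below uses an analytic potential-flow approximation around a single fiber instead of a CFD-computed field, and an overdamped Langevin step (advection plus Brownian diffusion). All constants and the geometry are illustrative assumptions, not the parameters of the cited study.

      # Sketch: count particles captured by one fiber while being advected by a
      # background flow and kicked by Brownian diffusion.
      import numpy as np

      rng = np.random.default_rng(1)
      U0, fiber_r, part_r, D, dt = 1.0, 0.05, 0.005, 1e-4, 1e-3

      def velocity(pos):
          """Crude potential-flow approximation past a cylinder at the origin."""
          x, y = pos
          r2 = x * x + y * y
          if r2 < fiber_r**2:
              return np.zeros(2)
          a2 = fiber_r**2
          return np.array([U0 * (1 - a2 * (x * x - y * y) / r2**2),
                           -U0 * (2 * a2 * x * y / r2**2)])

      captured, n_particles = 0, 500
      for _ in range(n_particles):
          pos = np.array([-0.5, rng.uniform(-0.2, 0.2)])       # released upstream
          for _ in range(2000):
              pos = pos + velocity(pos) * dt + np.sqrt(2 * D * dt) * rng.normal(size=2)
              if np.linalg.norm(pos) <= fiber_r + part_r:      # interception / diffusion capture
                  captured += 1
                  break
              if pos[0] > 0.5:                                 # escaped downstream
                  break
      print("capture efficiency:", captured / n_particles)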

  1. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351

  2. Evaluation of Visual Computer Simulator for Computer Architecture Education

    ERIC Educational Resources Information Center

    Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio

    2013-01-01

    This paper presents trial evaluation of a visual computer simulator in 2009-2011, which has been developed to play some roles of both instruction facility and learning tool simultaneously. And it illustrates an example of Computer Architecture education for University students and usage of e-Learning tool for Assembly Programming in order to…

  3. Fluctuation complexity of agent-based financial time series model by stochastic Potts system

    NASA Astrophysics Data System (ADS)

    Hong, Weijia; Wang, Jun

    2015-03-01

    The financial market is a complex evolved dynamic system with high volatility and noise, and the modeling and analysis of financial time series are regarded as rather challenging tasks in financial research. In this work, by applying the Potts dynamic system, a random agent-based financial time series model is developed in an attempt to uncover the empirical laws in finance, where the Potts model is introduced to imitate the trading interactions among the investing agents. Based on the computer simulation in conjunction with the statistical analysis and the nonlinear analysis, we present numerical research to investigate the fluctuation behaviors of the proposed time series model. Furthermore, in order to get a robust conclusion, we consider the daily returns of Shanghai Composite Index and Shenzhen Component Index, and a comparative analysis of return behaviors between the simulation data and the actual data is exhibited.
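
    A toy version of the construction described above can be sketched in a few lines: traders sit on a Potts lattice, their states are relaxed with Metropolis dynamics, and the return series is read off the change in the buy/sell imbalance. The lattice size, temperature, and the return mapping below are illustrative assumptions rather than the paper's calibration.

      # Toy Potts-spin market: q states per agent (buy/sell/hold), Metropolis
      # updates, returns taken as changes in the net buy-sell imbalance.
      import numpy as np

      rng = np.random.default_rng(2)
      L, q, beta, J, steps = 20, 3, 0.9, 1.0, 500
      spins = rng.integers(q, size=(L, L))

      def imbalance(s):
          """Net (buy - sell) fraction used as the aggregate market signal."""
          return (np.sum(s == 0) - np.sum(s == 1)) / s.size

      returns, prev = [], imbalance(spins)
      for _ in range(steps):
          for _ in range(L * L):                             # one Metropolis sweep
              i, j = rng.integers(L, size=2)
              new = rng.integers(q)
              nbrs = [spins[(i + 1) % L, j], spins[(i - 1) % L, j],
                      spins[i, (j + 1) % L], spins[i, (j - 1) % L]]
              dE = -J * (sum(n == new for n in nbrs) - sum(n == spins[i, j] for n in nbrs))
              if dE <= 0 or rng.random() < np.exp(-beta * dE):
                  spins[i, j] = new
          cur = imbalance(spins)
          returns.append(cur - prev)                         # log-return proxy
          prev = cur
      print("return std:", float(np.std(returns)))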

  4. Computational Spectrum of Agent Model Simulation

    SciTech Connect

    Perumalla, Kalyan S

    2010-01-01

    The study of human social behavioral systems is finding renewed interest in military, homeland security and other applications. Simulation is the most generally applied approach to studying complex scenarios in such systems. Here, we outline some of the important considerations that underlie the computational aspects of simulation-based study of human social systems. The fundamental imprecision underlying questions and answers in social science makes it necessary to carefully distinguish among different simulation problem classes and to identify the most pertinent set of computational dimensions associated with those classes. We identify a few such classes and present their computational implications. The focus is then shifted to the most challenging combinations in the computational spectrum, namely, large-scale entity counts at moderate to high levels of fidelity. Recent developments in furthering the state-of-the-art in these challenging cases are outlined. A case study of large-scale agent simulation is provided in simulating large numbers (millions) of social entities at real-time speeds on inexpensive hardware. Recent computational results are identified that highlight the potential of modern high-end computing platforms to push the envelope with respect to speed, scale and fidelity of social system simulations. Finally, the problem of shielding the modeler or domain expert from the complex computational aspects is discussed and a few potential solution approaches are identified.

  5. Addressing the translational dilemma: dynamic knowledge representation of inflammation using agent-based modeling.

    PubMed

    An, Gary; Christley, Scott

    2012-01-01

    Given the panoply of system-level diseases that result from disordered inflammation, such as sepsis, atherosclerosis, cancer, and autoimmune disorders, understanding and characterizing the inflammatory response is a key target of biomedical research. Untangling the complex behavioral configurations associated with a process as ubiquitous as inflammation represents a prototype of the translational dilemma: the ability to translate mechanistic knowledge into effective therapeutics. A critical failure point in the current research environment is a throughput bottleneck at the level of evaluating hypotheses of mechanistic causality; these hypotheses represent the key step toward the application of knowledge for therapy development and design. Addressing the translational dilemma will require utilizing the ever-increasing power of computers and computational modeling to increase the efficiency of the scientific method in the identification and evaluation of hypotheses of mechanistic causality. More specifically, development needs to focus on facilitating the ability of non-computer trained biomedical researchers to utilize and instantiate their knowledge in dynamic computational models. This is termed "dynamic knowledge representation." Agent-based modeling is an object-oriented, discrete-event, rule-based simulation method that is well suited for biomedical dynamic knowledge representation. Agent-based modeling has been used in the study of inflammation at multiple scales. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggest that this modeling framework is well suited for addressing the translational dilemma. This review describes agent-based modeling, gives examples of its applications in the study of inflammation, and introduces a proposed general expansion of the use of modeling and simulation to augment the generation and evaluation of knowledge
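
    To make the "object-oriented, discrete-event, rule-based" description above concrete, the following is a deliberately tiny, generic agent-based sketch (illustrative rules only, not any published inflammation model): mobile cell agents patrol a damage field, clear mild damage, and spread damage when activation is strong, so the same rule set can either resolve or amplify the insult depending on parameters.

      # Tiny generic rule-based agent model in the spirit described above.
      import random

      random.seed(3)
      GRID = 20
      damage = [[0.0] * GRID for _ in range(GRID)]           # tissue damage field
      damage[10][10] = 1.0                                   # initial insult
      cells = [{"x": random.randrange(GRID), "y": random.randrange(GRID)}
               for _ in range(40)]                           # inflammatory cell agents

      def step():
          for c in cells:
              c["x"] = (c["x"] + random.choice([-1, 0, 1])) % GRID   # random walk
              c["y"] = (c["y"] + random.choice([-1, 0, 1])) % GRID
              d = damage[c["x"]][c["y"]]
              if 0 < d < 0.5:
                  damage[c["x"]][c["y"]] = max(0.0, d - 0.2)         # resolve mild damage
              elif d >= 0.5:                                         # rule: strong activation
                  for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # bystander injury
                      nx, ny = (c["x"] + dx) % GRID, (c["y"] + dy) % GRID
                      damage[nx][ny] = min(1.0, damage[nx][ny] + 0.1)
                  damage[c["x"]][c["y"]] = max(0.0, d - 0.3)

      for t in range(100):
          step()
      print("total residual damage:", round(sum(map(sum, damage)), 2))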

  6. Computational Modeling and Simulation of Genital Tubercle Development

    EPA Science Inventory

    Hypospadias is a developmental defect of urethral tube closure that has a complex etiology. Here, we describe a multicellular agent-based model of genital tubercle development that simulates urethrogenesis from the urethral plate stage to urethral tube closure in differentiating ...

  7. Computer simulation of upset welding

    SciTech Connect

    Spingarn, J R; Mason, W E; Swearengen, J C

    1982-04-01

    Useful process modeling of upset welding requires contributions from metallurgy, welding engineering, thermal analysis and experimental mechanics. In this report, the significant milestones for such an effort are outlined and probable difficult areas are pointed out. Progress to date is summarized and directions for future research are offered. With regard to the computational aspects of this problem, a 2-D heat conduction computer code has been modified to incorporate electrical heating, and computations have been run for an axisymmetric problem with simple viscous material laws and d.c. electrical boundary conditions. In the experimental endeavor, the boundary conditions have been measured during the welding process, although interpretation of voltage drop measurements is not straightforward. The ranges of strain, strain rate and temperature encountered during upset welding have been measured or calculated, and the need for a unifying constitutive law is described. Finally, the possible complications of microstructure and interfaces are clarified.
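
    The computational piece described above (a 2-D heat-conduction code extended with electrical heating) can be illustrated with a minimal explicit finite-difference loop. The geometry, material constants, and the uniform volumetric source standing in for I^2*R heating at the weld interface are all assumptions for illustration, not the report's model.

      # Explicit FTCS update for 2-D heat conduction with a Joule heating source.
      import numpy as np

      nx = ny = 50
      dx = 1e-3                          # m
      k, rho, cp = 50.0, 7800.0, 500.0   # W/m-K, kg/m^3, J/kg-K (steel-like)
      alpha = k / (rho * cp)
      dt = 0.2 * dx**2 / alpha           # safely under the FTCS stability limit dx^2/(4*alpha)
      q_joule = 5e8                      # W/m^3, crude stand-in for resistive heating

      T = np.full((ny, nx), 300.0)       # K
      source = np.zeros_like(T)
      source[ny // 2 - 1: ny // 2 + 1, :] = q_joule   # heat deposited at the weld plane

      for step in range(2000):
          lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
                 np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
          T = T + dt * (alpha * lap + source / (rho * cp))
          T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 300.0   # fixed-temperature boundary
      print("peak temperature [K]:", round(float(T.max()), 1))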

  8. Teaching by Simulation with Personal Computers.

    ERIC Educational Resources Information Center

    Randall, James E.

    1978-01-01

    Describes the use of a small digital computer to simulate a peripheral nerve demonstration in which the action potential responses to pairs of stimuli are used to illustrate the properties of excitable membranes. (Author/MA)

  9. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  10. Computer Simulation of Community Mental Health Centers.

    ERIC Educational Resources Information Center

    Cox, Gary B.; And Others

    1985-01-01

    Describes an ongoing research project designed to develop a computer model capable of simulating the service delivery activities of community mental health care centers and human service agencies. The goal and methodology of the project are described. (NB)

  11. Computer Simulation of NMR Spectra.

    ERIC Educational Resources Information Center

    Ellison, A.

    1983-01-01

    Describes a PASCAL computer program which provides interactive analysis and display of high-resolution nuclear magnetic resonance (NMR) spectra from spin one-half nuclei using a hard-copy or monitor. Includes general and theoretical program descriptions, program capability, and examples of its use. (Source for program/documentation is included.)…

  12. Development of an Agent-Based Model (ABM) to Simulate the Immune System and Integration of a Regression Method to Estimate the Key ABM Parameters by Fitting the Experimental Data.

    PubMed

    Tong, Xuming; Chen, Jinghang; Miao, Hongyu; Li, Tingting; Zhang, Le

    2015-01-01

    Agent-based models (ABM) and differential equations (DE) are two commonly used methods for immune system simulation. However, it is difficult for ABM to estimate key parameters of the model by incorporating experimental data, whereas the differential equation model is incapable of describing the complicated immune system in detail. To overcome these problems, we developed an integrated ABM regression model (IABMR). It can combine the advantages of ABM and DE by employing ABM to mimic the multi-scale immune system with various phenotypes and types of cells as well as using the input and output of ABM to build up the Loess regression for key parameter estimation. Next, we employed the greedy algorithm to estimate the key parameters of the ABM with respect to the same experimental data set and used ABM to describe a 3D immune system similar to previous studies that employed the DE model. These results indicate that IABMR not only has the potential to simulate the immune system at various scales, phenotypes and cell types, but can also accurately infer the key parameters as the DE model does. Therefore, this study innovatively developed a complex system development mechanism that could simulate the complicated immune system in detail, as an ABM does, and validate the reliability and efficiency of the model by fitting the experimental data, as a DE model does.
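
    The overall recipe described above (run the ABM over its inputs, smooth the input/output map with a Loess-type regression, then search that surrogate for the parameter that best matches data) can be sketched schematically. The toy birth/death ABM, the local-linear smoother, and the target value below are illustrative stand-ins, not the published IABMR.

      # Schematic ABM + local-regression surrogate + greedy parameter search.
      import numpy as np

      rng = np.random.default_rng(4)

      def toy_abm(p, n_agents=200, steps=50):
          """Stochastic process: each agent divides with probability p per step."""
          n = n_agents
          for _ in range(steps):
              n += rng.binomial(n, p) - rng.binomial(n, 0.02)   # division minus death
          return n

      grid = np.linspace(0.01, 0.08, 30)
      outputs = np.array([np.mean([toy_abm(p) for _ in range(5)]) for p in grid])

      def loess_like(x0, x, y, span=0.02):
          """Local weighted linear fit at x0 (tricube weights), a Loess-like smoother."""
          w = np.clip(1 - (np.abs(x - x0) / span) ** 3, 0, None) ** 3
          sw = np.sqrt(w)
          A = np.vstack([np.ones_like(x), x - x0]).T
          coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
          return coef[0]

      target = 1500.0                                          # "experimental" observation
      candidates = np.linspace(0.01, 0.08, 200)
      best = min(candidates, key=lambda p: (loess_like(p, grid, outputs) - target) ** 2)
      print("estimated division probability:", round(float(best), 4))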

  13. Development of an Agent-Based Model (ABM) to Simulate the Immune System and Integration of a Regression Method to Estimate the Key ABM Parameters by Fitting the Experimental Data

    PubMed Central

    Tong, Xuming; Chen, Jinghang; Miao, Hongyu; Li, Tingting; Zhang, Le

    2015-01-01

    Agent-based models (ABM) and differential equations (DE) are two commonly used methods for immune system simulation. However, it is difficult for ABM to estimate key parameters of the model by incorporating experimental data, whereas the differential equation model is incapable of describing the complicated immune system in detail. To overcome these problems, we developed an integrated ABM regression model (IABMR). It can combine the advantages of ABM and DE by employing ABM to mimic the multi-scale immune system with various phenotypes and types of cells as well as using the input and output of ABM to build up the Loess regression for key parameter estimation. Next, we employed the greedy algorithm to estimate the key parameters of the ABM with respect to the same experimental data set and used ABM to describe a 3D immune system similar to previous studies that employed the DE model. These results indicate that IABMR not only has the potential to simulate the immune system at various scales, phenotypes and cell types, but can also accurately infer the key parameters as the DE model does. Therefore, this study innovatively developed a complex system development mechanism that could simulate the complicated immune system in detail, as an ABM does, and validate the reliability and efficiency of the model by fitting the experimental data, as a DE model does. PMID:26535589

  14. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  15. Computer-simulated phacoemulsification improvements

    NASA Astrophysics Data System (ADS)

    Soederberg, Per G.; Laurell, Carl-Gustaf; Artzen, D.; Nordh, Leif; Skarman, Eva; Nordqvist, P.; Andersson, Mats

    2002-06-01

    A simulator for phacoemulsification cataract extraction is developed. A three-dimensional visual interface and foot pedals for phacoemulsification power, x-y positioning, zoom and focus were established. An algorithm that allows real time visual feedback of the surgical field was developed. Cataract surgery is the most common surgical procedure. The operation requires input from both feet and both hands and provides visual feedback through the operation microscope essentially without tactile feedback. Experience demonstrates that the number of complications for an experienced surgeon learning phacoemulsification, decreases exponentially, reaching close to the asymptote after the first 500 procedures despite initial wet lab training on animal eyes. Simulator training is anticipated to decrease training time, decrease complication rate for the beginner and reduce expensive supervision by a high volume surgeon.

  16. [Animal experimentation, computer simulation and surgical research].

    PubMed

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  17. Criterion Standards for Evaluating Computer Simulation Courseware.

    ERIC Educational Resources Information Center

    Wholeben, Brent Edward

    This paper explores the role of computerized simulations as a decision-modeling intervention strategy, and views the strategy's different attribute biases based upon the varying primary missions of instruction versus application. The common goals associated with computer simulations as a training technique are discussed and compared with goals of…

  18. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  19. Salesperson Ethics: An Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  20. Computer Simulation Of A Small Turboshaft Engine

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.

    1991-01-01

    Component-type mathematical model of small turboshaft engine developed for use in real-time computer simulations of dynamics of helicopter flight. Yields shaft speeds, torques, fuel-consumption rates, and other operating parameters with sufficient accuracy for use in real-time simulation of maneuvers involving large transients in power and/or severe accelerations.

  1. Computer simulation of gear tooth manufacturing processes

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  2. Deterministic Agent-Based Path Optimization by Mimicking the Spreading of Ripples.

    PubMed

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Di Paolo, Ezequiel A; Liu, Hao

    2016-01-01

    Inspirations from nature have contributed fundamentally to the development of evolutionary computation. Learning from the natural ripple-spreading phenomenon, this article proposes a novel ripple-spreading algorithm (RSA) for the path optimization problem (POP). In nature, a ripple spreads at a constant speed in all directions, and the node closest to the source is the first to be reached. This very simple principle forms the foundation of the proposed RSA. In contrast to most deterministic top-down centralized path optimization methods, such as Dijkstra's algorithm, the RSA is a bottom-up decentralized agent-based simulation model. Moreover, it is distinguished from other agent-based algorithms, such as genetic algorithms and ant colony optimization, by being a deterministic method that can always guarantee the global optimal solution with very good scalability. Here, the RSA is specifically applied to four different POPs. The comparative simulation results illustrate the advantages of the RSA in terms of effectiveness and efficiency. Thanks to the agent-based and deterministic features, the RSA opens new opportunities to attack some problems, such as calculating the exact complete Pareto front in multiobjective optimization and determining the kth shortest project time in project management, which are very difficult, if not impossible, for existing methods to resolve. The ripple-spreading optimization principle and the new distinguishing features and capacities of the RSA enrich the theoretical foundations of evolutionary computation.
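
    The ripple-spreading principle above lends itself to a very small time-stepped sketch: every ripple grows at the same constant speed, a node spawns its own ripple the first time any ripple reaches it, and the first arrival at the destination traces the optimal path. The graph, step size, and speed below are hypothetical values for illustration, not the paper's benchmark problems.

      # Time-stepped ripple-spreading sketch on a small weighted graph.
      edges = {                          # undirected weighted graph (illustrative)
          "A": {"B": 4, "C": 2},
          "B": {"A": 4, "C": 1, "D": 5},
          "C": {"A": 2, "B": 1, "D": 8},
          "D": {"B": 5, "C": 8},
      }
      source, dest, speed, dt = "A", "D", 1.0, 0.01

      radius = {source: 0.0}             # active ripples: node -> current radius
      pred = {source: None}
      while dest not in pred:
          for node in list(radius):
              radius[node] += speed * dt                     # all ripples grow together
              for nbr, length in edges[node].items():
                  if nbr not in pred and radius[node] >= length:
                      pred[nbr] = node                       # first ripple to reach nbr wins
                      radius[nbr] = radius[node] - length    # new ripple, already started

      path, node = [], dest
      while node is not None:
          path.append(node)
          node = pred[node]
      print("shortest path:", " -> ".join(reversed(path)))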

  3. Cluster computing software for GATE simulations.

    PubMed

    De Beenhouwer, Jan; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R

    2007-06-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.
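
    The splitting strategy described above amounts to generating one fully resolved macro per job plus a launcher. The sketch below shows the general shape of such a generator; the macro commands, file names, and the plain shell launcher are placeholders, not guaranteed GATE syntax or the authors' cluster submit-file format.

      # Split a long simulation into N independent jobs: one macro per job,
      # distinct random seed, fraction of the total primaries, plus a launcher.
      from pathlib import Path

      N_JOBS, TOTAL_PRIMARIES = 8, 8_000_000
      outdir = Path("cluster_jobs")
      outdir.mkdir(exist_ok=True)

      launcher = ["#!/bin/sh"]
      for job in range(N_JOBS):
          macro = outdir / f"job_{job}.mac"
          macro.write_text(
              "# auto-generated, fully resolved macro (placeholder commands)\n"
              f"/control/setSeed {1000 + job}\n"
              f"/app/setNumberOfPrimaries {TOTAL_PRIMARIES // N_JOBS}\n"
              f"/output/setFileName result_{job}\n"
          )
          launcher.append(f"Gate {macro} > {outdir}/job_{job}.log 2>&1 &")
      launcher.append("wait")
      (outdir / "launch_all.sh").write_text("\n".join(launcher) + "\n")
      print(f"wrote {N_JOBS} macros and launch_all.sh to {outdir}/")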

  4. Polymer Composites Corrosive Degradation: A Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2007-01-01

    A computational simulation of polymer composites corrosive durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate to the last one or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.
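
    The assumed through-thickness profiles can be written out explicitly. The parabolic and linear forms follow the description above; the particular normalization (z measured from the unexposed face, with exposed-surface values v_s, T_s, M_s and unexposed-face values v_0, T_0, M_0) is an illustrative assumption.

      % Illustrative through-thickness degradation profiles, 0 <= z <= t:
      \begin{align*}
        v(z) &= v_0 + (v_s - v_0)\left(\tfrac{z}{t}\right)^{2} && \text{(voids: parabolic)}\\
        T(z) &= T_0 + (T_s - T_0)\,\tfrac{z}{t}                && \text{(temperature: linear)}\\
        M(z) &= M_0 + (M_s - M_0)\,\tfrac{z}{t}                && \text{(moisture: linear)}
      \end{align*}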

  5. Computer Code for Nanostructure Simulation

    NASA Technical Reports Server (NTRS)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  6. Computer simulation of bubble formation.

    SciTech Connect

    Insepov, Z.; Bazhirov, T.; Norman, G.; Stegailov, V.; Mathematics and Computer Science; Institute for High Energy Densities of Joint Institute for High Temperatures of RAS

    2007-01-01

    Properties of liquid metals (Li, Pb, Na) containing nanoscale cavities were studied by atomistic Molecular Dynamics (MD). Two atomistic models of cavity simulation were developed that cover a wide area in the phase diagram with negative pressure. In the first model, the thermodynamics of cavity formation, stability and the dynamics of cavity evolution in bulk liquid metals have been studied. Radial densities, pressures, surface tensions, and work functions of nano-scale cavities of various radii were calculated for liquid Li, Na, and Pb at various temperatures and densities, and at small negative pressures near the liquid-gas spinodal, and the work functions for cavity formation in liquid Li were calculated and compared with the available experimental data. The cavitation rate can further be obtained by using the classical nucleation theory (CNT). The second model is based on the stability study and on the kinetics of cavitation of the stretched liquid metals. An MD method was used to simulate cavitation in metastable Pb and Li melts and determine the stability limits. States at temperatures below critical (T < 0.5Tc) and large negative pressures were considered. The kinetic boundary of liquid phase stability was shown to be different from the spinodal. The kinetics and dynamics of cavitation were studied. The pressure dependences of cavitation frequencies were obtained for several temperatures. The results of MD calculations were compared with estimates based on classical nucleation theory.
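
    For reference, the classical nucleation theory (CNT) quantities mentioned above take their standard form for a liquid under tension Delta P = |P|, with surface tension sigma; the kinetic prefactor J_0 is left unspecified here.

      % Standard CNT expressions for cavitation under tension:
      \begin{align*}
        r^{*} &= \frac{2\sigma}{\Delta P}, &
        W^{*} &= \frac{16\pi \sigma^{3}}{3\,\Delta P^{2}}, &
        J &= J_{0}\exp\!\left(-\frac{W^{*}}{k_{B}T}\right),
      \end{align*}
      % where r* is the critical cavity radius, W* the work of forming it, and J
      % the nucleation (cavitation) rate per unit volume and time.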

  7. Creating science simulations through Computational Thinking Patterns

    NASA Astrophysics Data System (ADS)

    Basawapatna, Ashok Ram

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that allows students with little or no prior experience the ability to create simulations based on phenomena they see in-class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high-level descriptions with little guidance shows promising results. These initial results indicate that the high level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.

  8. Computer simulation of space station computer steered high gain antenna

    NASA Technical Reports Server (NTRS)

    Beach, S. W.

    1973-01-01

    The mathematical modeling and programming of a complete simulation program for a space station computer-steered high gain antenna are described. The program provides for reading input data cards, numerically integrating up to 50 first order differential equations, and monitoring up to 48 variables on printed output and on plots. The program system consists of a high gain antenna, an antenna gimbal control system, an on board computer, and the environment in which all are to operate.

  9. COMPARISON OF CLASSIFICATION STRATEGIES BY COMPUTER SIMULATION METHODS.

    DTIC Science & Technology

    (*NAVAL TRAINING, COMPUTER PROGRAMMING), (*NAVAL PERSONNEL, CLASSIFICATION), SELECTION, SIMULATION, CORRELATION TECHNIQUES, PROBABILITY, COSTS, OPTIMIZATION, PERSONNEL MANAGEMENT, DECISION THEORY, COMPUTERS

  10. Flow simulation and high performance computing

    NASA Astrophysics Data System (ADS)

    Tezduyar, T.; Aliabadi, S.; Behr, M.; Johnson, A.; Kalro, V.; Litke, M.

    1996-10-01

    Flow simulation is a computational tool for exploring science and technology involving flow applications. It can provide cost-effective alternatives or complements to laboratory experiments, field tests and prototyping. Flow simulation relies heavily on high performance computing (HPC). We view HPC as having two major components. One is advanced algorithms capable of accurately simulating complex, real-world problems. The other is advanced computer hardware and networking with sufficient power, memory and bandwidth to execute those simulations. While HPC enables flow simulation, flow simulation motivates development of novel HPC techniques. This paper focuses on demonstrating that flow simulation has come a long way and is being applied to many complex, real-world problems in different fields of engineering and applied sciences, particularly in aerospace engineering and applied fluid mechanics. Flow simulation has come a long way because HPC has come a long way. This paper also provides a brief review of some of the recently-developed HPC methods and tools that have played a major role in bringing flow simulation to where it is today. A number of 3D flow simulations are presented in this paper as examples of the level of computational capability reached with recent HPC methods and hardware. These examples are: flow around a fighter aircraft, flow around two trains passing in a tunnel, large ram-air parachutes, flow over hydraulic structures, contaminant dispersion in a model subway station, airflow past an automobile, multiple spheres falling in a liquid-filled tube, and dynamics of a paratrooper jumping from a cargo aircraft.

  11. Numerical characteristics of quantum computer simulation

    NASA Astrophysics Data System (ADS)

    Chernyavskiy, A.; Khamitov, K.; Teplov, A.; Voevodin, V.; Voevodin, Vl.

    2016-12-01

    The simulation of quantum circuits is significantly important for the implementation of quantum information technologies. The main difficulty of such modeling is the exponential growth of dimensionality; thus, the use of modern high-performance parallel computation is relevant. As is well known, arbitrary quantum computation in the circuit model can be performed with only single- and two-qubit gates, and we analyze the computational structure and properties of the simulation of such gates. We investigate the fact that the unique properties of quantum nature lead to the computational properties of the considered algorithms: quantum parallelism makes the simulation of quantum gates highly parallel, and on the other hand, quantum entanglement leads to the problem of computational locality during simulation. We use the methodology of the AlgoWiki project (algowiki-project.org) to analyze the algorithm. This methodology consists of theoretical (sequential and parallel complexity, macro structure, and visual informational graph) and experimental (locality and memory access, scalability and more specific dynamic characteristics) parts. The experimental part was carried out on the petascale Lomonosov supercomputer (Moscow State University, Russia). We show that the simulation of quantum gates is a good basis for research and testing of development methods for data-intensive parallel software, and the considered analysis methodology can be successfully used for the improvement of the algorithms in quantum information science.
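
    The single- and two-qubit gate structure discussed above is easy to sketch with a state vector: an n-qubit state holds 2^n amplitudes, and even a one-qubit gate touches all of them, which is exactly the memory and locality pressure the record describes. The helper names and the small demo circuit below are illustrative, not the authors' code.

      # State-vector simulation of single- and two-qubit gates via tensor reshaping.
      import numpy as np

      def apply_1q(state, gate, target, n):
          """Apply a 2x2 gate to qubit `target` of an n-qubit state vector."""
          psi = state.reshape([2] * n)
          psi = np.tensordot(gate, psi, axes=([1], [target]))   # contract target axis
          psi = np.moveaxis(psi, 0, target)                     # restore axis order
          return psi.reshape(-1)

      def apply_2q(state, gate, t1, t2, n):
          """Apply a 4x4 gate to qubits (t1, t2)."""
          psi = state.reshape([2] * n)
          g = gate.reshape(2, 2, 2, 2)
          psi = np.tensordot(g, psi, axes=([2, 3], [t1, t2]))
          psi = np.moveaxis(psi, [0, 1], [t1, t2])
          return psi.reshape(-1)

      n = 10
      state = np.zeros(2**n, dtype=complex)
      state[0] = 1.0                                            # |00...0>
      H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
      CNOT = np.eye(4, dtype=complex)[[0, 1, 3, 2]]             # swaps |10> and |11>
      state = apply_1q(state, H, 0, n)                          # Hadamard on qubit 0
      state = apply_2q(state, CNOT, 0, 1, n)                    # entangle qubits 0 and 1
      print("nonzero amplitudes at indices:", np.flatnonzero(np.abs(state) > 1e-12))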

  12. Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Virtual National Airspace Simulation (VNAS) will improve the safety of Air Transportation. In 2001, using simulation and information management software running over a distributed network of supercomputers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation, supporting the development of strategies for improving aviation safety and identifying precursors to component failure.

  13. Research in computer simulation of integrated circuits

    NASA Astrophysics Data System (ADS)

    Newton, A. R.; Pederson, D. O.

    1983-07-01

    The performance of the new LSI simulator CLASSIE is evaluated on several circuits with a few hundred to over one thousand semiconductor devices. A more accurate run time prediction formula has been found to be appropriate for circuit simulators. The design decisions for optimal performance under the constraints of the hardware (CRAY-1) are presented. Vector computers have an increased potential for fast, accurate simulation at the transistor level of Large-Scale-Integrated Circuits. Design considerations for a new circuit simulator are developed based on the specifics of the vector computer architecture and of LSI circuits. The simulation of Large-Scale-Integrated (LSI) circuits requires very long run times with conventional circuit analysis programs such as SPICE2 on super-minicomputers. A new simulator for LSI circuits, CLASSIE, which takes advantage of circuit hierarchy and repetitiveness, and array processors capable of high-speed floating-point computation are a promising combination. While a large number of powerful design verification tools have been developed for IC design at the transistor and logic gate levels, there are very few silicon-oriented tools for architectural design and evaluation.

  14. Architectural considerations for agent-based national scale policy models : LDRD final report.

    SciTech Connect

    Backus, George A.; Strip, David R.

    2007-09-01

    The need to anticipate the consequences of policy decisions becomes ever more important as the magnitude of the potential consequences grows. The multiplicity of connections between the components of society and the economy makes intuitive assessments extremely unreliable. Agent-based modeling has the potential to be a powerful tool in modeling policy impacts. The direct mapping between agents and elements of society and the economy simplifies the mapping of real-world functions into the world of computational assessment. Our modeling initiative is motivated by the desire to facilitate informed public debate on alternative policies for how we, as a nation, provide healthcare to our population. We explore the implications of this motivation for the design and implementation of a model. We discuss the choice of an agent-based modeling approach and contrast it with micro-simulation and systems dynamics approaches.

  15. An agent-based model of collective emotions in online communities

    NASA Astrophysics Data System (ADS)

    Schweitzer, F.; Garcia, D.

    2010-10-01

    We develop an agent-based framework to model the emergence of collective emotions, which is applied to online communities. Agents' individual emotions are described by their valence and arousal. Using the concept of Brownian agents, these variables change according to a stochastic dynamics, which also considers the feedback from online communication. Agents generate emotional information, which is stored and distributed in a field modeling the online medium. This field affects the emotional states of agents in a non-linear manner. We derive conditions for the emergence of collective emotions, observable in a bimodal valence distribution. Depending on a saturated or a superlinear feedback between the information field and the agents' arousal, we further identify scenarios where collective emotions only appear once or in a repeated manner. The analytical results are illustrated by agent-based computer simulations. Our framework provides testable hypotheses about the emergence of collective emotions, which can be verified by data from online communities.
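
    A minimal Euler-Maruyama sketch conveys the kind of dynamics described above: each agent's valence and arousal relax stochastically, a shared information field built from expressed emotions feeds back on both, and high arousal triggers expression. The functional forms and constants below are assumptions for illustration, not the published model equations.

      # Brownian agents with valence v and arousal a coupled to an information field.
      import numpy as np

      rng = np.random.default_rng(5)
      N, T, dt = 200, 2000, 0.01
      gv, ga, b, s, decay, thr = 0.5, 0.9, 1.5, 0.3, 0.2, 1.0
      v = rng.normal(0, 0.1, N)          # valence (pleasure/displeasure)
      a = rng.normal(0, 0.1, N)          # arousal
      h_pos = h_neg = 0.0                # field: stored positive/negative posts

      for _ in range(T):
          field = h_pos - h_neg
          # stochastic dynamics: relaxation + nonlinear field feedback + noise
          v += (-gv * v + b * field * (1 - v**2)) * dt + s * np.sqrt(dt) * rng.normal(size=N)
          a += (-ga * a + b * abs(field)) * dt + s * np.sqrt(dt) * rng.normal(size=N)
          expressing = a > thr           # high arousal -> agent posts its emotion
          h_pos += np.sum(expressing & (v > 0)) * dt
          h_neg += np.sum(expressing & (v < 0)) * dt
          h_pos *= (1 - decay * dt)      # the online medium forgets old posts
          h_neg *= (1 - decay * dt)
          a[expressing] = 0.0            # arousal resets after expression

      print("final valence mean/std:", round(float(v.mean()), 3), round(float(v.std()), 3))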

  16. Computational Modeling and Simulation of Genital Tubercle Development

    EPA Pesticide Factsheets

    Hypospadias is a developmental defect of urethral tube closure that has a complex etiology involving genetic and environmental factors, including anti-androgenic and estrogenic disrupting chemicals; however, little is known about the morphoregulatory consequences of androgen/estrogen balance during genital tubercle (GT) development. Computer models that predictively model sexual dimorphism of the GT may provide a useful resource to translate chemical-target bipartite networks and their developmental consequences across the human-relevant chemical universe. Here, we describe a multicellular agent-based model of genital tubercle (GT) development that simulates urethrogenesis from the sexually-indifferent urethral plate stage to urethral tube closure. The prototype model, constructed in CompuCell3D, recapitulates key aspects of GT morphogenesis controlled by SHH, FGF10, and androgen pathways through modulation of stochastic cell behaviors, including differential adhesion, motility, proliferation, and apoptosis. Proper urethral tube closure in the model was shown to depend quantitatively on SHH- and FGF10-induced effects on mesenchymal proliferation and epithelial apoptosis, both ultimately linked to androgen signaling. In the absence of androgen, GT development was feminized and with partial androgen deficiency, the model resolved with incomplete urethral tube closure, thereby providing an in silico platform for probabilistic prediction of hypospadias risk across c ...

  17. Computer Series, 108. Computer Simulation of Chemical Equilibrium.

    ERIC Educational Resources Information Center

    Cullen, John F., Jr.

    1989-01-01

    Presented is a computer simulation called "The Great Chemical Bead Game" which can be used to teach the concepts of equilibrium and kinetics to introductory chemistry students more clearly than through an experiment. Discussed are the rules of the game, the application of rate laws and graphical analysis. (CW)

  18. Enabling Computational Technologies for Terascale Scientific Simulations

    SciTech Connect

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.

  19. Computer simulation of breathing systems for divers

    SciTech Connect

    Sexton, P.G.; Nuckols, M.L.

    1983-02-01

    A powerful new tool for the analysis and design of underwater breathing gas systems is being developed. A versatile computer simulator is described which makes possible the modular "construction" of any conceivable breathing gas system from computer memory-resident components. The analysis of a typical breathing gas system is demonstrated using this simulation technique, and the effects of system modifications on performance of the breathing system are shown. This modeling technique will ultimately serve as the foundation for a proposed breathing system simulator under development by the Navy. The marriage of this computer modeling technique with an interactive graphics system will provide the designer with an efficient, cost-effective tool for the development of new and improved diving systems.

  20. Simulation methods for advanced scientific computing

    SciTech Connect

    Booth, T.E.; Carlson, J.A.; Forster, R.A.

    1998-11-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of the project was to create effective new algorithms for solving N-body problems by computer simulation. The authors concentrated on developing advanced classical and quantum Monte Carlo techniques. For simulations of phase transitions in classical systems, they produced a framework generalizing the famous Swendsen-Wang cluster algorithms for Ising and Potts models. For spin-glass-like problems, they demonstrated the effectiveness of an extension of the multicanonical method for the two-dimensional, random bond Ising model. For quantum mechanical systems, they generated a new method to compute the ground-state energy of systems of interacting electrons. They also improved methods to compute excited states when the diffusion quantum Monte Carlo method is used and to compute longer time dynamics when the stationary phase quantum Monte Carlo method is used.

  1. Structural Composites Corrosive Management by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    A simulation of corrosive management on polymer composites durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure, and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply managed degradation degrades the laminate to the last one or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

  2. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an icon-driven, Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend-based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.

  3. Computer simulation of the threshold sensitivity determinations

    NASA Technical Reports Server (NTRS)

    Gayle, J. B.

    1974-01-01

    A computer simulation study was carried out to evaluate various methods for determining threshold stimulus levels for impact sensitivity tests. In addition, the influence of a number of variables (initial stimulus level, particular stimulus response curve, and increment size) on the apparent threshold values and on the corresponding population response levels was determined. Finally, a critical review of previous assumptions regarding the stimulus response curve for impact testing is presented in the light of the simulation results.

  4. A computer management system for patient simulations.

    PubMed

    Finkelstein, M W; Johnson, L A; Lilly, G E

    1991-04-01

    A series of interactive videodisc patient simulations is being used to teach clinical problem-solving skills, including diagnosis and management, to dental students. This series is called Oral Disease Simulations for Diagnosis and Management (ODSDM). A computer management system has been developed in response to the following needs. First, the sequence in which students perform simulations is critical. Second, maintaining records of completed simulations and student performance on each simulation is a time-consuming task for faculty. Third, the simulations require ongoing evaluation to ensure high quality instruction. The primary objective of the management system is to ensure that each student masters diagnosis. Mastery must be obtained at a specific level before advancing to the next level. The management system does this by individualizing the sequence of the simulations to adapt to the needs of each student. The management system generates reports which provide information about students or the simulations. Student reports contain demographic and performance information. System reports include information about individual patient simulations and act as a quality control mechanism for the simulations.

  5. Perspective: Computer simulations of long time dynamics

    SciTech Connect

    Elber, Ron

    2016-02-14

    Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.

  6. Perspective: Computer simulations of long time dynamics

    PubMed Central

    Elber, Ron

    2016-01-01

    Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances. PMID:26874473

  7. Applications of Agent Based Approaches in Business (A Three Essay Dissertation)

    ERIC Educational Resources Information Center

    Prawesh, Shankar

    2013-01-01

    The goal of this dissertation is to investigate the enabling role that agent based simulation plays in business and policy. The aforementioned issue has been addressed in this dissertation through three distinct, but related essays. The first essay is a literature review of different research applications of agent based simulation in various…

  8. Simulating physical phenomena with a quantum computer

    NASA Astrophysics Data System (ADS)

    Ortiz, Gerardo

    2003-03-01

    In a keynote speech at MIT in 1981 Richard Feynman raised some provocative questions in connection with the exact simulation of physical systems using a special device named a "quantum computer" (QC). At the time it was known that deterministic simulations of quantum phenomena in classical computers required a number of resources that scaled exponentially with the number of degrees of freedom, and also that the probabilistic simulation of certain quantum problems was limited by the so-called sign or phase problem, a problem believed to be of exponential complexity. Such a QC was intended to mimic physical processes exactly as Nature does. Certainly, remarks coming from such an influential figure generated widespread interest in these ideas, and today after 21 years there are still some open questions. What kind of physical phenomena can be simulated with a QC? How? And what are its limitations? Addressing and attempting to answer these questions is what this talk is about. Definitively, the goal of physics simulation using controllable quantum systems ("physics imitation") is to exploit quantum laws to advantage, and thus accomplish efficient imitation. Fundamental is the connection between a quantum computational model and a physical system by transformations of operator algebras. This concept is a necessary one because in Quantum Mechanics each physical system is naturally associated with a language of operators and thus can be considered as a possible model of quantum computation. The remarkable result is that an arbitrary physical system is naturally simulatable by another physical system (or QC) whenever a "dictionary" between the two operator algebras exists. I will explain these concepts and address some of Feynman's concerns regarding the simulation of fermionic systems. Finally, I will illustrate the main ideas by imitating simple physical phenomena borrowed from condensed matter physics using quantum algorithms, and present experimental

  9. Uncertainty and error in computational simulations

    SciTech Connect

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

    The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.

  10. Simulation Concept - How to Exploit Tools for Computing Hybrids

    DTIC Science & Technology

    2009-07-01

    multiphysics design tools (Simulation of Biological Systems - SIMBIOSYS), provide an open source environment for biological simulation tools (Bio...). Acronyms from the record: SCHETCH - Simulation Concept - How to Exploit Tools for Computing (Project); SIMBIOSYS - Simulation of Biological Systems (Program); SPICE - Simulation...

  11. Assessing Moderator Variables: Two Computer Simulation Studies.

    ERIC Educational Resources Information Center

    Mason, Craig A.; And Others

    1996-01-01

    A strategy is proposed for conceptualizing moderating relationships based on their type (strictly correlational and classically correlational) and form, whether continuous, noncontinuous, logistic, or quantum. Results of computer simulations comparing three statistical approaches for assessing moderator variables are presented, and advantages of…

  12. Designing Online Scaffolds for Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Chen, Ching-Huei; Wu, I-Chia; Jen, Fen-Lan

    2013-01-01

    The purpose of this study was to examine the effectiveness of online scaffolds in computer simulation to facilitate students' science learning. We first introduced online scaffolds to assist and model students' science learning and to demonstrate how a system embedded with online scaffolds can be designed and implemented to help high school…

  13. Making Students Decide: The Vietnam Computer Simulation.

    ERIC Educational Resources Information Center

    O'Reilly, Kevin

    1994-01-01

    Contends that an important goal in history instruction is helping students understand the complexity of events. Describes the use of "Escalation," a commercially available computer simulation, in a high school U.S. history class. Includes excerpts from student journals kept during the activity. (CFR)

  14. Progress in Computational Simulation of Earthquakes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert

    2006-01-01

    GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. The Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations on workstations using a few tens of thousands of stress and displacement finite elements can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors.

  15. Factors Promoting Engaged Exploration with Computer Simulations

    ERIC Educational Resources Information Center

    Podolefsky, Noah S.; Perkins, Katherine K.; Adams, Wendy K.

    2010-01-01

    This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration, a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze…

  16. Macromod: Computer Simulation For Introductory Economics

    ERIC Educational Resources Information Center

    Ross, Thomas

    1977-01-01

    The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

  17. Computer Graphics Simulations of Sampling Distributions.

    ERIC Educational Resources Information Center

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…

  18. Quantitative computer simulations of extraterrestrial processing operations

    NASA Technical Reports Server (NTRS)

    Vincent, T. L.; Nikravesh, P. E.

    1989-01-01

    The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.

  19. Computer simulations of WIGWAM underwater experiment

    SciTech Connect

    Kamegai, Minao; White, J.W.

    1993-11-01

    We performed computer simulations of the WIGWAM underwater experiment with a 2-D hydro-code, CALE. First, we calculated the bubble pulse and the signal strength at the closest gauge in one-dimensional geometry. The calculation shows excellent agreement with the measured data. Next, we made two-dimensional simulations of WIGWAM applying the gravity over-pressure, and calculated the signals at three selected gauge locations where measurements were recorded. The computed peak pressures at those gauge locations come well within the 15% experimental error bars. The signal at the farthest gauge is of the order of 200 bars. This is significant, because at this pressure the CALE output can be linked to a hydro-acoustics computer program, NPE Code (Nonlinear Progressive Wave-equation Code), to analyze the long distance propagation of acoustical signals from the underwater explosions on a global scale.

  20. Computational algorithms for simulations in atmospheric optics.

    PubMed

    Konyaev, P A; Lukin, V P

    2016-04-20

    A computer simulation technique for atmospheric and adaptive optics based on parallel programming is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors.
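
    As a rough illustration of the spectral-phase idea underlying the paper, the sketch below filters complex white noise with a Kolmogorov-like power spectrum and inverse-transforms it to obtain a single random phase screen. The grid size, spacing, Fried parameter, and normalization convention are illustrative assumptions, and plain NumPy FFTs stand in for the Intel MKL/IPP and CUDA implementations reported by the authors.

        import numpy as np

        def phase_screen(n=256, dx=0.01, r0=0.1, seed=0):
            """One random turbulence phase screen via Fourier filtering of white noise.

            n: grid points per side, dx: grid spacing [m], r0: Fried parameter [m].
            """
            rng = np.random.default_rng(seed)
            df = 1.0 / (n * dx)                     # frequency-grid spacing [1/m]
            fx = np.fft.fftfreq(n, d=dx)
            fxx, fyy = np.meshgrid(fx, fx)
            f = np.hypot(fxx, fyy)
            f[0, 0] = df                            # avoid dividing by zero at DC
            psd = 0.023 * r0 ** (-5.0 / 3.0) * f ** (-11.0 / 3.0)   # Kolmogorov-like PSD
            psd[0, 0] = 0.0                         # drop the piston term
            noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
            screen = np.fft.ifft2(noise * np.sqrt(psd) * df) * n * n
            return screen.real

        print("phase screen rms [rad]:", round(float(phase_screen().std()), 2))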

  1. Simulating fermions on a quantum computer

    NASA Astrophysics Data System (ADS)

    Ortiz, G.; Gubernatis, J. E.; Knill, E.; Laflamme, R.

    2002-07-01

    The real-time probabilistic simulation of quantum systems in classical computers is known to be limited by the so-called dynamical sign problem, a problem leading to exponential complexity. In 1981 Richard Feynman raised some provocative questions in connection to the "exact imitation" of such systems using a special device named a "quantum computer". Feynman hesitated about the possibility of imitating fermion systems using such a device. Here we address some of his concerns and, in particular, investigate the simulation of fermionic systems. We show how quantum computers avoid the sign problem in some cases by reducing the complexity from exponential to polynomial. Our demonstration is based upon the use of isomorphisms of algebras. We present specific quantum algorithms that illustrate the main points of our algebraic approach.
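
    One concrete example of the operator-algebra isomorphisms referred to here is the Jordan-Wigner-type mapping from fermionic modes to qubit (Pauli) operators. The NumPy sketch below is only an illustration of that general idea, not a reproduction of the paper's algorithms: it builds the mapped annihilation operators explicitly for three modes and verifies the canonical anticommutation relations.

        import numpy as np

        I2 = np.eye(2, dtype=complex)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)

        def kron_all(ops):
            out = np.eye(1, dtype=complex)
            for op in ops:
                out = np.kron(out, op)
            return out

        def annihilation(j, n):
            """Jordan-Wigner image of the fermionic operator a_j on n qubits."""
            lowering = (X + 1j * Y) / 2      # sends |1> (occupied) to |0> (empty)
            return kron_all([Z] * j + [lowering] + [I2] * (n - j - 1))

        n = 3
        a = [annihilation(j, n) for j in range(n)]
        for i in range(n):
            for j in range(n):
                anti = a[i] @ a[j].conj().T + a[j].conj().T @ a[i]
                target = np.eye(2 ** n) if i == j else np.zeros((2 ** n, 2 ** n))
                assert np.allclose(anti, target)
        print("canonical anticommutation relations hold for the mapped operators")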

  2. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.; Halicioglu, M. T.

    1983-01-01

    Adequate computer methods, based on interactions between discrete particles, provide information leading to an atomic level understanding of various physical processes. The success of these simulation methods, however, is related to the accuracy of the potential energy function representing the interactions among the particles. The development of a potential energy function for crystalline SiO2 forms that can be employed in lengthy computer modelling procedures was investigated. In many of the simulation methods which deal with discrete particles, semiempirical two body potentials were employed to analyze energy and structure related properties of the system. Many body interactions are required for a proper representation of the total energy for many systems. Many body interactions for simulations based on discrete particles are discussed.

  3. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)

    2005-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  4. Cosmological Simulations on a Grid of Computers

    NASA Astrophysics Data System (ADS)

    Depardon, Benjamin; Caron, Eddy; Desprez, Frédéric; Blaizot, Jérémy; Courtois, Hélène

    2010-06-01

    The work presented in this paper aims at restricting the input parameter values of the semi-analytical model used in GALICS and MOMAF, so as to derive which parameters influence the results the most, e.g., star formation, feedback and halo recycling efficiencies, etc. Our approach is to proceed empirically: we run many simulations and derive the correct ranges of values. The computation time needed is so large that we need to run on a grid of computers. Hence, we model GALICS and MOMAF execution times and output file sizes, and run the simulations using a grid middleware: DIET. All the complexity of accessing resources, scheduling simulations and managing data is harnessed by DIET and hidden behind a web portal accessible to the users.

  5. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    2004-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  6. A Harris-Todaro Agent-Based Model to Rural-Urban Migration

    NASA Astrophysics Data System (ADS)

    Espíndola, Aquino L.; Silveira, Jaylson J.; Penna, T. J. P.

    2006-09-01

    The Harris-Todaro model of the rural-urban migration process is revisited under an agent-based approach. The migration of the workers is interpreted as a process of social learning by imitation, formalized by a computational model. By simulating this model, we observe transitional dynamics with continuous growth of the urban fraction of the overall population toward an equilibrium. Such an equilibrium is characterized by stabilization of the rural-urban expected-wage differential (the generalized Harris-Todaro equilibrium condition), urban concentration, and urban unemployment. These classic results obtained originally by Harris and Todaro are emergent properties of our model.
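
    A stripped-down imitation dynamic of this kind is sketched below. The wage expressions, parameter values, and the logistic imitation rule are illustrative assumptions rather than the authors' specification; the sketch only shows how an urban fraction, an expected-wage balance, and urban unemployment can emerge from agent-level imitation.

        import numpy as np

        rng = np.random.default_rng(1)

        N = 10_000                      # number of workers (assumed)
        urban = rng.random(N) < 0.2     # start with 20% of workers in the urban sector
        beta = 4.0                      # imitation (social learning) intensity, assumed
        urban_jobs = int(0.15 * N)      # fixed number of urban manufacturing jobs, assumed

        for step in range(300):
            n_u = max(int(urban.sum()), 1)
            # Harris-Todaro expected urban wage: urban wage times employment probability
            w_urban_expected = 1.5 * min(1.0, urban_jobs / n_u)
            # Toy rural wage that rises as the rural labour force shrinks
            w_rural = (N / max(N - n_u, 1)) ** 0.3
            # Each worker observes a random peer and imitates the better-paid sector
            peers = rng.integers(0, N, size=N)
            own_w = np.where(urban, w_urban_expected, w_rural)
            peer_w = np.where(urban[peers], w_urban_expected, w_rural)
            switch_prob = 1.0 / (1.0 + np.exp(-beta * (peer_w - own_w)))
            switch = (urban != urban[peers]) & (rng.random(N) < switch_prob)
            urban = np.where(switch, urban[peers], urban)

        n_u = max(int(urban.sum()), 1)
        print(f"equilibrium urban fraction: {urban.mean():.2f}")
        print(f"urban unemployment rate: {1.0 - min(1.0, urban_jobs / n_u):.2f}")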

  7. Agent-based modeling of host–pathogen systems: The successes and challenges

    PubMed Central

    Bauer, Amy L.; Beauchemin, Catherine A.A.; Perelson, Alan S.

    2009-01-01

    Agent-based models have been employed to describe numerous processes in immunology. Simulations based on these types of models have been used to enhance our understanding of immunology and disease pathology. We review various agent-based models relevant to host–pathogen systems and discuss their contributions to our understanding of biological processes. We then point out some limitations and challenges of agent-based models and encourage efforts towards reproducibility and model validation. PMID:20161146

  8. Computational Challenges in Nuclear Weapons Simulation

    SciTech Connect

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  9. Computer modeling and simulation of human movement.

    PubMed

    Pandy, M G

    2001-01-01

    Recent interest in using modeling and simulation to study movement is driven by the belief that this approach can provide insight into how the nervous system and muscles interact to produce coordinated motion of the body parts. With the computational resources available today, large-scale models of the body can be used to produce realistic simulations of movement that are an order of magnitude more complex than those produced just 10 years ago. This chapter reviews how the structure of the neuromusculoskeletal system is commonly represented in a multijoint model of movement, how modeling may be combined with optimization theory to simulate the dynamics of a motor task, and how model output can be analyzed to describe and explain muscle function. Some results obtained from simulations of jumping, pedaling, and walking are also reviewed to illustrate the approach.

  10. Computer simulations of learning in neural systems.

    PubMed

    Salu, Y

    1983-04-01

    Recent experiments have shown that, in some cases, strengths of synaptic ties are being modified in learning. However, it is not known what the rules that control those modifications are, especially what determines which synapses will be modified and which will remain unchanged during a learning episode. Two postulated rules that may solve that problem are introduced. To check their effectiveness, the rules are tested in many computer models that simulate learning in neural systems. The simulations demonstrate that, theoretically, the two postulated rules are effective in organizing the synaptic changes. If they are found to also exist in biological systems, these postulated rules may be an important element in the learning process.

  11. Weld fracture criteria for computer simulation

    NASA Technical Reports Server (NTRS)

    Jemian, Wartan A.

    1993-01-01

    Due to the complexity of welding, not all of the important factors are always properly considered and controlled. An automatic system is required. This report outlines a simulation method and all the important considerations for doing this. As in many situations where a defect or failure has occurred, it is frequently necessary to troubleshoot the system and eventually identify those factors that were neglected. This is expensive and time consuming. Very frequently the causes are materials-related and might have been anticipated. Computer simulation can automatically consider all important variables. The major goal of this presentation is to identify the proper relationship of design, processing, and materials variables to welding.

  12. Unsteady flow simulation on a parallel computer

    NASA Astrophysics Data System (ADS)

    Faden, M.; Pokorny, S.; Engel, K.

    For the simulation of the flow through compressor stages, an interactive flow simulation system is set up on an MIMD-type parallel computer. An explicit scheme is used in order to resolve the time-dependent interaction between the blades. The 2D Navier-Stokes equations are transformed into their general moving coordinates. The parallelization of the solver is based on the idea of domain decomposition. Results are presented for a problem of fixed size (4096 grid nodes for the Hakkinen case).

  13. Computer Simulation of the VASIMR Engine

    NASA Technical Reports Server (NTRS)

    Garrison, David

    2005-01-01

    The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamics simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.

  14. Computer Simulation for Emergency Incident Management

    SciTech Connect

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  15. Computational plasticity algorithm for particle dynamics simulations

    NASA Astrophysics Data System (ADS)

    Krabbenhoft, K.; Lyamin, A. V.; Vignes, C.

    2017-03-01

    The problem of particle dynamics simulation is interpreted in the framework of computational plasticity leading to an algorithm which is mathematically indistinguishable from the common implicit scheme widely used in the finite element analysis of elastoplastic boundary value problems. This algorithm provides somewhat of a unification of two particle methods, the discrete element method and the contact dynamics method, which usually are thought of as being quite disparate. In particular, it is shown that the former appears as the special case where the time stepping is explicit while the use of implicit time stepping leads to the kind of schemes usually labelled contact dynamics methods. The framing of particle dynamics simulation within computational plasticity paves the way for new approaches similar (or identical) to those frequently employed in nonlinear finite element analysis. These include mixed implicit-explicit time stepping, dynamic relaxation and domain decomposition schemes.

  16. Understanding Membrane Fouling Mechanisms through Computational Simulations

    NASA Astrophysics Data System (ADS)

    Xiang, Yuan

    This dissertation focuses on a computational simulation study on the organic fouling mechanisms of reverse osmosis and nanofiltration (RO/NF) membranes, which have been widely used in industry for water purification. The research shows that through establishing a realistic computational model based on available experimental data, we are able to develop a deep understanding of membrane fouling mechanisms. This knowledge is critical for providing a strategic plan for the membrane experimental community and the RO/NF industry for further improvements in membrane technology for water treatment. This dissertation focuses on three major research components: (1) Development of realistic molecular models which can represent the membrane surface properties well; (2) Investigation of the interactions between the membrane surface and foulants by steered molecular dynamics simulations, in order to determine the major factors that contribute to surface fouling; and (3) Studies of the interactions between the surface-modified (polyethylene glycol) membranes and foulants, to provide strategies for antifouling.

  17. Understanding membrane fouling mechanisms through computational simulations

    NASA Astrophysics Data System (ADS)

    Xiang, Yuan

    This dissertation focuses on a computational simulation study on the organic fouling mechanisms of reverse osmosis and nanofiltration (RO/NF) membranes, which have been widely used in industry for water purification. The research shows that through establishing a realistic computational model based on available experimental data, we are able to develop a deep understanding of membrane fouling mechanisms. This knowledge is critical for providing a strategic plan for the membrane experimental community and the RO/NF industry for further improvements in membrane technology for water treatment. This dissertation focuses on three major research components: (1) Development of realistic molecular models which can represent the membrane surface properties well; (2) Investigation of the interactions between the membrane surface and foulants by steered molecular dynamics simulations, in order to determine the major factors that contribute to surface fouling; and (3) Studies of the interactions between the surface-modified (polyethylene glycol) membranes and foulants, to provide strategies for antifouling.

  18. Computer simulation of the micropulse imaging lidar

    NASA Astrophysics Data System (ADS)

    Dai, Yongjiang; Zhao, Hongwei; Zhao, Yu; Wang, Xiaoou

    2000-10-01

    In this paper a design method for the Micro Pulse Lidar (MPL), namely a computer simulation of the MPL, is introduced. Some of the MPL parameters related to atmospheric scattering, and their effects on the performance of the lidar, are discussed. The design software for a lidar with a diode-pumped solid-state laser is programmed in MATLAB. The software consists of six modules: transmitter, atmosphere, target, receiver, processor, and display system. The method can be extended to other kinds of lidar.

  19. Computer simulation improves remedial cementing success

    SciTech Connect

    Kulakofsky, D.; Creel, P. )

    1992-11-01

    This paper reports that computer simulation has been used successfully to design remedial cement squeeze jobs and efficiently evaluate actual downhole performance and results. The program uses fluid properties, well parameters and wellbore configuration to estimate surface pressure at progressive stages of pumping operations. This new tool predicts surface pumping pressures in advance, allowing operators to effectively address changes that occur downhole during workover operations.

  20. Integrated computer simulation on FIR FEL dynamics

    SciTech Connect

    Furukawa, H.; Kuruma, S.; Imasaki, K.

    1995-12-31

    An integrated computer simulation code has been developed to analyze RF-Linac FEL dynamics. First, a simulation code for the electron-beam acceleration and transport processes in the RF-Linac (LUNA) has been developed to analyze the characteristics of the electron beam in the RF-Linac and to optimize its parameters. Second, a space-time dependent 3D FEL simulation code (Shipout) has been developed. Total RF-Linac FEL simulations have been performed by using the electron beam data from LUNA in Shipout. The number of particles used in an RF-Linac FEL total simulation is approximately 1000, and the CPU time for simulating one round trip is about 1.5 minutes. At ILT/ILE, Osaka, an 8.5 MeV RF-Linac with a photo-cathode RF gun is used for FEL oscillation experiments. Using a 2 cm wiggler, FEL oscillation at wavelengths of approximately 46 µm is investigated. Simulations with LUNA using the parameters of an ILT/ILE experiment estimate the pulse shape and energy spectra of the electron beam at the end of the linac; the pulse shape has a sharp rise and then decays slowly in time. The total RF-Linac FEL simulations with the same parameters estimate how the start-up of the FEL oscillations depends on the pulse shape of the electron beam at the end of the linac. Coherent spontaneous emission effects and a quick start-up of FEL oscillations have been observed in these total simulations.

  1. Accelerating Climate Simulations Through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) an identical MPI implementation is required in both systems; and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  2. Neural network computer simulation of medical aerosols.

    PubMed

    Richardson, C J; Barlow, D J

    1996-06-01

    Preliminary investigations have been conducted to assess the potential for using artificial neural networks to simulate aerosol behaviour, with a view to employing this type of methodology in the evaluation and design of pulmonary drug-delivery systems. Details are presented of the general purpose software developed for these tasks; it implements a feed-forward back-propagation algorithm with weight decay and connection pruning, the user having complete run-time control of the network architecture and mode of training. A series of exploratory investigations is then reported in which different network structures and training strategies are assessed in terms of their ability to simulate known patterns of fluid flow in simple model systems. The first of these involves simulations of cellular automata-generated data for fluid flow through a partially obstructed two-dimensional pipe. The artificial neural networks are shown to be highly successful in simulating the behaviour of this simple linear system, but with important provisos relating to the information content of the training data and the criteria used to judge when the network is properly trained. A second set of investigations is then reported in which similar networks are used to simulate patterns of fluid flow through aerosol generation devices, using training data furnished through rigorous computational fluid dynamics modelling. These more complex three-dimensional systems are modelled with equal success. It is concluded that carefully tailored, well trained networks could provide valuable tools not just for predicting but also for analysing the spatial dynamics of pharmaceutical aerosols.
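
    The kind of network described here (feed-forward, trained by back-propagation with weight decay) can be shown in miniature. The sketch below trains a one-hidden-layer network by plain gradient descent with an L2 weight-decay term on a toy target function; the architecture, data, learning rate, and decay coefficient are assumptions made for illustration, and the connection-pruning step of the original software is omitted.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy regression data standing in for flow-field training samples (assumed)
        X = rng.uniform(-1, 1, size=(200, 2))
        y = np.sin(np.pi * X[:, :1]) * np.cos(np.pi * X[:, 1:])

        n_hidden, lr, decay = 16, 0.05, 1e-4       # decay = weight-decay coefficient
        W1 = rng.normal(0, 0.5, (2, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

        for epoch in range(2000):
            h = np.tanh(X @ W1 + b1)               # forward pass
            out = h @ W2 + b2
            err = out - y
            # Back-propagate the squared-error gradient, adding the L2 penalty gradient
            gW2 = h.T @ err / len(X) + decay * W2
            gb2 = err.mean(axis=0)
            dh = (err @ W2.T) * (1 - h ** 2)
            gW1 = X.T @ dh / len(X) + decay * W1
            gb1 = dh.mean(axis=0)
            W1 -= lr * gW1; b1 -= lr * gb1
            W2 -= lr * gW2; b2 -= lr * gb2

        final = np.tanh(X @ W1 + b1) @ W2 + b2
        print("final training MSE:", float(np.mean((final - y) ** 2)))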

  3. A computer simulation of chromosomal instability

    NASA Astrophysics Data System (ADS)

    Goodwin, E.; Cornforth, M.

    The transformation of a normal cell into a cancerous growth can be described as a process of mutation and selection occurring within the context of clonal expansion. Radiation, in addition to initial DNA damage, induces a persistent and still poorly understood genomic instability process that contributes to the mutational burden. It will be essential to include a quantitative description of this phenomenon in any attempt at science-based risk assessment. Monte Carlo computer simulations are a relatively simple way to model processes that are characterized by an element of randomness. A properly constructed simulation can capture the essence of a phenomenon that, as is often the case in biology, can be extraordinarily complex, and can do so even though the phenomenon itself is incompletely understood. A simple computer simulation of one manifestation of genomic instability known as chromosomal instability will be presented. The model simulates clonal expansion of a single chromosomally unstable cell into a colony. Instability is characterized by a single parameter, the rate of chromosomal rearrangement. With each new chromosome aberration, a unique subclone arises (subclones are defined as having a unique karyotype). The subclone initially has just one cell, but it can expand with cell division if the aberration is not lethal. The computer program automatically keeps track of the number of subclones within the expanding colony, and the number of cells within each subclone. Because chromosome aberrations kill some cells during colony growth, colonies arising from unstable cells tend to be smaller than those arising from stable cells. For any chosen level of instability, the computer program calculates the mean number of cells per colony averaged over many runs. These outputs should prove useful for investigating how such radiobiological phenomena as slow-growth colonies, increased doubling time, and delayed cell death depend on chromosomal instability. Also of
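
    A minimal Monte Carlo version of the clonal-expansion bookkeeping described above might look like the sketch below. The rearrangement rate, the lethal fraction, and the number of division rounds are placeholder values, not parameters from the work; the code only reproduces the mechanics stated in the abstract (subclones defined by unique karyotypes, daughters killed by lethal aberrations, colony statistics averaged over runs).

        import random
        from collections import Counter

        random.seed(42)

        REARRANGEMENT_RATE = 0.05   # chromosome rearrangements per cell division (assumed)
        LETHAL_FRACTION = 0.5       # probability that a new aberration is lethal (assumed)
        DIVISIONS = 10              # rounds of clonal expansion (assumed)

        def grow_colony():
            cells = [0]             # each cell carries its subclone id; 0 = founder karyotype
            next_id = 1
            for _ in range(DIVISIONS):
                daughters = []
                for karyotype in cells:
                    for _ in range(2):                    # each surviving cell divides in two
                        new_k = karyotype
                        if random.random() < REARRANGEMENT_RATE:
                            if random.random() < LETHAL_FRACTION:
                                continue                  # lethal aberration: daughter dies
                            new_k = next_id               # viable aberration: new subclone
                            next_id += 1
                        daughters.append(new_k)
                cells = daughters
            return cells

        colonies = [grow_colony() for _ in range(200)]
        sizes = [len(c) for c in colonies]
        subclones = [len(Counter(c)) for c in colonies]
        print(f"mean colony size: {sum(sizes) / len(sizes):.0f} cells "
              f"(a stable cell would give {2 ** DIVISIONS})")
        print(f"mean subclones per colony: {sum(subclones) / len(subclones):.1f}")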

  4. Agent-based services for B2B electronic commerce

    NASA Astrophysics Data System (ADS)

    Fong, Elizabeth; Ivezic, Nenad; Rhodes, Tom; Peng, Yun

    2000-12-01

    The potential of agent-based systems has not been realized yet, in part, because of the lack of understanding of how the agent technology supports industrial needs and emerging standards. The area of business-to-business electronic commerce (b2b e-commerce) is one of the most rapidly developing sectors of industry with huge impact on manufacturing practices. In this paper, we investigate the current state of agent technology and the feasibility of applying agent-based computing to b2b e-commerce in the circuit board manufacturing sector. We identify critical tasks and opportunities in the b2b e-commerce area where agent-based services can best be deployed. We describe an implemented agent-based prototype system to facilitate the bidding process for printed circuit board manufacturing and assembly. These activities are taking place within the Internet Commerce for Manufacturing (ICM) project, the NIST- sponsored project working with industry to create an environment where small manufacturers of mechanical and electronic components may participate competitively in virtual enterprises that manufacture printed circuit assemblies.

  5. New Computer Simulations of Macular Neural Functioning

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.

  6. Scalable, distributed data mining using an agent based architecture

    SciTech Connect

    Kargupta, H.; Hamzaoglu, I.; Stafford, B.

    1997-05-01

    Algorithm scalability and the distributed nature of both data and computation deserve serious attention in the context of data mining. This paper presents PADMA (PArallel Data Mining Agents), a parallel agent based system, that makes an effort to address these issues. PADMA contains modules for (1) parallel data accessing operations, (2) parallel hierarchical clustering, and (3) web-based data visualization. This paper describes the general architecture of PADMA and experimental results.

  7. Human shank experimental investigation and computer simulation

    NASA Astrophysics Data System (ADS)

    Krasnoschekov, Viktor V.; Maslov, Leonid B.

    2000-01-01

    A new combined approach to analyzing the physiological state of the human shank is developed. The vibration research complex investigated records the resonance curve of the shank tissues automatically for different kinds of vibration excitation and for various positions of the foot. A special computer model is implemented for the estimation of the experimental data and for a priori prognosis of the bio-object's behavior and dynamic characteristics in the case of various kinds and degrees of injury. The model is described by a non-homogeneous 1D viscoelastic continuum equation, which is solved by the finite element method; the problem in the shank cross-section is solved by the boundary element method. The analysis of computer-simulated resonance curves makes it possible to interpret the experimental data correctly and to check the diagnostic criteria for the injury.

  8. Investigation of Carbohydrate Recognition via Computer Simulation

    SciTech Connect

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; Shen, Tongye

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. Here, we focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  9. Investigation of Carbohydrate Recognition via Computer Simulation

    DOE PAGES

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; ...

    2015-04-28

    Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. Here, we focus on the aspect of utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition in this review. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  10. Fast computation algorithms for speckle pattern simulation

    SciTech Connect

    Nascov, Victor; Samoilă, Cornel; Ursuţiu, Doru

    2013-11-13

    We present our development of a series of efficient computation algorithms, generally usable for calculating light diffraction and particularly for speckle pattern simulation. We mainly use scalar diffraction theory in the form of the Rayleigh-Sommerfeld diffraction formula and its Fresnel approximation. Our algorithms are based on a special form of the convolution theorem and the Fast Fourier Transform. They are able to evaluate the diffraction formula much faster than direct computation, and we have circumvented the restrictions on the relative sizes of the input and output domains met in commonly used procedures. Moreover, the input and output planes can be tilted with respect to each other, and the output domain can be shifted off-axis.
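
    The convolution-theorem approach the authors accelerate can be illustrated with the standard transfer-function form of the Fresnel approximation evaluated by FFTs. The grid, wavelength, propagation distance, and rough-surface model below are illustrative assumptions, and the sketch deliberately omits the tilted-plane and off-axis generalizations (and the fast library implementations) that are the paper's actual contribution.

        import numpy as np

        def fresnel_propagate(field, wavelength, dx, z):
            """Propagate a sampled complex field a distance z using the convolution
            (transfer-function) form of the Fresnel approximation and the FFT."""
            n = field.shape[0]
            fx = np.fft.fftfreq(n, d=dx)
            fxx, fyy = np.meshgrid(fx, fx)
            k = 2 * np.pi / wavelength
            # Fresnel transfer function H = exp(ikz) exp(-i pi lambda z (fx^2 + fy^2))
            H = np.exp(1j * k * z) * np.exp(-1j * np.pi * wavelength * z * (fxx ** 2 + fyy ** 2))
            return np.fft.ifft2(np.fft.fft2(field) * H)

        # Rough square scatterer illuminated by a plane wave -> objective speckle
        rng = np.random.default_rng(0)
        n, dx, wavelength, z = 512, 10e-6, 633e-9, 0.5        # illustrative values
        aperture = np.zeros((n, n))
        aperture[n // 2 - 64:n // 2 + 64, n // 2 - 64:n // 2 + 64] = 1.0
        rough = aperture * np.exp(1j * 2 * np.pi * rng.random((n, n)))
        speckle = np.abs(fresnel_propagate(rough, wavelength, dx, z)) ** 2
        print("speckle contrast:", float(speckle.std() / speckle.mean()))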

  11. Parallel Proximity Detection for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1998-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.
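
    The grid check-in/check-out idea can be reduced to a small serial sketch based on spatial hashing: sensors register their coverage with the grid cells they overlap, and a mover queries only its current cell instead of testing every sensor. Everything parallel in the invention (node distribution, fuzzy grids, the lookahead function, and rollback-free synchronization) is deliberately left out, and all names and numbers below are illustrative.

        from collections import defaultdict

        CELL = 10.0   # grid cell size; sensors register in every cell their coverage overlaps

        def cell_of(x, y):
            return (int(x // CELL), int(y // CELL))

        class Grid:
            def __init__(self):
                self.sensors = defaultdict(set)        # cell -> ids of sensors covering it

            def check_in_sensor(self, sensor_id, x, y, radius):
                reach = int(radius // CELL) + 1
                cx, cy = cell_of(x, y)
                for i in range(cx - reach, cx + reach + 1):
                    for j in range(cy - reach, cy + reach + 1):
                        self.sensors[(i, j)].add(sensor_id)

            def sensors_seeing(self, x, y):
                """Sensors whose registered coverage includes the mover's current cell."""
                return self.sensors[cell_of(x, y)]

        grid = Grid()
        grid.check_in_sensor("radar-1", 25.0, 25.0, radius=15.0)
        print(grid.sensors_seeing(31.0, 28.0))   # mover inside coverage -> {'radar-1'}
        print(grid.sensors_seeing(90.0, 90.0))   # mover far away -> empty set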

  12. Parallel Proximity Detection for Computer Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1997-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.

  13. Trends in Social Science: The Impact of Computational and Simulative Models

    NASA Astrophysics Data System (ADS)

    Conte, Rosaria; Paolucci, Mario; Cecconi, Federico

    This paper discusses current progress in the computational social sciences. Specifically, it examines the following questions: Are the computational social sciences exhibiting positive or negative developments? What are the roles of agent-based models and simulation (ABM), network analysis, and other "computational" methods within this dynamic? (Conte, The necessity of intelligent agents in social simulation, Advances in Complex Systems, 3(01n04), 19-38, 2000; Conte 2010; Macy, Annual Review of Sociology, 143-166, 2002). Are there objective indicators of scientific growth that can be applied to different scientific areas, allowing for comparison among them? In this paper, some answers to these questions are presented and discussed. In particular, comparisons among different disciplines in the social and computational sciences are shown, taking into account their respective growth trends in the number of publication citations over the last few decades (culled from Google Scholar). After a short discussion of the methodology adopted, results of keyword-based queries are presented, unveiling some unexpected local impacts of simulation on the takeoff of traditionally poorly productive disciplines.

  14. Agent-based model for the h-index - exact solution

    NASA Astrophysics Data System (ADS)

    Żogała-Siudem, Barbara; Siudem, Grzegorz; Cena, Anna; Gagolewski, Marek

    2016-01-01

    Hirsch's h-index is perhaps the most popular citation-based measure of scientific excellence. In 2013, Ionescu and Chopard proposed an agent-based model describing a process for generating publications and citations in an abstract scientific community [G. Ionescu, B. Chopard, Eur. Phys. J. B 86, 426 (2013)]. Within such a framework, one may simulate a scientist's activity, and - by extension - investigate the whole community of researchers. Even though the Ionescu and Chopard model predicts the h-index quite well, the authors provided a solution based solely on simulations. In this paper, we complete their results with exact, analytic formulas. What is more, by considering a simplified version of the Ionescu-Chopard model, we obtained a compact, easy-to-compute formula for the h-index. The derived approximate and exact solutions are investigated on simulated and real-world data sets.
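
    For reference, the h-index itself is simple to compute from a citation list. The sketch below applies the definition to an assumed Pareto-like citation distribution for one simulated scientist; it is not the Ionescu-Chopard model and does not reproduce the exact or approximate formulas derived in the paper.

        import numpy as np

        def h_index(citations):
            """Largest h such that at least h papers have at least h citations each."""
            h = 0
            for rank, c in enumerate(sorted(citations, reverse=True), start=1):
                if c >= rank:
                    h = rank
                else:
                    break
            return h

        # Toy citation counts for one simulated scientist (illustrative distribution)
        rng = np.random.default_rng(3)
        papers = rng.pareto(a=1.5, size=40).astype(int)
        print("top citation counts:", sorted(papers.tolist(), reverse=True)[:10])
        print("h-index:", h_index(papers))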

  15. Computer simulations of charged colloids in confinement.

    PubMed

    Puertas, Antonio M; de las Nieves, F Javier; Cuetos, Alejandro

    2015-02-15

    We study by computer simulations the interaction between two similarly charged colloidal particles confined between parallel planes, in salt-free conditions. Both the colloids and the ions are simulated explicitly, on a fine-mesh lattice, and the electrostatic interaction is calculated using Ewald summation in two dimensions. The internal energy is measured by setting the colloidal particles at a given position and equilibrating the ions, whereas the free energy is obtained by introducing a bias (attractive) potential between the colloids. Our results show that upon confining the system, the internal energy decreases, resulting in an attractive contribution to the interaction potential for large charges and strong confinement. However, the loss of entropy of the ions is the dominant mechanism in the interaction, irrespective of the confinement of the system. The interaction potential is therefore repulsive in all cases, and is well described by the DLVO functional form, but effective values have to be used for the interaction strength and Debye length.
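
    The "DLVO functional form" referred to here is, for the electrostatic part, a screened-Coulomb (Yukawa-type) repulsion characterized by an interaction strength and a Debye length. The sketch below merely evaluates one common version of that form; the prefactor convention and the numbers are placeholders standing in for the effective values one would fit to the simulated free energies, not values from the paper.

        import numpy as np

        def dlvo_repulsion(r, strength, debye_length, sigma):
            """Screened-Coulomb (Yukawa) repulsion between two colloids of diameter sigma:
            U(r) = strength * exp(-(r - sigma) / debye_length) * sigma / r, for r >= sigma."""
            return strength * np.exp(-(r - sigma) / debye_length) * sigma / r

        sigma = 1.0                                  # colloid diameter (reduced units)
        r = np.linspace(sigma, 4.0 * sigma, 7)
        # strength and debye_length below are placeholders for fitted "effective" values
        print(dlvo_repulsion(r, strength=50.0, debye_length=0.5, sigma=sigma).round(2))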

  16. Computational simulation of the blood separation process.

    PubMed

    De Gruttola, Sandro; Boomsma, Kevin; Poulikakos, Dimos; Ventikos, Yiannis

    2005-08-01

    The aim of this work is to construct a computational fluid dynamics model capable of simulating the quasitransient process of apheresis. To this end a Lagrangian-Eulerian model has been developed which tracks the blood particles within a delineated two-dimensional flow domain. Within the Eulerian method, the fluid flow conservation equations within the separator are solved. Taking the calculated values of the flow field and using a Lagrangian method, the displacement of the blood particles is calculated. Thus, the local blood density within the separator at a given time step is known. Subsequently, the flow field in the separator is recalculated. This process continues until a quasisteady behavior is reached. The simulations show good agreement with experimental results. They show a complete separation of plasma and red blood cells, as well as nearly complete separation of red blood cells and platelets. The white blood cells build clusters in the low concentrate cell bed.

  17. Computer simulation of solder joint failure

    SciTech Connect

    Burchett, S.N.; Frear, D.R.; Rashid, M.M.

    1997-04-01

    The thermomechanical fatigue failure of solder joints is increasingly becoming an important reliability issue for electronic packages. The purpose of this Laboratory Directed Research and Development (LDRD) project was to develop computational tools for simulating the behavior of solder joints under strain and temperature cycling, taking into account the microstructural heterogeneities that exist in as-solidified near eutectic Sn-Pb joints, as well as subsequent microstructural evolution. The authors present two computational constitutive models, a two-phase model and a single-phase model, that were developed to predict the behavior of near eutectic Sn-Pb solder joints under fatigue conditions. Unique metallurgical tests provide the fundamental input for the constitutive relations. The two-phase model mathematically predicts the heterogeneous coarsening behavior of near eutectic Sn-Pb solder. The finite element simulations with this model agree qualitatively with experimental thermomechanical fatigue tests. The simulations show that the presence of an initial heterogeneity in the solder microstructure could significantly degrade the fatigue lifetime. The single-phase model was developed to predict solder joint behavior using materials data for constitutive relation constants that could be determined through straightforward metallurgical experiments. Special thermomechanical fatigue tests were developed to give fundamental materials input to the models, and an in situ SEM thermomechanical fatigue test system was developed to characterize microstructural evolution and the mechanical behavior of solder joints during the test. A shear/torsion test sample was developed to impose strain in two different orientations. Materials constants were derived from these tests. The simulation results from the two-phase model showed good fit to the experimental test results.

  18. Multiscale Computer Simulation of Failure in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.
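
    A minimal on-lattice version of the DLCA structure generator can be sketched as follows: clusters random-walk and stick irreversibly on contact, which is the mechanism that produces the fractal backbones used in these gel models. The 2D lattice, particle count, and number of move attempts are illustrative simplifications of the authors' 3D simulations, and no strain or failure calculation is attempted here.

        import random

        random.seed(0)
        L, N, STEPS = 64, 250, 30_000                 # lattice size, particles, move attempts
        MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

        def wrap(x, y):
            return x % L, y % L

        # Seed: every particle starts as its own one-particle cluster on a distinct site
        owner = {}                                    # site -> cluster id
        while len(owner) < N:
            owner.setdefault((random.randrange(L), random.randrange(L)), len(owner))
        clusters = {}                                 # cluster id -> set of member sites
        for site, cid in owner.items():
            clusters.setdefault(cid, set()).add(site)

        def merge(a, b):
            """Absorb cluster b into cluster a (irreversible sticking)."""
            for site in clusters.pop(b):
                owner[site] = a
                clusters[a].add(site)

        def touching(cid):
            """Ids of other clusters adjacent to any site of cluster cid."""
            found = set()
            for x, y in clusters[cid]:
                for dx, dy in MOVES:
                    other = owner.get(wrap(x + dx, y + dy))
                    if other is not None and other != cid:
                        found.add(other)
            return found

        # Merge any clusters that happen to start out in contact
        for cid in list(clusters):
            if cid in clusters:
                for other in touching(cid):
                    if other in clusters:
                        merge(cid, other)

        for _ in range(STEPS):
            cid = random.choice(list(clusters))       # pick a cluster and try a rigid step
            dx, dy = random.choice(MOVES)
            moved = {wrap(x + dx, y + dy) for x, y in clusters[cid]}
            if any(owner.get(p, cid) != cid for p in moved):
                continue                              # another cluster blocks the move
            for site in clusters[cid]:
                del owner[site]
            for site in moved:
                owner[site] = cid
            clusters[cid] = moved
            for other in touching(cid):               # stick to anything now adjacent
                merge(cid, other)

        print("clusters remaining:", len(clusters))
        print("largest aggregate size:", max(len(s) for s in clusters.values()))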

  19. Computer Simulation of Fracture in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2006-01-01

    Aerogels are of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While the gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. In this work, we investigate the strength and fracture behavior of silica aerogels using a molecular statics-based computer simulation technique. The gels' structure is simulated via a Diffusion Limited Cluster Aggregation (DLCA) algorithm, which produces fractal structures representing experimentally observed aggregates of so-called secondary particles, themselves composed of amorphous silica primary particles an order of magnitude smaller. We have performed multi-length-scale simulations of fracture in silica aerogels, in which the interaction between two secondary particles is assumed to be described by a Morse pair potential parameterized such that the potential range is much smaller than the secondary particle size. These Morse parameters are obtained by atomistic simulation of models of the experimentally-observed amorphous silica "bridges," with the fracture behavior of these bridges modeled via molecular statics using a Morse/Coulomb potential for silica. We consider the energetics of the fracture, and compare qualitative features of low- and high-density gel fracture.

  20. The Learning Effects of Computer Simulations in Science Education

    ERIC Educational Resources Information Center

    Rutten, Nico; van Joolingen, Wouter R.; van der Veen, Jan T.

    2012-01-01

    This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on…

  1. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.

  2. Computational simulation of liquid fuel rocket injectors

    NASA Technical Reports Server (NTRS)

    Landrum, D. Brian

    1994-01-01

    A major component of any liquid propellant rocket is the propellant injection system. Issues of interest include the degree of liquid vaporization and its impact on the combustion process, the pressure and temperature fields in the combustion chamber, and the cooling of the injector face and chamber walls. The Finite Difference Navier-Stokes (FDNS) code is a primary computational tool used in the MSFC Computational Fluid Dynamics Branch. The branch has dedicated a significant amount of resources to development of this code for prediction of both liquid and solid fuel rocket performance. The FDNS code is currently being upgraded to include the capability to model liquid/gas multi-phase flows for fuel injection simulation. An important aspect of this effort is benchmarking the code capabilities to predict existing experimental injection data. The objective of this MSFC/ASEE Summer Faculty Fellowship term was to evaluate the capabilities of the modified FDNS code to predict flow fields with liquid injection. Comparisons were made between code predictions and existing experimental data. A significant portion of the effort included a search for appropriate validation data. Also, code simulation deficiencies were identified.

  3. A Computational Framework for Bioimaging Simulation

    PubMed Central

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

  4. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, all fundamental to developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  5. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized from an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties, all fundamental to developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  6. Computer simulation of fatigue under diametrical compression

    SciTech Connect

    Carmona, H. A.; Kun, F.; Andrade, J. S. Jr.; Herrmann, H. J.

    2007-04-15

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue and we simulate the diametric compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process at the macro- and micro-level while varying the relative influence of the mechanisms of damage accumulation over the load history and healing of microcracks. As a specific example we consider recent experimental results on the fatigue fracture of asphalt. Our numerical simulations show that for intermediate applied loads the lifetime of the specimen exhibits power-law behavior. Under the effect of healing, which is more prominent for loads small compared to the tensile strength of the material, the lifetime of the sample increases and a fatigue limit emerges below which no macroscopic failure occurs. The numerical results are in good qualitative agreement with the experimental findings.
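
    The competition between damage accumulation and healing described above can be caricatured with a scalar damage variable (a deliberately simplified sketch with arbitrary parameters, not the discrete element model of the paper): damage grows with a power of the applied load, relaxes at a healing rate, and failure occurs when it reaches one.

      def lifetime(load, m=4.0, rate=1e-3, healing=2e-4, dt=1.0, t_max=1e7):
          # Integrate dD/dt = rate*load**m - healing*D until D >= 1 (failure).
          # Returns the failure time, or None if the load is below the fatigue limit.
          a = rate * load**m
          if healing > 0.0 and a / healing < 1.0:
              return None                      # damage saturates below 1: no failure
          D, t = 0.0, 0.0
          while D < 1.0 and t < t_max:
              D += (a - healing * D) * dt      # forward-Euler damage accumulation
              t += dt
          return t if D >= 1.0 else None

      for load in (0.6, 0.8, 1.0):             # loads relative to the tensile strength
          print(load, lifetime(load))

    Without healing the lifetime of this toy scales as load**(-m), a power law; switching healing on makes a finite fatigue limit appear, which is the qualitative picture reported above.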

  7. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.; Halicioglu, M. T.

    1984-01-01

    All of the investigations performed employed, in one way or another, a computer simulation technique based on atomistic-level considerations. In general, three types of simulation methods were used for modeling systems with discrete particles that interact via well-defined potential functions: molecular dynamics (a general method for solving the classical equations of motion of a model system); Monte Carlo (the use of a Markov-chain ensemble-averaging technique to model the equilibrium properties of a system); and molecular statics (which provides properties of a system at T = 0 K). The effects of three-body forces on the vibrational frequencies of triatomic clusters were investigated. The multilayer relaxation phenomena for low-index planes of an fcc crystal were also analyzed as a function of the three-body interactions. Various surface properties for the Si and SiC systems were calculated. Results obtained from static simulation calculations for slip formation were presented. The more elaborate molecular dynamics calculations on the propagation of cracks in two-dimensional systems were outlined.
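
    As a minimal illustration of the Monte Carlo method listed above, the sketch below equilibrates a small Lennard-Jones cluster with the Metropolis algorithm (the reduced units and all parameter values are arbitrary illustrations, not the potentials used in the study):

      import math, random

      def lj_energy(pos, eps=1.0, sigma=1.0):
          # Total pairwise Lennard-Jones energy of a list of 3-D positions.
          E = 0.0
          for i in range(len(pos)):
              for j in range(i + 1, len(pos)):
                  r2 = sum((a - b) ** 2 for a, b in zip(pos[i], pos[j]))
                  sr6 = (sigma * sigma / r2) ** 3
                  E += 4.0 * eps * (sr6 * sr6 - sr6)
          return E

      def metropolis(pos, steps=5000, T=0.3, delta=0.1):
          # Random single-particle moves, accepted with probability
          # min(1, exp(-dE/T)); the Boltzmann constant is set to 1.
          random.seed(0)
          E = lj_energy(pos)
          for _ in range(steps):
              i = random.randrange(len(pos))
              old = pos[i]
              pos[i] = tuple(x + random.uniform(-delta, delta) for x in old)
              dE = lj_energy(pos) - E
              if dE <= 0.0 or random.random() < math.exp(-dE / T):
                  E += dE                      # accept the trial move
              else:
                  pos[i] = old                 # reject: restore the old position
          return pos, E

      cluster = [(0.0, 0.0, 0.0), (1.1, 0.0, 0.0), (0.0, 1.1, 0.0), (0.0, 0.0, 1.1)]
      print("final energy:", round(metropolis(cluster)[1], 3))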

  8. Brief introductory guide to agent-based modeling and an illustration from urban health research.

    PubMed

    Auchincloss, Amy H; Garcia, Leandro Martin Totaro

    2015-11-01

    There is growing interest among urban health researchers in addressing complex problems using conceptual and computational models from the field of complex systems. Agent-based modeling (ABM) is one computational modeling tool that has received a lot of interest. However, many researchers remain unfamiliar with developing and carrying out an ABM, hindering its understanding and application. This paper first presents a brief introductory guide to carrying out a simple agent-based model. Then, the method is illustrated by discussing a previously developed agent-based model, which explored inequalities in diet in the context of urban residential segregation.
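
    The "simple agent-based model" such a guide walks through boils down to a population of agents, each with a state and an update rule, stepped through discrete time. The skeleton below is a generic plain-Python illustration (the diet score, the two-neighborhood setup, and all coefficients are invented for this sketch and are not taken from the cited segregation model):

      import random

      class Resident:
          # Minimal agent: a diet score pulled toward the local average (social
          # influence) and toward the neighborhood's food access (environment).
          def __init__(self, neighborhood, diet):
              self.neighborhood = neighborhood       # 0 or 1
              self.diet = diet                       # 0 (unhealthy) .. 1 (healthy)

          def step(self, local_mean, food_access):
              social = 0.05 * (local_mean - self.diet)
              environment = 0.05 * (food_access - self.diet)
              self.diet = min(1.0, max(0.0, self.diet + social + environment))

      def run(n=200, steps=100, access=(0.2, 0.8), seed=0):
          random.seed(seed)
          agents = [Resident(i % 2, random.random()) for i in range(n)]
          for _ in range(steps):
              means = [sum(a.diet for a in agents if a.neighborhood == k) /
                       (n // 2) for k in (0, 1)]
              for a in agents:
                  a.step(means[a.neighborhood], access[a.neighborhood])
          return [round(m, 2) for m in means]

      print("mean diet score per neighborhood:", run())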

  9. Proceedings of the 1990 Summer computer simulation conference

    SciTech Connect

    Svrcek, B.; McRae, J.

    1990-01-01

    This book covers simulation methodologies, computer systems and applications that will serve the simulation practitioner for the next decade. Specifically, the simulation applications range from Computer-Integrated-Manufacturing, Computer-Aided-Design, Radar and Communications, Transportation, Biomedical, Energy and the Environment, Government/Management and Social Sciences, and Training Simulators to Aerospace, Missiles and SDI. Additionally, new approaches to simulation are offered by neural networks, expert systems and parallel processing. Two applications deal with these new approaches, Intelligent Simulation Environments and Advanced Information Processing and Simulation.

  10. Chip level simulation of fault tolerant computers

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1983-01-01

    Chip level modeling techniques, functional fault simulation, simulation software development, a more efficient, high level version of GSP, and a parallel architecture for functional simulation are discussed.

  11. A Mass Spectrometer Simulator in Your Computer

    NASA Astrophysics Data System (ADS)

    Gagnon, Michel

    2012-12-01

    Introduced to study components of ionized gas, the mass spectrometer has evolved into a highly accurate device now used in many undergraduate and research laboratories. Unfortunately, despite their importance in the formation of future scientists, mass spectrometers remain beyond the financial reach of many high schools and colleges. As a result, it is not possible for instructors to take full advantage of this equipment. Therefore, to facilitate accessibility to this tool, we have developed a realistic computer-based simulator. Using this software, students are able to practice their ability to identify the components of the original gas, thereby gaining a better understanding of the underlying physical laws. The software is available as a free download.
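
    The physics such a simulator reproduces is compact: an ion of mass m and charge q accelerated through a potential V enters a magnetic field B and follows a circle of radius r = sqrt(2mV/q)/B, so the measured radius identifies the mass-to-charge ratio. A minimal calculation (the voltage, field, and ion choices below are arbitrary illustrations):

      import math

      E_CHARGE = 1.602176634e-19      # elementary charge, C
      AMU = 1.66053906660e-27         # atomic mass unit, kg

      def curvature_radius(mass_amu, charge_e, accel_voltage, b_field):
          # qV = (1/2) m v^2 and r = m v / (q B)  =>  r = sqrt(2 m V / q) / B
          m = mass_amu * AMU
          q = charge_e * E_CHARGE
          return math.sqrt(2.0 * m * accel_voltage / q) / b_field

      # Singly charged ions of mass 28 u (N2+) and 32 u (O2+) at 2 kV in a 0.2 T field.
      for mass in (28.0, 32.0):
          print(f"{mass:.0f} u -> r = {curvature_radius(mass, 1, 2000.0, 0.2):.3f} m")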

  12. Miller experiments in atomistic computer simulations

    PubMed Central

    Saitta, Antonino Marco; Saija, Franz

    2014-01-01

    The celebrated Miller experiments reported on the spontaneous formation of amino acids from a mixture of simple molecules reacting under an electric discharge, giving birth to the research field of prebiotic chemistry. However, the chemical reactions involved in those experiments have never been studied at the atomic level. Here we report on, to our knowledge, the first ab initio computer simulations of Miller-like experiments in the condensed phase. Our study, based on the recent method of treatment of aqueous systems under electric fields and on metadynamics analysis of chemical reactions, shows that glycine spontaneously forms from mixtures of simple molecules once an electric field is switched on and identifies formic acid and formamide as key intermediate products of the early steps of the Miller reactions, and the crucible of formation of complex biological molecules. PMID:25201948

  13. Protein Dynamics from NMR and Computer Simulation

    NASA Astrophysics Data System (ADS)

    Wu, Qiong; Kravchenko, Olga; Kemple, Marvin; Likic, Vladimir; Klimtchuk, Elena; Prendergast, Franklyn

    2002-03-01

    Proteins exhibit internal motions from the millisecond to sub-nanosecond time scale. The challenge is to relate these internal motions to biological function. A strategy to address this aim is to apply a combination of several techniques including high-resolution NMR, computer simulation of molecular dynamics (MD), molecular graphics, and finally molecular biology, the latter to generate appropriate samples. Two difficulties that arise are: (1) the time scale which is most directly biologically relevant (ms to μs) is not readily accessible by these techniques and (2) the techniques focus on local and not collective motions. We will outline methods using ^13C-NMR to help alleviate the second problem, as applied to intestinal fatty acid binding protein, a relatively small intracellular protein believed to be involved in fatty acid transport and metabolism. This work is supported in part by PHS Grant GM34847 (FGP) and by a fellowship from the American Heart Association (QW).

  14. Ceramic matrix composite behavior -- Computational simulation

    SciTech Connect

    Chamis, C.C.; Murthy, P.L.N.; Mital, S.K.

    1996-10-01

    Development of analytical modeling and computational capabilities for the prediction of high temperature ceramic matrix composite behavior has been an ongoing research activity at NASA-Lewis Research Center. These research activities have resulted in the development of micromechanics based methodologies to evaluate different aspects of ceramic matrix composite behavior. The basis of the approach is micromechanics together with a unique fiber substructuring concept. In this new concept the conventional unit cell (the smallest representative volume element of the composite) of the micromechanics approach has been modified by substructuring the unit cell into several slices and developing the micromechanics based equations at the slice level. The main advantage of this technique is that it provides much greater detail in the composite response than a conventional micromechanics based analysis while still maintaining very high computational efficiency. This methodology has recently been extended to model plain weave ceramic composites. The objective of the present paper is to describe the important features of the modeling and simulation and to illustrate them with selected examples of laminated as well as woven composites.

  15. Experiential Learning through Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Maynes, Bill; And Others

    1992-01-01

    Describes experiential learning instructional model and simulation for student principals. Describes interactive laser videodisc simulation. Reports preliminary findings about student principal learning from simulation. Examines learning approaches by unsuccessful and successful students and learning levels of model learners. Simulation's success…

  16. An agent-based model for domestic water management in Valladolid metropolitan area

    NASA Astrophysics Data System (ADS)

    Galán, José M.; López-Paredes, Adolfo; Del Olmo, Ricardo

    2009-05-01

    In this work we demonstrate that the combination of agent-based modeling and simulation constitutes a useful methodological approach to dealing with the complexity derived from the multiple factors that influence domestic water management in emerging metropolitan areas. In particular, we adapt and integrate different social submodels, models of urban dynamics, water consumption, and technological and opinion diffusion, in an agent-based model that is, in turn, linked with a geographic information system. The result is a computational environment that enables the simulation and comparison of various water demand scenarios. We have parameterized our general model for the metropolitan area of Valladolid (Spain). The model shows the influence of urban dynamics (e.g., intrapopulation movements, residence typology, and changes in the territorial model) and other socio-geographic effects (technological and opinion dynamics) on domestic water demand. The conclusions drawn in this way would have been difficult to obtain using other approaches, such as conventional forecasting methods, given the need to integrate different socioeconomic and geographic aspects in one single model. We illustrate that the described methodology can complement conventional approaches, providing additional descriptive and formal insights into domestic water demand management problems.

  17. A mathematical framework for agent based models of complex biological networks.

    PubMed

    Hinkelmann, Franziska; Murrugarra, David; Jarrah, Abdul Salam; Laubenbacher, Reinhard

    2011-07-01

    Agent-based modeling and simulation is a useful method to study biological phenomena in a wide range of fields, from molecular biology to ecology. Since there is currently no agreed-upon standard way to specify such models, it is not always easy to use published models. Also, since model descriptions are not usually given in mathematical terms, it is difficult to bring mathematical analysis tools to bear, so that models are typically studied through simulation. In order to address this issue, Grimm et al. proposed a protocol for model specification, the so-called ODD protocol, which provides a standard way to describe models. This paper proposes an addition to the ODD protocol which allows the description of an agent-based model as a dynamical system, which provides access to computational and theoretical tools for its analysis. The mathematical framework is that of algebraic models, that is, time-discrete dynamical systems with algebraic structure. It is shown by way of several examples how this mathematical specification can help with model analysis. This mathematical framework can also accommodate other model types such as Boolean networks and the more general logical models, as well as Petri nets.
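
    The algebraic-model idea can be made concrete with a toy Boolean network treated as a time-discrete dynamical system over F_2 (the three-node network below is invented for illustration and is not an example from the paper; AND corresponds to the polynomial x*y, OR to x + y + x*y, and NOT to 1 + x over F_2):

      from itertools import product

      def f(state):
          # Update functions of a three-node Boolean network; as polynomials over
          # F_2 these are x1' = x2*x3, x2' = x1 + x3 + x1*x3, x3' = x1.
          x1, x2, x3 = state
          return (x2 & x3, x1 | x3, x1)

      # Exhaustive state-space analysis: the fixed points of the dynamical system.
      fixed_points = [s for s in product((0, 1), repeat=3) if f(s) == s]
      print("fixed points:", fixed_points)

      # Follow one trajectory until it re-enters a previously visited state.
      state, seen = (1, 0, 0), []
      while state not in seen:
          seen.append(state)
          state = f(state)
      print("trajectory:", seen, "-> re-enters", state)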

  18. Space Shuttle flight crew/computer interface simulation studies.

    NASA Technical Reports Server (NTRS)

    Callihan, J. C.; Rybarczyk, D. T.

    1972-01-01

    An approach to achieving an optimized set of crew/computer interface requirements on the Space Shuttle program is described. It consists of defining the mission phases and crew timelines, developing a functional description of the crew/computer interface displays and controls software, conducting real-time simulations using pilot evaluation of the interface displays and controls, and developing a set of crew/computer functional requirements specifications. The simulator is a two-man crew station which includes three CRTs with keyboards for simulating the crew/computer interface. The programs simulate the mission phases and the flight hardware, including the flight computer and CRT displays.

  19. Comparing Computer Run Time of Building Simulation Programs

    SciTech Connect

    Hong, Tianzhen; Buhl, Fred; Haves, Philip; Selkowitz, Stephen; Wetter, Michael

    2008-07-23

    This paper presents an approach to comparing computer run time of building simulation programs. The computing run time of a simulation program depends on several key factors, including the calculation algorithm and modeling capabilities of the program, the run period, the simulation time step, the complexity of the energy models, the run control settings, and the software and hardware configurations of the computer that is used to make the simulation runs. To demonstrate the approach, simulation runs are performed for several representative DOE-2.1E and EnergyPlus energy models. The computer run time of these energy models are then compared and analyzed.
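
    At its core the comparison reduces to running matched models on identical hardware and recording wall-clock time. A trivial harness is sketched below with pure-Python stand-ins for the simulation engines (the workloads and names are placeholders, not actual DOE-2.1E or EnergyPlus invocations, which would be launched as external processes instead):

      import time

      def timed(fn, *args):
          # Return (result, wall-clock seconds) for one simulation run.
          start = time.perf_counter()
          result = fn(*args)
          return result, time.perf_counter() - start

      def engine(n_timesteps):
          # Placeholder workload standing in for an annual building simulation.
          return sum(i * i for i in range(n_timesteps))

      for name, steps in (("hourly", 8760), ("ten-minute", 8760 * 6)):
          _, seconds = timed(engine, steps)
          print(f"{name} run: {seconds * 1e3:.2f} ms")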

  20. Engineering Fracking Fluids with Computer Simulation

    NASA Astrophysics Data System (ADS)

    Shaqfeh, Eric

    2015-11-01

    There are no comprehensive simulation-based tools for engineering the flows of viscoelastic fluid-particle suspensions in fully three-dimensional geometries. On the other hand, the need for such a tool in engineering applications is immense. Suspensions of rigid particles in viscoelastic fluids play key roles in many energy applications. For example, in oil drilling the ``drilling mud'' is a very viscous, viscoelastic fluid designed to shear-thin during drilling, but thicken at stoppage so that the ``cuttings'' can remain suspended. In a related application known as hydraulic fracturing suspensions of solids called ``proppant'' are used to prop open the fracture by pumping them into the well. It is well-known that particle flow and settling in a viscoelastic fluid can be quite different from that which is observed in Newtonian fluids. First, it is now well known that the ``fluid particle split'' at bifurcation cracks is controlled by fluid rheology in a manner that is not understood. Second, in Newtonian fluids, the presence of an imposed shear flow in the direction perpendicular to gravity (which we term a cross or orthogonal shear flow) has no effect on the settling of a spherical particle in Stokes flow (i.e. at vanishingly small Reynolds number). By contrast, in a non-Newtonian liquid, the complex rheological properties induce a nonlinear coupling between the sedimentation and shear flow. Recent experimental data have shown both the shear thinning and the elasticity of the suspending polymeric solutions significantly affects the fluid-particle split at bifurcations, as well as the settling rate of the solids. In the present work, we use the Immersed Boundary Method to develop computer simulations of viscoelastic flow in suspensions of spheres to study these problems. These simulations allow us to understand the detailed physical mechanisms for the remarkable physical behavior seen in practice, and actually suggest design rules for creating new fluid recipes.

  1. Computer-aided Instructional System for Transmission Line Simulation.

    ERIC Educational Resources Information Center

    Reinhard, Erwin A.; Roth, Charles H., Jr.

    A computer-aided instructional system has been developed which utilizes dynamic computer-controlled graphic displays and which requires student interaction with a computer simulation in an instructional mode. A numerical scheme has been developed for digital simulation of a uniform, distortionless transmission line with resistive terminations and…

  2. Using Computational Simulations to Confront Students' Mental Models

    ERIC Educational Resources Information Center

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  3. Computer-aided simulation study of photomultiplier tubes

    NASA Technical Reports Server (NTRS)

    Zaghloul, Mona E.; Rhee, Do Jun

    1989-01-01

    A computer model that simulates the response of photomultiplier tubes (PMTs) and the associated voltage divider circuit is developed. An equivalent circuit that approximates the operation of the device is derived and then used to develop a computer simulation of the PMT. Simulation results are presented and discussed.

  4. Agent-based models of financial markets

    NASA Astrophysics Data System (ADS)

    Samanidou, E.; Zschischang, E.; Stauffer, D.; Lux, T.

    2007-03-01

    This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations) for the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Showing similarities with scaling laws for other systems with many interacting sub-units, an exploration of financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont
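
    A flavour of how such multi-agent models generate large price moves can be given with a toy herding sketch (this is a generic caricature with arbitrary parameters, not an implementation of any of the named models): traders are partitioned into random clusters that buy or sell as blocks, and the return is proportional to the net demand.

      import random

      def toy_market(n_traders=1000, n_steps=2000, activation=0.05, seed=42):
          # Each step the traders are split into heavy-tailed random clusters; an
          # active cluster buys (+1) or sells (-1) as a block, and the step return
          # is the net demand divided by the number of traders.
          random.seed(seed)
          returns = []
          for _ in range(n_steps):
              remaining, demand = n_traders, 0
              while remaining > 0:
                  size = min(remaining, int(random.paretovariate(1.5)))
                  remaining -= size
                  if random.random() < activation:
                      demand += size if random.random() < 0.5 else -size
              returns.append(demand / n_traders)
          return returns

      r = toy_market()
      mean_abs = sum(abs(x) for x in r) / len(r)
      extreme = sum(1 for x in r if abs(x) > 5 * mean_abs)
      print(f"{extreme} of {len(r)} steps exceed 5x the mean absolute return")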

  5. Computer simulation and the features of novel empirical data.

    PubMed

    Lusk, Greg

    2016-04-01

    In an attempt to determine the epistemic status of computer simulation results, philosophers of science have recently explored the similarities and differences between computer simulations and experiments. One question that arises is whether and, if so, when, simulation results constitute novel empirical data. It is often supposed that computer simulation results could never be empirical or novel because simulations never interact with their targets, and cannot go beyond their programming. This paper argues against this position by examining whether, and under what conditions, the features of empiricality and novelty could be displayed by computer simulation data. I show that, to the extent that certain familiar measurement results have these features, so can some computer simulation results.

  6. Simulating granular media on the computer

    NASA Astrophysics Data System (ADS)

    Herrmann, H. J.

    Granular materials, like sand or powder, can present very intriguing effects. When shaken, sheared or poured they show segregation, convection and spontaneous fluctuations in densities and stresses. I will discuss the modeling of a granular medium on a computer by simulating a packing of elastic spheres via Molecular Dynamics. Dissipation of energy and shear friction at collisions are included. In the physical range the friction coefficient is found to be a linear function of the angle of repose. On a vibrating plate the formation of convection cells due to walls or amplitude modulations can be observed. The onset of fluidization can be determined and is in good agreement with experiments. Segregation of larger particles is found to be always accompanied by convection cells. There is also ample experimental evidence showing the existence of spontaneous density patterns in granular material flowing through pipes or hoppers. The Molecular Dynamics simulations show that these density fluctuations follow a 1/f α spectrum. I compare this behavior to deterministic one-dimensional traffic models. A model with continuous positions and velocities shows self-organized critical jamming behind a slower car. The experimentally observed effects are also reproduced by Lattice Gas and Boltzmann Lattice Models. Density waves are spontaneously generated when the viscosity has a nonlinear dependence on density which characterizes granular flow. We also briefly sketch a thermodynamic formalism for loose granular material. In a dense packing non-linear acoustic phenomena, like the pressure dependence of the sound velocity are studied. Finally the plastic shear bands occurring in large scale deformations of compactified granular media are investigated using an explicit Lagrangian technique.

  7. Excellent approach to modeling urban expansion by fuzzy cellular automata: agent base model

    NASA Astrophysics Data System (ADS)

    Khajavigodellou, Yousef; Alesheikh, Ali A.; Mohammed, Abdulrazak A. S.; Chapi, Kamran

    2014-09-01

    The interaction between humans and their environment is one of the important challenges in the world today. Land use/cover change (LUCC) is a complex process that includes actors and factors at different social and spatial levels. The complexity and dynamics of urban systems make the practice of urban modeling very difficult. With increased computational power and the greater availability of spatial data, micro-simulation approaches such as agent-based and cellular automata methods have been developed by geographers, planners, and scholars, and they have shown great potential for representing and simulating the complexity of the dynamic processes involved in urban growth and land use change. This paper presents a fuzzy cellular automata (FCA) model, coupled with geospatial information systems and remote sensing, to simulate and predict urban expansion patterns. FCA-based dynamic spatial urban models provide an improved ability to forecast and assess future urban growth and to create planning scenarios, allowing us to explore the potential impacts of simulations that correspond to urban planning and management policies. In this fuzzy-inference-guided cellular automata approach, semantic or linguistic knowledge on land use change is expressed as fuzzy rules, based on which fuzzy inference is applied to determine the urban development potential for each pixel. The model integrates an agent-based model (ABM) and FCA to investigate a complex decision-making process and future urban dynamics. Based on this model, rapid development and green-land protection under the influence of the behaviors and decision modes of regional authority agents, real estate developer agents, resident agents, and non-resident agents, and their interactions, have been used to predict the future development patterns of the Erbil metropolitan region.
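
    The core transition rule of such a fuzzy cellular automaton can be sketched in a few lines (the membership function, threshold, and random suitability surface below are invented for illustration; this is not the calibrated Erbil model and it omits the agent layer entirely):

      import random

      def high(x):
          # Trivial fuzzy membership for 'high' on [0, 1].
          return max(0.0, min(1.0, x))

      def step(urban, suitability, threshold=0.12):
          # Fuzzy rule: IF neighbour density is high AND suitability is high
          # THEN development potential is high; fuzzy AND taken as the minimum.
          rows, cols = len(urban), len(urban[0])
          new = [row[:] for row in urban]
          for r in range(rows):
              for c in range(cols):
                  if urban[r][c]:
                      continue
                  nbrs = [urban[i][j]
                          for i in range(max(0, r - 1), min(rows, r + 2))
                          for j in range(max(0, c - 1), min(cols, c + 2))
                          if (i, j) != (r, c)]
                  density = sum(nbrs) / len(nbrs)
                  potential = min(high(density), high(suitability[r][c]))
                  if potential > threshold:
                      new[r][c] = 1
          return new

      random.seed(3)
      n = 10
      urban = [[1 if (r, c) == (5, 5) else 0 for c in range(n)] for r in range(n)]
      suitability = [[random.random() for _ in range(n)] for _ in range(n)]
      for _ in range(5):
          urban = step(urban, suitability)
      print(sum(map(sum, urban)), "urban cells after 5 steps")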

  8. Developing a multiscale, multi-resolution agent-based brain tumor model by graphics processing units

    PubMed Central

    2011-01-01

    Multiscale agent-based modeling (MABM) has been widely used to simulate Glioblastoma Multiforme (GBM) and its progression. At the intracellular level, the MABM approach employs a system of ordinary differential equations to describe quantitatively specific intracellular molecular pathways that determine phenotypic switches among cells (e.g. from migration to proliferation and vice versa). At the intercellular level, MABM describes cell-cell interactions by a discrete module. At the tissue level, partial differential equations are employed to model the diffusion of chemoattractants, which are the input factors of the intracellular molecular pathway. Moreover, multiscale analysis makes it possible to explore the molecules that play important roles in determining the cellular phenotypic switches that in turn drive the whole GBM expansion. However, owing to limited computational resources, MABM is currently a theoretical biological model that uses relatively coarse grids to simulate a few cancer cells in a small slice of brain cancer tissue. In order to improve this theoretical model to simulate and predict actual GBM cancer progression in real time, a graphics processing unit (GPU)-based parallel computing algorithm was developed and combined with the multi-resolution design to speed up the MABM. The simulated results demonstrated that the GPU-based, multi-resolution and multiscale approach can accelerate the previous MABM around 30-fold with relatively fine grids in a large extracellular matrix. Therefore, the new model has great potential for simulating and predicting real-time GBM progression, if real experimental data are incorporated. PMID:22176732

  9. Computer simulation of FCC riser reactors.

    SciTech Connect

    Chang, S. L.; Golchert, B.; Lottes, S. A.; Petrick, M.; Zhou, C. Q.

    1999-04-20

    A three-dimensional computational fluid dynamics (CFD) code, ICRKFLO, was developed to simulate the multiphase reacting flow system in a fluid catalytic cracking (FCC) riser reactor. The code solves flow properties based on fundamental conservation laws of mass, momentum, and energy for gas, liquid, and solid phases. Useful phenomenological models were developed to represent the controlling FCC processes, including droplet dispersion and evaporation, particle-solid interactions, and interfacial heat transfer between gas, droplets, and particles. Techniques were also developed to facilitate numerical calculations. These techniques include a hybrid flow-kinetic treatment to include detailed kinetic calculations, a time-integral approach to overcome numerical stiffness problems of chemical reactions, and a sectional coupling and blocked-cell technique for handling complex geometry. The copyrighted ICRKFLO software has been validated with experimental data from pilot- and commercial-scale FCC units. The code can be used to evaluate the impacts of design and operating conditions on the production of gasoline and other oil products.

  10. Agent-Based Deterministic Modeling of the Bone Marrow Homeostasis.

    PubMed

    Kurhekar, Manish; Deshpande, Umesh

    2016-01-01

    Modeling of stem cells not only describes but also predicts how a stem cell's environment can control its fate. The first stem cell populations discovered were hematopoietic stem cells (HSCs). In this paper, we present a deterministic model of bone marrow (that hosts HSCs) that is consistent with several of the qualitative biological observations. This model incorporates stem cell death (apoptosis) after a certain number of cell divisions and also demonstrates that a single HSC can potentially populate the entire bone marrow. It also demonstrates that there is a production of sufficient number of differentiated cells (RBCs, WBCs, etc.). We prove that our model of bone marrow is biologically consistent and it overcomes the biological feasibility limitations of previously reported models. The major contribution of our model is the flexibility it allows in choosing model parameters which permits several different simulations to be carried out in silico without affecting the homeostatic properties of the model. We have also performed agent-based simulation of the model of bone marrow system proposed in this paper. We have also included parameter details and the results obtained from the simulation. The program of the agent-based simulation of the proposed model is made available on a publicly accessible website.

  11. Agent-Based Deterministic Modeling of the Bone Marrow Homeostasis

    PubMed Central

    2016-01-01

    Modeling of stem cells not only describes but also predicts how a stem cell's environment can control its fate. The first stem cell populations discovered were hematopoietic stem cells (HSCs). In this paper, we present a deterministic model of bone marrow (that hosts HSCs) that is consistent with several of the qualitative biological observations. This model incorporates stem cell death (apoptosis) after a certain number of cell divisions and also demonstrates that a single HSC can potentially populate the entire bone marrow. It also demonstrates that there is a production of sufficient number of differentiated cells (RBCs, WBCs, etc.). We prove that our model of bone marrow is biologically consistent and it overcomes the biological feasibility limitations of previously reported models. The major contribution of our model is the flexibility it allows in choosing model parameters which permits several different simulations to be carried out in silico without affecting the homeostatic properties of the model. We have also performed agent-based simulation of the model of bone marrow system proposed in this paper. We have also included parameter details and the results obtained from the simulation. The program of the agent-based simulation of the proposed model is made available on a publicly accessible website. PMID:27340402

  12. Computer Simulation Methods for Defect Configurations and Nanoscale Structures

    SciTech Connect

    Gao, Fei

    2010-01-01

    This chapter will describe general computer simulation methods, including ab initio calculations, molecular dynamics, and the kinetic Monte Carlo method, and their applications to the calculation of defect configurations in various materials (metals, ceramics and oxides) and the simulation of nanoscale structures due to ion-solid interactions. Multiscale theory, modeling, and simulation techniques (in both time and space scales) will be emphasized, and comparisons between computer simulation results and experimental observations will be made.
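
    Of the methods named above, the kinetic Monte Carlo step is the simplest to sketch. The residence-time algorithm picks one event with probability proportional to its rate and advances the clock by an exponentially distributed increment; the event rates below are arbitrary illustrative numbers, not defect migration rates from the chapter:

      import math, random

      def kmc_step(rates, t):
          # One residence-time KMC step: pick event i with probability r_i / R,
          # then advance time by dt = -ln(u) / R, where R = sum(r_i).
          R = sum(rates)
          x = random.random() * R
          acc, chosen = 0.0, len(rates) - 1
          for i, r in enumerate(rates):
              acc += r
              if x < acc:
                  chosen = i
                  break
          return chosen, t - math.log(1.0 - random.random()) / R

      random.seed(7)
      rates = [1.0, 0.5, 0.1]        # three possible defect jumps (arbitrary units)
      t, counts = 0.0, [0, 0, 0]
      for _ in range(10000):
          event, t = kmc_step(rates, t)
          counts[event] += 1
      print("event counts:", counts, " simulated time:", round(t, 1))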

  13. A Hybrid Sensitivity Analysis Approach for Agent-based Disease Spread Models

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui

    2012-01-01

    Agent-based models (ABM) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. Of particular interest lately is the application of agent-based and hybrid models to epidemiology, specifically Agent-based Disease Spread Models (ABDSM). Validation (one aspect of the means to achieve dependability) of ABDSM simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. In this report, we describe our preliminary efforts in ABDSM validation by using hybrid model fusion technology.

  14. Agent-based modeling: case study in cleavage furrow models.

    PubMed

    Mogilner, Alex; Manhart, Angelika

    2016-11-07

    The number of studies in cell biology in which quantitative models accompany experiments has been growing steadily. Roughly, mathematical and computational techniques of these models can be classified as "differential equation based" (DE) or "agent based" (AB). Recently AB models have started to outnumber DE models, but understanding of AB philosophy and methodology is much less widespread than familiarity with DE techniques. Here we use the history of modeling a fundamental biological problem-positioning of the cleavage furrow in dividing cells-to explain how and why DE and AB models are used. We discuss differences, advantages, and shortcomings of these two approaches.

  15. Agent-Based Distributed Data Mining: A Survey

    NASA Astrophysics Data System (ADS)

    Moemeng, Chayapol; Gorodetsky, Vladimir; Zuo, Ziye; Yang, Yong; Zhang, Chengqi

    Distributed data mining originated from the need to mine over decentralised data sources. Data mining techniques operating in such a complex environment must cope with considerable dynamics, since changes in the system can affect its overall performance. Agent computing, whose aim is to deal with complex systems, has revealed opportunities to improve distributed data mining systems in a number of ways. This paper surveys the integration of multi-agent systems and distributed data mining, also known as agent-based distributed data mining, in terms of significance, system overview, existing systems, and research trends.

  16. Computing Environment for Adaptive Multiscale Simulation

    DTIC Science & Technology

    2014-09-24

    Fragmentary record describing the computing environment at the Scientific Computation Research Center (SCOREC): a parallel computing cluster, purchased with DURIP funds, of 22 Dell R620 compute nodes, each with two 8-core 2.6 GHz Intel Xeon processors (352 processors in total) and a direct connection to a 56 Gbps interconnect.

  17. Multiscale agent-based consumer market modeling.

    SciTech Connect

    North, M. J.; Macal, C. M.; St. Aubin, J.; Thimmapuram, P.; Bragen, M.; Hahn, J.; Karr, J.; Brigham, N.; Lacy, M. E.; Hampton, D.; Decision and Information Sciences; Procter & Gamble Co.

    2010-05-01

    Consumer markets have been studied in great depth, and many techniques have been used to represent them. These have included regression-based models, logit models, and theoretical market-level models, such as the NBD-Dirichlet approach. Although many important contributions and insights have resulted from studies that relied on these models, there is still a need for a model that could more holistically represent the interdependencies of the decisions made by consumers, retailers, and manufacturers. When the need is for a model that could be used repeatedly over time to support decisions in an industrial setting, it is particularly critical. Although some existing methods can, in principle, represent such complex interdependencies, their capabilities might be outstripped if they had to be used for industrial applications, because of the details this type of modeling requires. However, a complementary method - agent-based modeling - shows promise for addressing these issues. Agent-based models use business-driven rules for individuals (e.g., individual consumer rules for buying items, individual retailer rules for stocking items, or individual firm rules for advertizing items) to determine holistic, system-level outcomes (e.g., to determine if brand X's market share is increasing). We applied agent-based modeling to develop a multi-scale consumer market model. We then conducted calibration, verification, and validation tests of this model. The model was successfully applied by Procter & Gamble to several challenging business problems. In these situations, it directly influenced managerial decision making and produced substantial cost savings.

  18. Assurance in Agent-Based Systems

    SciTech Connect

    Gilliom, Laura R.; Goldsmith, Steven Y.

    1999-05-10

    Our vision of the future of information systems is one that includes engineered collectives of software agents which are situated in an environment over years and which increasingly improve the performance of the overall system of which they are a part. At a minimum, the movement of agent and multi-agent technology into National Security applications, including their use in information assurance, is apparent today. The use of deliberative, autonomous agents in high-consequence/high-security applications will require a commensurate level of protection and confidence in the predictability of system-level behavior. At Sandia National Laboratories, we have defined and are addressing a research agenda that integrates the surety (safety, security, and reliability) into agent-based systems at a deep level. Surety is addressed at multiple levels: The integrity of individual agents must be protected by addressing potential failure modes and vulnerabilities to malevolent threats. Providing for the surety of the collective requires attention to communications surety issues and mechanisms for identifying and working with trusted collaborators. At the highest level, using agent-based collectives within a large-scale distributed system requires the development of principled design methods to deliver the desired emergent performance or surety characteristics. This position paper will outline the research directions underway at Sandia, will discuss relevant work being performed elsewhere, and will report progress to date toward assurance in agent-based systems.

  19. Exploring the Use of Computer Simulations in Unraveling Research and Development Governance Problems

    NASA Technical Reports Server (NTRS)

    Balaban, Mariusz A.; Hester, Patrick T.

    2012-01-01

    Understanding Research and Development (R&D) enterprise relationships and processes at a governance level is not a simple task, but valuable decision-making insight and evaluation capabilities can be gained from their exploration through computer simulations. This paper discusses current Modeling and Simulation (M&S) methods, addressing their applicability to R&D enterprise governance. Specifically, the authors analyze the advantages and disadvantages of the four methodologies used most often by M&S practitioners: System Dynamics (SD), Discrete Event Simulation (DES), Agent Based Modeling (ABM), and formal Analytic Methods (AM) for modeling systems at the governance level. Moreover, the paper describes nesting models using a multi-method approach. Guidance is provided to those seeking to employ modeling techniques in an R&D enterprise for the purposes of understanding enterprise governance. Further, an example is modeled and explored for potential insight. The paper concludes with recommendations regarding opportunities for concentration of future work in modeling and simulating R&D governance relationships and processes.

  20. Smell Detection Agent Based Optimization Algorithm

    NASA Astrophysics Data System (ADS)

    Vinod Chandra, S. S.

    2016-09-01

    In this paper, a novel nature-inspired optimization algorithm is presented in which the trained behaviour of dogs in detecting smell trails is adapted into computational agents for problem solving. The algorithm involves the creation of a surface with smell trails and the subsequent iteration of agents to resolve a path. It can be applied to computational problems that can be formulated as path-based problems, and its implementation can be treated as a shortest-path search over a variety of datasets. The simulated agents have been used to evolve the shortest path between two nodes in a graph, making the approach useful for NP-hard path-discovery problems as well as many practical optimization problems.

  1. An Agent Based Model for Social Class Emergence

    NASA Astrophysics Data System (ADS)

    Yang, Xiaoxiang; Rodriguez Segura, Daniel; Lin, Fei; Mazilu, Irina

    We present an open system agent-based model to analyze the effects of education and the society-specific wealth transactions on the emergence of social classes. Building on previous studies, we use realistic functions to model how years of education affect the income level. Numerical simulations show that the fraction of an individual's total transactions that is invested rather than consumed can cause wealth gaps between different income brackets in the long run. In an attempt to incorporate the network effects, we also explore how the probability of interactions among agents depending on the spread of their income brackets affects wealth distribution.
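
    The transaction mechanism sketched in the abstract belongs to the family of kinetic wealth-exchange models, which are easy to write down (the "yard-sale" rule and the investment fraction below are a generic toy, not the authors' exact transaction or education functions):

      import random

      def simulate(n=1000, steps=200000, invest_frac=0.1, seed=0):
          # Pairwise exchanges: the stake is a fraction of the poorer agent's
          # wealth, and a fair coin flip decides which agent takes it from the other.
          random.seed(seed)
          wealth = [1.0] * n
          for _ in range(steps):
              i, j = random.randrange(n), random.randrange(n)
              if i == j:
                  continue
              stake = invest_frac * min(wealth[i], wealth[j])
              if random.random() < 0.5:
                  wealth[i] += stake; wealth[j] -= stake
              else:
                  wealth[i] -= stake; wealth[j] += stake
          wealth.sort()
          return sum(wealth[-n // 10:]) / sum(wealth)

      print("share of total wealth held by the richest 10%:", round(simulate(), 2))

    Raising invest_frac widens the long-run gap between the richest and poorest agents, the qualitative effect the abstract attributes to the invested fraction of transactions.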

  2. NISAC Agent Based Laboratory for Economics

    SciTech Connect

    Downes, Paula; Davis, Chris; Eidson, Eric; Ehlen, Mark; Gieseler, Charles; Harris, Richard

    2006-10-11

    The software provides large-scale microeconomic simulation of complex economic and social systems (such as supply chain and market dynamics of businesses in the US economy) and their dependence on physical infrastructure systems. The system is based on agent simulation, where each entity of interest in the system to be modeled (for example, a bank, individual firms, consumer households, etc.) is specified in a data-driven sense to be individually represented by an agent. The agents interact using rules of interaction appropriate to their roles, and through those interactions complex economic and social dynamics emerge. The software is implemented in three tiers: a Java-based visualization client, a C++ control mid-tier, and a C++ computational tier.

  3. Simulation of reliability in multiserver computer networks

    NASA Astrophysics Data System (ADS)

    Minkevičius, Saulius

    2012-11-01

    The reliability performance of multiserver computer networks motivates this paper. A probability limit theorem on the extreme queue length in open multiserver queueing networks in heavy traffic is derived and applied to a reliability model for multiserver computer networks in which the time to failure of the network is related to the system parameters.

  4. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.; Ziegler, C.

    1983-01-01

    A software simulator was developed to help NASA in the design of the LMSS. The simulator will be used to study the characteristics and implementation requirements of the LMSS configuration, with specifications as outlined by NASA.

  5. How Effective Is Instructional Support for Learning with Computer Simulations?

    ERIC Educational Resources Information Center

    Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute

    2013-01-01

    The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…

  6. New Pedagogies on Teaching Science with Computer Simulations

    ERIC Educational Resources Information Center

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  7. Computer Simulation (Microcultures): An Effective Model for Multicultural Education.

    ERIC Educational Resources Information Center

    Nelson, Jorge O.

    This paper presents a rationale for using high-fidelity computer simulation in planning for and implementing effective multicultural education strategies. Using computer simulation, educators can begin to understand and plan for the concept of cultural sensitivity in delivering instruction. The model promises to emphasize teachers' understanding…

  8. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    ERIC Educational Resources Information Center

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  9. The Role of Computer Simulations in Engineering Education.

    ERIC Educational Resources Information Center

    Smith, P. R.; Pollard, D.

    1986-01-01

    Discusses role of computer simulation in complementing and extending conventional components of undergraduate engineering education process in United Kingdom universities and polytechnics. Aspects of computer-based learning are reviewed (laboratory simulation, lecture and tutorial support, inservice teacher education) with reference to programs in…

  10. Reducing the complexity of an agent-based local heroin market model.

    PubMed

    Heard, Daniel; Bobashev, Georgiy V; Morris, Robert J

    2014-01-01

    This project explores techniques for reducing the complexity of an agent-based model (ABM). The analysis involved a model developed from the ethnographic research of Dr. Lee Hoffer in the Larimer area heroin market, which involved drug users, drug sellers, homeless individuals and police. The authors used statistical techniques to create a reduced version of the original model which maintained simulation fidelity while reducing computational complexity. This involved identifying key summary quantities of individual customer behavior as well as overall market activity and replacing some agents with probability distributions and regressions. The model was then extended to allow external market interventions in the form of police busts. Extensions of this research perspective, as well as its strengths and limitations, are discussed.

  11. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  12. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
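
    The discrete event core of such a model can be sketched as a queueing simulation (the arrival and service rates and the server counts below are made up, and this M/M/c-style toy stands in for the two-part demand and resource-constraint model described above):

      import random

      def mean_response_time(n_servers=4, arrival_rate=3.0, service_rate=1.0,
                             n_requests=10000, seed=1):
          # FCFS multi-server queue via the earliest-free-server recursion:
          # requests arrive as a Poisson process and hold a server for an
          # exponentially distributed service time.
          random.seed(seed)
          t, free_at, total = 0.0, [0.0] * n_servers, 0.0
          for _ in range(n_requests):
              t += random.expovariate(arrival_rate)         # next arrival time
              k = min(range(n_servers), key=free_at.__getitem__)
              start = max(t, free_at[k])                    # wait if all servers busy
              free_at[k] = start + random.expovariate(service_rate)
              total += free_at[k] - t                       # response = wait + service
          return total / n_requests

      for servers in (4, 8):
          print(servers, "servers -> mean response",
                round(mean_response_time(servers), 2), "time units")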

  13. Techniques and Issues in Agent-Based Modeling Validation

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui

    2012-01-01

    Validation of simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. Agent-based models (ABM) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. However, researchers have only recently started to consider the issues of validation. Compared to other simulation models, ABM has many differences in model development, usage and validation. An ABM is inherently easier to build than a classical simulation, but more difficult to describe formally since it is closer to human cognition. Using multi-agent models to study complex systems has attracted criticisms because of the challenges involved in their validation [1]. In this report, we describe the challenge of ABM validation and present a novel approach we recently developed for an ABM system.

  14. Case Studies in Computer Adaptive Test Design through Simulation.

    ERIC Educational Resources Information Center

    Eignor, Daniel R.; And Others

    The extensive computer simulation work done in developing the computer adaptive versions of the Graduate Record Examinations (GRE) Board General Test and the College Board Admissions Testing Program (ATP) Scholastic Aptitude Test (SAT) is described in this report. Both the GRE General and SAT computer adaptive tests (CATs), which are fixed length…

  15. Interactive Electronic Circuit Simulation on Small Computer Systems

    DTIC Science & Technology

    1979-11-01

    Fragmentary record text: the report argues that interactive circuit simulation is the most effective way of completing a computer-aided engineering design cycle, compares interactive versus batch simulation, notes that the BIAS-D circuit simulator (a BASIC version and an ensuing FORTRAN version) can run on almost any small computer system with few if any modifications, and includes the four benchmark test circuits used in the comparisons.

  16. Proceedings 3rd NASA/IEEE Workshop on Formal Approaches to Agent-Based Systems (FAABS-III)

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael (Editor); Rash, James (Editor); Truszkowski, Walt (Editor); Rouff, Christopher (Editor)

    2004-01-01

    These proceedings contain 18 papers and 4 poster presentations, covering topics such as multi-agent systems, agent-based control, formalisms, and norms, as well as physical and biological models of agent-based systems. Applications presented in the proceedings include systems analysis, software engineering, computer networks, and robot control.

  17. Digital computer simulation of synthetic aperture systems and images

    NASA Astrophysics Data System (ADS)

    Camporeale, Claudio; Galati, Gaspare

    1991-06-01

    Digital computer simulation is a powerful tool for the design, mission planning, and image quality analysis of advanced SAR systems. 'End-to-end' simulators describe the whole SAR imaging process, including the generation of the coherent echoes and their processing, and, unlike 'product simulators', allow one to evaluate the effects of the various impairments on the final image. The main disadvantage of the 'end-to-end' approach, as described in this paper, is its heavy computational burden; therefore, a new type of simulator is presented which attempts to reduce this burden while offering a greater degree of completeness and realism than the existing SAR product simulators.

  18. Creating Science Simulations through Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Basawapatna, Ashok Ram

    2012-01-01

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

  19. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  20. A Review of Computer Simulations in Teacher Education

    ERIC Educational Resources Information Center

    Bradley, Elizabeth Gates; Kendall, Brittany

    2014-01-01

    Computer simulations can provide guided practice for a variety of situations that pre-service teachers would not frequently experience during their teacher education studies. Pre-service teachers can use simulations to turn the knowledge they have gained in their coursework into real experience. Teacher simulation training has come a long way over…

  1. Computer Simulations as an Integral Part of Intermediate Macroeconomics.

    ERIC Educational Resources Information Center

    Millerd, Frank W.; Robertson, Alastair R.

    1987-01-01

    Describes the development of two interactive computer simulations which were fully integrated with other course materials. The simulations illustrate the effects of various real and monetary "demand shocks" on aggregate income, interest rates, and components of spending and economic output. Includes an evaluation of the simulations'…

  2. Genetic crossing vs cloning by computer simulation

    SciTech Connect

    Dasgupta, S.

    1997-06-01

    We perform Monte Carlo simulations using Penna's bit-string model and compare the process of asexual reproduction by cloning with that of reproduction by genetic crossover. We find the two to be comparable with regard to the survival of a species, both under normal conditions and when a natural disaster is simulated.

  3. Spatial Learning and Computer Simulations in Science

    ERIC Educational Resources Information Center

    Lindgren, Robb; Schwartz, Daniel L.

    2009-01-01

    Interactive simulations are entering mainstream science education. Their effects on cognition and learning are often framed by the legacy of information processing, which emphasized amodal problem solving and conceptual organization. In contrast, this paper reviews simulations from the vantage of research on perception and spatial learning,…

  4. Computer formulations of aircraft models for simulation studies

    NASA Technical Reports Server (NTRS)

    Howard, J. C.

    1979-01-01

    Recent developments in formula manipulation compilers and the design of several symbol manipulation languages enable computers to be used for symbolic mathematical computation. A computer system and language that can be used to perform symbolic manipulations in an interactive mode are used to formulate a mathematical model of an aeronautical system. The example demonstrates that, once the procedure is established, the formulation and modification of models for simulation studies can be reduced to a series of routine computer operations.
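
    As a generic illustration of symbolic model formulation of the kind the report describes (not the aeronautical model actually used), a computer algebra system can derive and then routinely modify a simple equation of motion; the point-mass drag example and the use of SymPy here are assumptions.

        # Symbolic formulation and routine modification of a simple dynamic model.
        # The point-mass drag model and the choice of SymPy are assumptions, not
        # the report's own system or equations.
        import sympy as sp

        t = sp.symbols('t')
        m, g, k = sp.symbols('m g k', positive=True)
        v = sp.Function('v')

        # Point mass falling with linear drag: m*v' = m*g - k*v
        eom = sp.Eq(m * v(t).diff(t), m * g - k * v(t))
        sol = sp.dsolve(eom, v(t), ics={v(0): 0})
        print(sp.simplify(sol.rhs))             # m*g/k * (1 - exp(-k*t/m))

        # Routine "modification" step: swap in quadratic drag and re-derive
        eom2 = sp.Eq(m * v(t).diff(t), m * g - k * v(t) ** 2)
        print(sp.dsolve(eom2, v(t)))            # general solution of the new model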

  5. Radiotherapy Monte Carlo simulation using cloud computing technology.

    PubMed

    Poole, C M; Cornelius, I; Trapp, J V; Langton, C M

    2012-12-01

    Cloud computing allows vast computational resources to be leveraged quickly and easily in bursts as and when required. Here we describe a technique that allows Monte Carlo radiotherapy dose calculations to be performed using GEANT4 and executed in the cloud, with relative simulation cost and completion time evaluated as a function of machine count. As expected, simulation completion time decreases as 1/n for n parallel machines, and relative simulation cost is found to be optimal where n is a factor of the total simulation time in hours. Using the technique, we demonstrate the potential usefulness of cloud computing as a solution for rapid Monte Carlo simulation for radiotherapy dose calculation without the need for dedicated local computer hardware, as a proof of principle.
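
    A minimal Python sketch of the 1/n completion-time and per-hour billing behaviour described above; the hourly rate and serial runtime used here are hypothetical, not figures from the study.

        # Hedged sketch of the 1/n completion-time and hourly-billing cost model
        # described above. The serial runtime and hourly rate are hypothetical.
        import math

        def cloud_cost(total_hours=53.0, rate_per_hour=0.10, max_nodes=20):
            """Return (nodes, wall_time, billed_cost) under per-hour billing."""
            rows = []
            for n in range(1, max_nodes + 1):
                wall_time = total_hours / n              # embarrassingly parallel: t ~ 1/n
                billed = n * math.ceil(wall_time) * rate_per_hour  # each node billed per started hour
                rows.append((n, wall_time, billed))
            return rows

        for n, t, c in cloud_cost():
            print(f"{n:2d} nodes: {t:5.2f} h wall time, ${c:.2f} billed")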

  6. Methodology of modeling and measuring computer architectures for plasma simulations

    NASA Technical Reports Server (NTRS)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and the difficulties encountered on currently available computers is given. Through the use of an analyzing and measuring methodology, SARA, the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization can be configured to achieve the computation bound of an application problem. A sequential simulation model, an array/pipeline simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems of an implicitly parallel nature.

  7. Computer-Based Simulation for Man-Computer System Design,

    DTIC Science & Technology

    1980-02-01

    simulations to investigate human factors and crew size (2). The experiment design was a three- ... problems posed by man-computer interactions in proposed ...

  8. High Fidelity Simulation of a Computer Room

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim; Chan, William; Chaderjian, Neal; Pandya, Shishir

    2005-01-01

    This viewgraph presentation reviews NASA's Columbia supercomputer and the mesh technology used to test the adequacy of the air flow and cooling of a computer room. A technical description of the Columbia supercomputer is also presented, along with its performance capability.

  9. Some theoretical issues on computer simulations

    SciTech Connect

    Barrett, C.L.; Reidys, C.M.

    1998-02-01

    The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In developing the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among the entities of a simulation and a structure representing the equivalence classes of systems obtained under all possible updates.

  10. Computer simulation results of attitude estimation of earth orbiting satellites

    NASA Technical Reports Server (NTRS)

    Kou, S. R.

    1976-01-01

    Computer simulation results of attitude estimation of Earth-orbiting satellites (including the Space Telescope) subjected to environmental disturbances and noises are presented. A decomposed linear recursive filter and a Kalman filter were used as estimation tools. Six programs were developed for this simulation; all were written in the BASIC language and were run on HP 9830A and HP 9866A computers. Simulation results show that the decomposed linear recursive filter is accurate in estimation and fast in response time. Furthermore, for higher-order systems, this filter has computational advantages (i.e., smaller integration and roundoff errors) over a Kalman filter.
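
    For illustration of the kind of recursive estimation compared in this record, a minimal scalar Kalman filter is sketched below; it is not the decomposed linear recursive filter of the report, and all signal and noise parameters are hypothetical.

        # Minimal scalar Kalman filter, for illustration only; the report's
        # decomposed linear recursive filter is not reproduced here.
        import random

        def kalman_1d(measurements, q=1e-4, r=0.04, x0=0.0, p0=1.0):
            x, p = x0, p0
            estimates = []
            for z in measurements:
                p = p + q                      # predict (random-walk state model)
                k = p / (p + r)                # Kalman gain
                x = x + k * (z - x)            # update with measurement z
                p = (1.0 - k) * p
                estimates.append(x)
            return estimates

        true_attitude = 0.5                     # hypothetical constant attitude angle (rad)
        zs = [true_attitude + random.gauss(0.0, 0.2) for _ in range(50)]
        print(kalman_1d(zs)[-1])                # estimate should approach 0.5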

  11. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assessed the performance of the cooling system, identified deficiencies, and recommended modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools, to generate a geometric model of the computer room, and the OVERFLOW-2 code, for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses of any modern computer room.

  12. Super-computer simulation for galaxy formation

    NASA Astrophysics Data System (ADS)

    Jing, Yipeng

    2001-06-01

    Numerical simulations are widely used in the studies of galaxy formation. Here we briefly review their important role in the galaxy formation research, their relations with analytical models, and their limitations as well. Then a progress report is given about our collaboration with a group in the University of Tokyo, including the simulation samples we have obtained, some of the results we have published, and the joint projects which are in progress.

  13. Computer simulation of water reclamation processors

    NASA Technical Reports Server (NTRS)

    Fisher, John W.; Hightower, T. M.; Flynn, Michael T.

    1991-01-01

    The development of detailed simulation models of water reclamation processors based on the ASPEN PLUS simulation program is discussed. Individual models have been developed for vapor compression distillation, vapor phase catalytic ammonia removal, and supercritical water oxidation. These models are used for predicting the process behavior. Particular attention is given to methodology which is used to complete this work, and the insights which are gained by this type of model development.

  14. Computer simulation of current voltage response of electrocatalytic sensor

    NASA Astrophysics Data System (ADS)

    Jasinski, Piotr; Jasinski, Grzegorz; Chachulski, Bogdan; Nowakowski, Antoni

    2003-09-01

    In the present paper, results of a computer simulation of cyclic voltammetry applied to an electrocatalytic solid-state sensor are presented. The computer software, developed by D. Gosser, is based on an explicit finite-difference method. The software was written for the simulation of cyclic voltammetry experiments in liquid electrochemistry; however, because it is based on general electrochemical rules, it may also be used to simulate experiments in solid-state electrochemistry. The electrocatalytic sensor does not have a reference electrode, and it is therefore necessary to introduce a virtual reference electrode into the model of the sensor. Data obtained from the simulation are similar to the measured data, which confirms the correctness of the assumed sensing mechanism.

  15. Two inviscid computational simulations of separated flow about airfoils

    NASA Technical Reports Server (NTRS)

    Barnwell, R. W.

    1976-01-01

    Two inviscid computational simulations of separated flow about airfoils are described. The basic computational method is the line relaxation finite-difference method. Viscous separation is approximated with inviscid free-streamline separation. The point of separation is specified, and the pressure in the separation region is calculated. In the first simulation, the empiricism of constant pressure in the separation region is employed. This empiricism is easier to implement with the present method than with singularity methods. In the second simulation, acoustic theory is used to determine the pressure in the separation region. The results of both simulations are compared with experiment.

  16. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include a computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focussed on upgrading simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  17. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1981-01-01

    A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used: the algorithm was developed in an ALGOL-like pseudo-language and then encoded in FORTRAN IV. The basic input to the system is a sine-wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all possible combinations of call types and modes through the use of five communication scenarios: single-hop systems; double-hop, single-gateway systems; double-hop, double-gateway systems; mobile-to-wireline systems; and wireline-to-mobile systems. The simulation of the transmitter, fading channel, and interference source is also discussed.

  18. Emergence of a Snake-Like Structure in Mobile Distributed Agents: An Exploratory Agent-Based Modeling Approach

    PubMed Central

    Niazi, Muaz A.

    2014-01-01

    The body structure of snakes is composed of numerous natural components, making it resilient, flexible, adaptive, and dynamic. In contrast, current computer animations, as well as physical implementations of snake-like autonomous structures, are typically designed to use either a single component or a relatively small number of components. As a result, not only are these artificial structures constrained by the dimensions of the constituent components, they often also require relatively computationally intensive algorithms to model and animate. Still, these animations often lack life-like resilience and adaptation. This paper presents a solution to the problem of modeling snake-like structures by proposing an agent-based, self-organizing algorithm that results in an emergent and surprisingly resilient dynamic structure involving a minimum of interagent communication. Extensive simulation experiments demonstrate the effectiveness as well as the resilience of the proposed approach. The ideas originating from the proposed algorithm can not only be used for developing self-organizing animations but can also have practical applications, such as complex, autonomous, evolvable robots with self-organizing mobile components having minimal individual computational capabilities. The work also demonstrates the utility of exploratory agent-based modeling (EABM) in the engineering of artificial life-like complex adaptive systems. PMID:24701135
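
    The paper's algorithm itself is not reproduced in this record; the following minimal sketch only illustrates, under invented parameters, the general idea of agents self-organizing into a chain with little interagent communication: each agent adjusts its position toward its predecessor to hold a preferred spacing.

        # Generic follow-the-predecessor chain, for illustration only; this is
        # not the algorithm proposed in the paper above. Parameters are invented.
        import math

        def step(agents, spacing=1.0, gain=0.2):
            new = [agents[0]]                          # head agent held fixed in this toy version
            for i in range(1, len(agents)):
                x, y = agents[i]
                px, py = new[i - 1]
                dx, dy = px - x, py - y
                d = math.hypot(dx, dy) or 1e-9
                pull = gain * (d - spacing) / d        # move only enough to restore preferred spacing
                new.append((x + pull * dx, y + pull * dy))
            return new

        agents = [(float(i) * 3.0, 0.0) for i in range(6)]   # initially stretched out
        for _ in range(200):
            agents = step(agents)
        print([(round(x, 2), round(y, 2)) for x, y in agents])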

  19. Emergence of a snake-like structure in mobile distributed agents: an exploratory agent-based modeling approach.

    PubMed

    Niazi, Muaz A

    2014-01-01

    The body structure of snakes is composed of numerous natural components, making it resilient, flexible, adaptive, and dynamic. In contrast, current computer animations, as well as physical implementations of snake-like autonomous structures, are typically designed to use either a single component or a relatively small number of components. As a result, not only are these artificial structures constrained by the dimensions of the constituent components, they often also require relatively computationally intensive algorithms to model and animate. Still, these animations often lack life-like resilience and adaptation. This paper presents a solution to the problem of modeling snake-like structures by proposing an agent-based, self-organizing algorithm that results in an emergent and surprisingly resilient dynamic structure involving a minimum of interagent communication. Extensive simulation experiments demonstrate the effectiveness as well as the resilience of the proposed approach. The ideas originating from the proposed algorithm can not only be used for developing self-organizing animations but can also have practical applications, such as complex, autonomous, evolvable robots with self-organizing mobile components having minimal individual computational capabilities. The work also demonstrates the utility of exploratory agent-based modeling (EABM) in the engineering of artificial life-like complex adaptive systems.

  20. An Exercise in Biometrical Genetics Based on a Computer Simulation.

    ERIC Educational Resources Information Center

    Murphy, P. J.

    1983-01-01

    Describes an exercise in biometrical genetics based on the noninteractive use of a computer simulation of a wheat hybridization program. Advantages of using the material in this way are also discussed. (Author/JN)

  1. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix.
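
    The tutorial's own examples are written in MATLAB and R; a rough Python analogue of the embarrassingly parallel pattern it describes (independent replications farmed out to worker processes) is sketched below with a toy, hypothetical simulation function.

        # Hedged Python analogue of the embarrassingly parallel pattern the
        # tutorial discusses (the article's own examples are in MATLAB and R).
        # The toy "simulation" below is hypothetical.
        import random
        from multiprocessing import Pool

        def one_replication(seed):
            rng = random.Random(seed)
            # stand-in for one expensive risk-analysis simulation run
            return sum(rng.gauss(0.0, 1.0) for _ in range(10_000))

        if __name__ == "__main__":
            seeds = range(100)                 # independent replications -> trivially parallel
            with Pool(processes=4) as pool:
                results = pool.map(one_replication, seeds)
            print(len(results), sum(results) / len(results))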

  2. MINEXP, A Computer-Simulated Mineral Exploration Program

    ERIC Educational Resources Information Center

    Smith, Michael J.; And Others

    1978-01-01

    This computer simulation is designed to put students into a realistic decision making situation in mineral exploration. This program can be used with different exploration situations such as ore deposits, petroleum, ground water, etc. (MR)

  3. A Computing Cluster for Numerical Simulation

    DTIC Science & Technology

    2006-10-23

    ... "Contact and Friction for Cloth Animation", SIGGRAPH 2002, ACM TOG 21, 594-603 (2002). [BHTF] Bao, Z., Hong, J.-M., Teran, J. and Fedkiw, R., "... Simulation of Large Bodies of Water by Coupling Two and Three Dimensional Techniques", SIGGRAPH 2006, ACM TOG 25, 805-811 (2006). [ITF] Irving, G., Teran, ... O'Brien (2006). [TSBNLF] Teran, J., Sifakis, E., Blemker, S., Ng Thow Hing, V., Lau, C. and Fedkiw, R., "Creating and Simulating Skeletal Muscle from the ...

  4. An agent-based approach to financial stylized facts

    NASA Astrophysics Data System (ADS)

    Shimokawa, Tetsuya; Suzuki, Kyoko; Misawa, Tadanobu

    2007-06-01

    An important challenge for financial theory in recent years is to construct more sophisticated models that are consistent with as many as possible of the financial stylized facts that cannot be explained by traditional models. Recently, psychological studies on decision making under uncertainty, originating in Kahneman and Tversky's research, have attracted a great deal of interest as key factors for explaining these stylized facts. These psychological results have been applied to the theory of investors' decision making and to financial equilibrium modeling. Following these behavioral-finance studies, this paper proposes an agent-based equilibrium model with prospect-theoretic features of investors. Our goal is to point out the possibility that the loss-averse feature of investors explains a vast number of financial stylized facts and plays a crucial role in the price formation of financial markets. The price process endogenously generated by our model is consistent not only with the equity premium puzzle and the volatility puzzle, but also with high kurtosis, asymmetry of the return distribution, autocorrelation of return volatility, and cross-correlation between return volatility and trading volume. Moreover, using agent-based simulations, the paper also provides a rigorous explanation, from the viewpoint of a lack of market liquidity, of the size effect: that small-sized stocks enjoy excess returns compared to large-sized stocks.

  5. Agent based modeling in tactical wargaming

    NASA Astrophysics Data System (ADS)

    James, Alex; Hanratty, Timothy P.; Tuttle, Daniel C.; Coles, John B.

    2016-05-01

    Army staffs at division, brigade, and battalion levels often plan for contingency operations. As such, analysts consider the impact and potential consequences of actions taken. The Army Military Decision-Making Process (MDMP) dictates identification and evaluation of possible enemy courses of action; however, non-state actors often do not exhibit the same level and consistency of planned actions that the MDMP was originally designed to anticipate. The fourth MDMP step is a particular challenge, wargaming courses of action within the context of complex social-cultural behaviors. Agent-based Modeling (ABM) and its resulting emergent behavior is a potential solution to model terrain in terms of the human domain and improve the results and rigor of the traditional wargaming process.

  6. Computational Simulation of Explosively Generated Pulsed Power Devices

    DTIC Science & Technology

    2013-03-21

    Computational Simulation of Explosively Generated Pulsed Power Devices. Thesis by Mollie C. Drumm, BS, Captain, USAF; report AFIT-ENY-13-M-11, Department of the Air Force ... Approved: Dr. Robert B. Greendyke (Chairman) and Capt. David Liu.

  7. Computer Simulations of Canada’s RADARSAT2 GMTI

    DTIC Science & Technology

    2000-10-01

    Computer Simulations of Canada's RADARSAT2 GMTI (DTIC compilation part ADP010837). Shen Chiu and Chuck Livingstone, Space Systems and Technology Section, Defence ...; ... Associates Ltd., 13800 Commerce Parkway, Richmond, B.C., Canada V6V 2J3. Abstract: The detection probability and the estimation accuracy of Canada's RADARSAT2 ...

  8. GATE Monte Carlo simulation in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Rowedder, Blake Austin

    The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform that provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high-performance cluster. This study investigated the reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data were initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to fall in price and become more accessible, implementing Monte Carlo techniques with cloud computing for clinical applications will become increasingly attractive.
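
    The inverse-power runtime model mentioned above can be fitted with ordinary least squares on log-log axes; the node counts and timings below are synthetic placeholders, not the study's benchmark data.

        # Fit the inverse-power runtime model t(n) = a * n**(-b) by linear
        # regression on log-log axes. Timings are synthetic, not from the study.
        import math

        nodes   = [1, 2, 4, 8, 16, 20]
        minutes = [53.0, 27.5, 14.1, 7.3, 3.9, 3.1]      # placeholder measurements

        xs = [math.log(n) for n in nodes]
        ys = [math.log(t) for t in minutes]
        n = len(xs)
        sx, sy = sum(xs), sum(ys)
        sxx = sum(x * x for x in xs)
        sxy = sum(x * y for x, y in zip(xs, ys))
        slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        intercept = (sy - slope * sx) / n

        a, b = math.exp(intercept), -slope
        print(f"t(n) ~ {a:.1f} * n^(-{b:.2f}) minutes")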

  9. Computer simulation of gamma-ray spectra from semiconductor detectors

    NASA Astrophysics Data System (ADS)

    Lund, Jim C.; Olschner, Fred; Shah, Kanai S.

    1992-12-01

    Traditionally, researchers developing improved gamma-ray detectors have used analytical techniques or, rarely, computer simulations to predict the performance of new detectors. However, with the advent of inexpensive personal computers, it is now possible for virtually all detector researchers to perform some form of numerical computation to predict detector performance. Although general-purpose code systems for semiconductor detector performance do not yet exist, it is possible to perform many useful calculations using commercially available, general-purpose numerical software packages (such as 'spreadsheet' programs intended for business use). With a knowledge of the rudimentary mechanics of detector simulation, most researchers, including those with no programming skills, can effectively use numerical simulation methods to predict gamma-ray detector performance. In this paper we discuss the details of the numerical simulation of gamma-ray detectors with the hope of communicating the simplicity and effectiveness of these methods. In particular, we discuss the steps involved in simulating the pulse-height spectrum produced by a semiconductor detector.
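
    In the spirit of the spreadsheet-level calculations the authors describe, a toy pulse-height spectrum can be generated by sampling deposited energies and smearing them with a Gaussian detector response; the branching fraction, resolution, and source energy below are illustrative assumptions, not detector data.

        # Toy pulse-height spectrum: full-energy peak plus a flat stand-in for
        # the Compton continuum, smeared by Gaussian resolution. All numbers
        # (branching fraction, FWHM, source energy) are illustrative.
        import random

        def simulate_spectrum(e_gamma=662.0, n_events=100_000,
                              full_energy_fraction=0.3, fwhm=10.0, bin_width=2.0):
            sigma = fwhm / 2.355
            nbins = int(e_gamma * 1.2 / bin_width)
            hist = [0] * nbins
            for _ in range(n_events):
                if random.random() < full_energy_fraction:
                    e = e_gamma                              # photopeak event
                else:
                    e = random.uniform(0.0, e_gamma * 0.75)  # crude continuum
                e += random.gauss(0.0, sigma)                # statistical/electronic broadening
                b = int(e / bin_width)
                if 0 <= b < nbins:
                    hist[b] += 1
            return hist

        spectrum = simulate_spectrum()
        print(max(range(len(spectrum)), key=spectrum.__getitem__) * 2.0, "keV peak")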

  10. Computer Simulation of Classic Studies in Psychology.

    ERIC Educational Resources Information Center

    Bradley, Drake R.

    This paper describes DATASIM, a comprehensive software package which generates simulated data for actual or hypothetical research designs. DATASIM is primarily intended for use in statistics and research methods courses, where it is used to generate "individualized" datasets for students to analyze, and later to correct their answers.…

  11. Bodies Falling with Air Resistance: Computer Simulation.

    ERIC Educational Resources Information Center

    Vest, Floyd

    1982-01-01

    Two models are presented. The first assumes that air resistance is proportional to the velocity of the falling body. The second assumes that air resistance is proportional to the square of the velocity. A program written in BASIC that simulates the second model is presented. (MP)
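
    The article's second model (drag proportional to the square of the velocity) is easily restated outside BASIC; a minimal Python sketch with an invented drag constant:

        # Euler integration of a falling body with drag proportional to v**2,
        # mirroring the article's second model. The drag constant is illustrative.
        g = 9.81        # m/s^2
        k = 0.25        # drag coefficient per unit mass (1/m), hypothetical
        dt = 0.01

        v, t = 0.0, 0.0
        while t < 10.0:
            a = g - k * v * v          # quadratic air resistance opposes motion
            v += a * dt
            t += dt

        print(f"speed after 10 s: {v:.2f} m/s (terminal speed = {(g / k) ** 0.5:.2f} m/s)")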

  12. Advanced Simulation and Computing Business Plan

    SciTech Connect

    Rummel, E.

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration's (NNSA's) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of the programmatic technical staff responsible for building the simulation tools of the nuclear security complex. It also describes how the ASC Program engages with the industry partners upon whom the ASC Program relies for today's and tomorrow's high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  13. The Forward Observer Personal Computer Simulator (FOPCSIM)

    DTIC Science & Technology

    2002-09-01

    Environment (DVTE) (CD-ROM). Produced by Andy Jackson through the Combat Visual Information Center, Marine Corps Base, Quantico, Virginia. 19 Dylan ... part of VIRTE's forward observer training simulation. 20 LCDR Dylan Schmorrow (USN), Virtual ... load the conversion data. There are software applications available to rapidly generate terrain from satellite images, such as the Evans and ...

  14. Quantum chemistry simulation on quantum computers: theories and experiments.

    PubMed

    Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng

    2012-07-14

    It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomial scaling. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with an exponential growth in required resources as the size of the quantum system increases. Quantum computers avoid this problem and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and developments in both theory and experiment. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation applied to quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, on a small quantum computer, which include the evaluation of static molecular eigenenergies and the simulation of chemical reaction dynamics. Although experimental development still lags behind theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, surpassing classical computations.

  15. Launch Site Computer Simulation and its Application to Processes

    NASA Technical Reports Server (NTRS)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  16. Monte Carlo simulations on SIMD computer architectures

    SciTech Connect

    Burmester, C.P.; Gronsky, R.; Wille, L.T.

    1992-03-01

    Algorithmic considerations regarding the implementation of various materials-science applications of the Monte Carlo technique on single-instruction multiple-data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest-neighbor, next-nearest-neighbor, and long-range screened Coulomb interactions on the SIMD-architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor-array use and minimize inter-processor communication are presented, including lattice partitioning and the use of processor-array spanning-tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures.
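
    The lattice-partitioning strategy described above survives in vectorized form on modern hardware; a minimal sketch of a checkerboard Metropolis sweep for the nearest-neighbour Ising model is given below, with arbitrary lattice size and temperature (a generic illustration, not the authors' MasPar code).

        # Checkerboard (two-colour) Metropolis sweep for the nearest-neighbour
        # Ising model: each sub-lattice updates in parallel, echoing the lattice
        # partitioning used on SIMD machines. Parameters are arbitrary.
        import numpy as np

        rng = np.random.default_rng(0)
        L, beta = 64, 0.44
        spins = rng.choice([-1, 1], size=(L, L))
        mask = (np.add.outer(np.arange(L), np.arange(L)) % 2).astype(bool)

        def sweep(spins):
            for color in (mask, ~mask):
                nbr = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
                       np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
                dE = 2.0 * spins * nbr                       # energy change if flipped
                flip = (rng.random((L, L)) < np.exp(-beta * dE)) & color
                spins = np.where(flip, -spins, spins)
            return spins

        for _ in range(200):
            spins = sweep(spins)
        print("magnetisation per spin:", abs(spins.mean()))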

  17. Computer Simulation of the Beating Human Heart

    NASA Astrophysics Data System (ADS)

    Peskin, Charles S.; McQueen, David M.

    2001-06-01

    The mechanical function of the human heart couples together the fluid mechanics of blood and the soft tissue mechanics of the muscular heart walls and flexible heart valve leaflets. We discuss a unified mathematical formulation of this problem in which the soft tissue looks like a specialized part of the fluid in which additional forces are applied. This leads to a computational scheme known as the Immersed Boundary (IB) method for solving the coupled equations of motion of the whole system. The IB method is used to construct a three-dimensional Virtual Heart, including representations of all four chambers of the heart and all four valves, in addition to the large arteries and veins that connect the heart to the rest of the circulation. The chambers, valves, and vessels are all modeled as collections of elastic (and where appropriate, actively contractile) fibers immersed in viscous incompressible fluid. Results are shown as a computer-generated video animation of the beating heart.

  18. COFLO: A Computer Aid for Teaching Ecological Simulation.

    ERIC Educational Resources Information Center

    Levow, Roy B.

    A computer-assisted course was designed to provide students with an understanding of modeling and simulation techniques in quantitiative ecology. It deals with continuous systems and has two segments. One develops mathematical and computer tools, beginning with abstract systems and their relation to physical systems. Modeling principles are next…

  19. Application Of Computer Simulation To The Entertainment Industry

    NASA Astrophysics Data System (ADS)

    Mittelman, Phillip S.

    1983-10-01

    Images generated by computer have started to appear in feature films (TRON, Star Trek II), in television commercials and in animated films. Of particular interest is the use of computer generated imagery which simulates the images which a real camera might have made if the imaged objects had been real.

  20. Use of Computer Simulations in Microbial and Molecular Genetics.

    ERIC Educational Resources Information Center

    Wood, Peter

    1984-01-01

    Describes five computer programs: four simulations of genetic and physical mapping experiments and one interactive learning program on the genetic coding mechanism. The programs were originally written in BASIC for the VAX-11/750 V.3. mainframe computer and have been translated into Applesoft BASIC for Apple IIe microcomputers. (JN)

  1. Evaluation of a Computer Simulation in a Therapeutics Case Discussion.

    ERIC Educational Resources Information Center

    Kinkade, Raenel E.; And Others

    1995-01-01

    A computer program was used to simulate a case presentation in pharmacotherapeutics. Students (n=24) used their knowledge of the disease (glaucoma) and various topical agents on the computer program's formulary to "treat" the patient. Comparison of results with a control group found the method as effective as traditional case…

  2. Cardiovascular Physiology Teaching: Computer Simulations vs. Animal Demonstrations.

    ERIC Educational Resources Information Center

    Samsel, Richard W.; And Others

    1994-01-01

    At the introductory level, the computer provides an effective alternative to using animals for laboratory teaching. Computer software can simulate the operation of multiple organ systems. Advantages of software include alteration of variables that are not easily changed in vivo, repeated interventions, and cost-effective hands-on student access.…

  3. Teaching Macroeconomics with a Computer Simulation. Final Report.

    ERIC Educational Resources Information Center

    Dolbear, F. Trenery, Jr.

    The study of macroeconomics--the determination and control of aggregative variables such as gross national product, unemployment and inflation--may be facilitated by the use of a computer simulation policy game. An aggregative model of the economy was constructed and programed for a computer and (hypothetical) historical data were generated. The…

  4. Coached, Interactive Computer Simulations: A New Technology for Training.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    This paper provides an overview of a prototype simulation-centered intelligent computer-based training (CBT) system--implemented using expert system technology--which provides: (1) an environment in which trainees can learn and practice complex skills; (2) a computer-based coach or mentor to critique performance, suggest improvements, and provide…

  5. Computational Aerothermodynamic Simulation Issues on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; White, Jeffery A.

    2004-01-01

    The synthesis of physical models for gas chemistry and turbulence from the structured grid codes LAURA and VULCAN into the unstructured grid code FUN3D is described. A directionally Symmetric, Total Variation Diminishing (STVD) algorithm and an entropy fix (eigenvalue limiter) keyed to local cell Reynolds number are introduced to improve solution quality for hypersonic aeroheating applications. A simple grid-adaptation procedure is incorporated within the flow solver. Simulations of flow over an ellipsoid (perfect gas, inviscid) and the Shuttle Orbiter (viscous, chemical nonequilibrium), and comparisons to the structured-grid solvers LAURA (cylinder, Shuttle Orbiter) and VULCAN (flat plate), are presented to show current capabilities. The quality of heating in 3D stagnation regions is very sensitive to algorithm options in general, and high-aspect-ratio tetrahedral elements complicate the simulation of high-Reynolds-number viscous flow as compared to locally structured meshes aligned with the flow.

  6. Phase diagram of silica from computer simulation

    NASA Astrophysics Data System (ADS)

    Saika-Voivod, Ivan; Sciortino, Francesco; Grande, Tor; Poole, Peter H.

    2004-12-01

    We evaluate the phase diagram of the “BKS” potential [van Beest, Kramer, and van Santen, Phys. Rev. Lett. 64, 1955 (1990)], a model of silica widely used in molecular dynamics (MD) simulations. We conduct MD simulations of the liquid, and three crystals (β-quartz, coesite, and stishovite) over wide ranges of temperature and density, and evaluate the total Gibbs free energy of each phase. The phase boundaries are determined by the intersection of these free energy surfaces. Not unexpectedly for a classical pair potential, our results reveal quantitative discrepancies between the locations of the BKS and real silica phase boundaries. At the same time, we find that the topology of the real phase diagram is reproduced, confirming that the BKS model provides a satisfactory qualitative description of a silicalike material. We also compare the phase boundaries with the locations of liquid-state thermodynamic anomalies identified in previous studies of the BKS model.
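
    The free-energy-crossing construction used above can be illustrated at fixed pressure with two made-up G(T) branches; the functional forms and coefficients below are placeholders, not BKS results.

        # Locate a phase boundary as the temperature where two (entirely
        # made-up) Gibbs free-energy curves intersect at fixed pressure,
        # mirroring the free-energy-crossing construction described above.
        def g_phase_a(T):            # placeholder: low-entropy, crystal-like branch
            return -10.0 - 0.002 * T

        def g_phase_b(T):            # placeholder: high-entropy, liquid-like branch
            return -8.0 - 0.004 * T

        def boundary(lo=100.0, hi=5000.0, tol=1e-6):
            f = lambda T: g_phase_a(T) - g_phase_b(T)
            while hi - lo > tol:     # bisection on the free-energy difference
                mid = 0.5 * (lo + hi)
                if f(lo) * f(mid) <= 0.0:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)

        print(f"coexistence temperature ~ {boundary():.1f} K (placeholder model)")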

  7. An Agent-Based Model for Studying Child Maltreatment and Child Maltreatment Prevention

    NASA Astrophysics Data System (ADS)

    Hu, Xiaolin; Puddy, Richard W.

    This paper presents an agent-based model that simulates the dynamics of child maltreatment and child maltreatment prevention. The developed model follows the principles of complex systems science and explicitly models a community and its families with multi-level factors and interconnections across the social ecology. This makes it possible to experiment how different factors and prevention strategies can affect the rate of child maltreatment. We present the background of this work and give an overview of the agent-based model and show some simulation results.

  8. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.

    1981-01-01

    A molecular dynamics technique based upon Lennard-Jones type pair interactions is used to investigate time-dependent as well as equilibrium properties. The case study deals with systems containing Si and O atoms. In this case a more involved potential energy function (PEF) is employed and the system is simulated via a Monte-Carlo procedure. This furnishes the equilibrium properties of the system at its interfaces and surfaces as well as in the bulk.

  9. A Computer Simulation of Braitenberg Vehicles

    DTIC Science & Technology

    1991-03-01

    and that have the ability to adapt their behavior, using a learning algorithm developed by Teuvo Kohonen. The vehicle designer is free to select ... learning algorithm, adapting behavior to improve food-finding performance. The initial evaluations failed to provide convincing proof that the simple ... Preface: The purpose of this effort was to simulate simple, biological learning behavior using an artificial neural network to ...

  10. Computer Simulation of Shipboard Electrical Distribution Systems

    DTIC Science & Technology

    1989-06-01

    variable. If used properly, the Euler backward method for integrating differential equations approaches the same solution. Fast modes can also be ... synchronous machines as well as other elements of a power network. EMTP handles stiff systems by using the Euler backward method for integration. In general ... simulations, however, there are three methods that work well. The first is the Euler forward method, which is considered an explicit technique since it ...
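
    The stability contrast between the explicit (forward) and implicit (backward) Euler methods mentioned in this snippet is easy to demonstrate on a single stiff test equation; the decay rate and step size below are arbitrary.

        # Forward vs. backward Euler on the stiff test equation y' = -lam * y.
        # With lam*dt > 2 the explicit method blows up while the implicit one
        # stays stable, which is the behaviour discussed above. Values arbitrary.
        lam, dt, steps = 50.0, 0.1, 20
        y_fwd = y_bwd = 1.0

        for _ in range(steps):
            y_fwd = y_fwd + dt * (-lam * y_fwd)      # explicit: y_{n+1} = (1 - lam*dt) * y_n
            y_bwd = y_bwd / (1.0 + lam * dt)         # implicit: solve y_{n+1} = y_n - dt*lam*y_{n+1}

        print(f"forward Euler : {y_fwd:.3e}")        # diverges, since |1 - lam*dt| = 4 > 1
        print(f"backward Euler: {y_bwd:.3e}")        # decays toward 0, like the true solution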

  11. Computational Simulation of High Energy Density Plasmas

    DTIC Science & Technology

    2009-10-30

    flow. NumerEx used MACH2 to simulate the flow using compressible, inviscid hydrodynamics with the SESAME equations of state. The depth of the ... Figure 1 shows the liner state versus the radius of a collapsing 10 cm tall lithium liner driven by an RLC circuit model of Shiva Star. This work ... the coaxial gun section, and Figure 4 shows the physical state of the plasma just prior to pinch. Figure 5 shows the neutron yield reaching 10^14 in this ...

  12. Computer simulation of a geomagnetic substorm

    NASA Technical Reports Server (NTRS)

    Lyon, J. G.; Brecht, S. H.; Huba, J. D.; Fedder, J. A.; Palmadesso, P. J.

    1981-01-01

    A global two-dimensional simulation of a substormlike process occurring in earth's magnetosphere is presented. The results are consistent with an empirical substorm model - the neutral-line model. Specifically, the introduction of a southward interplanetary magnetic field forms an open magnetosphere. Subsequently, a substorm neutral line forms at about 15 earth radii or closer in the magnetotail, and plasma sheet thinning and plasma acceleration occur. Eventually the substorm neutral line moves tailward toward its presubstorm position.

  13. Computer simulation of the NASA water vapor electrolysis reactor

    NASA Technical Reports Server (NTRS)

    Bloom, A. M.

    1974-01-01

    The water vapor electrolysis (WVE) reactor is a spacecraft waste reclamation system for extended-mission manned spacecraft. The WVE reactor's raw material is water, its product oxygen. A computer simulation of the WVE operational processes provided the data required for an optimal design of the WVE unit. The simulation process was implemented with the aid of a FORTRAN IV routine.

  14. Effectiveness of an Endodontic Diagnosis Computer Simulation Program.

    ERIC Educational Resources Information Center

    Fouad, Ashraf F.; Burleson, Joseph A.

    1997-01-01

    Effectiveness of a computer simulation to teach endodontic diagnosis was assessed using three groups (n=34,32,24) of dental students. All were lectured on diagnosis, pathology, and radiographic interpretation. One group then used the simulation, another had a seminar on the same material, and the third group had no further instruction. Results…

  15. The Design, Development, and Evaluation of an Evaluative Computer Simulation.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    This paper discusses evaluation design considerations for a computer based evaluation simulation developed at the University of Iowa College of Medicine in Cardiology to assess the diagnostic skills of primary care physicians and medical students. The simulation developed allows for the assessment of diagnostic skills of physicians in the…

  16. Computer Simulation of Incomplete-Data Interpretation Exercise.

    ERIC Educational Resources Information Center

    Robertson, Douglas Frederick

    1987-01-01

    Described is a computer simulation that was used to help general education students enrolled in a large introductory geology course. The purpose of the simulation is to learn to interpret incomplete data. Students design a plan to collect bathymetric data for an area of the ocean. Procedures used by the students and instructor are included.…

  17. Investigating the Effectiveness of Computer Simulations for Chemistry Learning

    ERIC Educational Resources Information Center

    Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan

    2012-01-01

    Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…

  18. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

    ERIC Educational Resources Information Center

    Magin, D. J.; Reizes, J. A.

    1990-01-01

    Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

  19. Design Model for Learner-Centered, Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Hawley, Chandra L.; Duffy, Thomas M.

    This paper presents a model for designing computer-based simulation environments within a constructivist framework for the K-12 school setting. The following primary criteria for the development of simulations are proposed: (1) the problem needs to be authentic; (2) the cognitive demand in learning should be authentic; (3) scaffolding supports a…

  20. Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.

    ERIC Educational Resources Information Center

    Daley, Michael; Hillier, Douglas

    1981-01-01

    Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables…

  1. Simulation of Robot Kinematics Using Interactive Computer Graphics.

    ERIC Educational Resources Information Center

    Leu, M. C.; Mahajan, R.

    1984-01-01

    Development of a robot simulation program based on geometric transformation softwares available in most computer graphics systems and program features are described. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…

  2. Computer Simulation of Auxiliary Power Systems.

    DTIC Science & Technology

    1980-03-01

    Report keywords: gas turbine engine, turbine engine computer programs, auxiliary power unit, aircraft engine starter ... printed to that effect. d. Turbines: there are three choices for the turbine configuration (see Figure 2): 1) a one-stage turbine, 2) a two-stage turbine ... Sample output: MAIN COMBUSTION EFF = .99500, DESIGN FUEL FLOW (LB/HR) = 150.00, MAIN COMB FUEL HEATING VALUE AT T4 FOR JP4 = 18400., COMB DISCHARGE TEMP ...

  3. MIA computer simulation test results report. [space shuttle avionics

    NASA Technical Reports Server (NTRS)

    Unger, G. E.

    1974-01-01

    Results of the first noise susceptibility computer simulation tests of the complete MIA receiver analytical model are presented. Computer simulation tests were conducted with both Gaussian and pulse noise inputs. The results of the Gaussian noise tests were compared to results predicted previously and were found to be in substantial agreement. The results of the pulse noise tests will be compared to the results of planned analogous tests in the Data Bus Evaluation Laboratory at a later time. The MIA computer model is considered to be fully operational at this time.

  4. Computer simulations of granular materials: the effects of mesoscopic forces

    NASA Astrophysics Data System (ADS)

    Kohring, G. A.

    1994-12-01

    The problem of the relatively small angles of repose reported by computer simulations of granular materials is discussed. It is shown that this problem can be partially understood as resulting from mesoscopic forces which are commonly neglected in the simulations. After including mesoscopic forces, characterized by the easily measurable surface energy, 2D computer simulations indicate that the angle of repose should increase as the size of the granular grains decreases, an effect not seen without mesoscopic forces. The exact magnitude of this effect depends upon the value of the surface energy and the coordination number of the granular pile.

  5. Computer Models Simulate Fine Particle Dispersion

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  6. Use of computer graphics simulation for teaching of flexible sigmoidoscopy.

    PubMed

    Baillie, J; Jowell, P; Evangelou, H; Bickel, W; Cotton, P

    1991-05-01

    The concept of simulation training in endoscopy is now well-established. The systems currently under development employ either computer graphics simulation or interactive video technology; each has its strengths and weaknesses. A flexible sigmoidoscopy training device has been designed which uses graphic routines--such as object oriented programming and double buffering--in entirely new ways. These programming techniques compensate for the limitations of currently available desk-top microcomputers. By boosting existing computer 'horsepower' with next generation coprocessors and sophisticated graphics tools such as intensity interpolation (Gouraud shading), the realism of computer simulation of flexible sigmoidoscopy is being greatly enhanced. The computer program has teaching and scoring capabilities, making it a truly interactive system. Use has been made of this ability to record, grade and store each trainee encounter in computer memory as part of a multi-center, prospective trial of simulation training being conducted currently in the USA. A new input device, a dummy endoscope, has been designed that allows application of variable resistance to the insertion tube. This greatly enhances tactile feedback, such as resistance during looping. If carefully designed trials show that computer simulation is an attractive and effective training tool, it is expected that this technology will evolve rapidly and be made widely available to trainee endoscopists.

  7. Micromechanics-Based Computational Simulation of Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Mital, Subodh K.; Duff, Dennis L. (Technical Monitor)

    2003-01-01

    Advanced high-temperature Ceramic Matrix Composites (CMC) hold enormous potential for use in aerospace propulsion system components and certain land-based applications. However, being relatively new materials, a reliable design-properties database of sufficient fidelity does not yet exist. Characterizing these materials solely by testing is cost- and time-prohibitive, so computational simulation becomes very useful for limiting the experimental effort and reducing the design cycle time. The authors have been involved for over a decade in developing micromechanics-based computational simulation techniques (computer codes) to simulate all aspects of CMC behavior, including quantification of the scatter that these materials exhibit. A brief summary of the capabilities of these computer codes, with typical examples and their use in the design/analysis of certain structural components, is the subject matter of this presentation.

  8. Computational challenges in modeling and simulating living matter

    NASA Astrophysics Data System (ADS)

    Sena, Alexandre C.; Silva, Dilson; Marzulo, Leandro A. J.; de Castro, Maria Clicia Stelling

    2016-12-01

    Computational modeling has been successfully used to help scientists understand physical and biological phenomena. Recent technological advances allow the simulation of larger systems with greater accuracy. However, devising those systems requires new approaches and novel architectures, such as the use of parallel programming, so that applications can run in the new high-performance environments, which are often computer clusters composed of different computation devices such as traditional CPUs, GPGPUs, Xeon Phis, and even FPGAs. It is expected that scientists will take advantage of the increasing computational power to model and simulate more complex structures and even merge different models into larger and more extensive ones. This paper aims at discussing the challenges of using those devices to simulate such complex systems.

  9. Positive Wigner functions render classical simulation of quantum computation efficient.

    PubMed

    Mari, A; Eisert, J

    2012-12-07

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable as well as discrete variable systems in odd prime dimensions, two cases which will be treated on entirely the same footing. Noting the fact that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.

  10. A heterogeneous computing environment for simulating astrophysical fluid flows

    NASA Technical Reports Server (NTRS)

    Cazes, J.

    1994-01-01

    In the Concurrent Computing Laboratory in the Department of Physics and Astronomy at Louisiana State University we have constructed a heterogeneous computing environment that permits us to routinely simulate complicated three-dimensional fluid flows and to readily visualize the results of each simulation via three-dimensional animation sequences. An 8192-node MasPar MP-1 computer with 0.5 GBytes of RAM provides 250 MFlops of execution speed for our fluid flow simulations. Utilizing the parallel virtual machine (PVM) language, at periodic intervals data is automatically transferred from the MP-1 to a cluster of workstations where individual three-dimensional images are rendered for inclusion in a single animation sequence. Work is underway to replace executions on the MP-1 with simulations performed on the 512-node CM-5 at NCSA and to simultaneously gain access to more potent volume rendering workstations.

  11. Computational methods for coupling microstructural and micromechanical materials response simulations

    SciTech Connect

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  12. Computer simulation of vasectomy for wolf control

    USGS Publications Warehouse

    Haight, R.G.; Mech, L.D.

    1997-01-01

    Recovering gray wolf (Canis lupus) populations in the Lake Superior region of the United States are prompting state management agencies to consider strategies to control population growth. In addition to wolf removal, vasectomy has been proposed. To predict the population effects of different sterilization and removal strategies, we developed a simulation model of wolf dynamics using simple rules for demography and dispersal. Simulations suggested that the effects of vasectomy and removal in a disjunct population depend largely on the degree of annual immigration. With low immigration, periodic sterilization reduced pup production and resulted in lower rates of territory recolonization. Consequently, average pack size, number of packs, and population size were significantly less than those for an untreated population. Periodically removing a proportion of the population produced roughly the same trends as did sterilization; however, more than twice as many wolves had to be removed as sterilized. With high immigration, periodic sterilization reduced pup production but not territory recolonization and produced only moderate reductions in population size relative to an untreated population. Similar reductions in population size were obtained by periodically removing large numbers of wolves. Our analysis does not address the possible effects of vasectomy on larger wolf populations, but it suggests that the subject should be considered through modeling or field testing.
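
    The kind of bookkeeping such a model performs can be illustrated with the toy Python sketch below. All rates, pack counts, and management fractions are invented for illustration; they are not the authors' parameterization, and the sketch only mimics the qualitative contrast between sterilization, removal, and immigration.

      import numpy as np

      rng = np.random.default_rng(1)

      def simulate(years=20, n_packs=10, immigration=1.0, sterilize_frac=0.0, remove_frac=0.0):
          """Toy wolf-pack model: each pack is a head count with one breeding pair."""
          packs = [4] * n_packs            # initial pack sizes (invented)
          sterile = [False] * n_packs
          for _ in range(years):
              # management actions applied each year
              for i in range(n_packs):
                  if rng.random() < sterilize_frac:
                      sterile[i] = True
              for i in range(n_packs):
                  if rng.random() < remove_frac:
                      packs[i] = max(packs[i] - 2, 0)            # remove two wolves
              # demography: sterile breeding pairs produce no pups
              for i in range(n_packs):
                  if packs[i] >= 2 and not sterile[i]:
                      packs[i] += rng.poisson(3)                 # litter
                  packs[i] -= rng.binomial(packs[i], 0.25)       # annual mortality
              # dispersal / immigration recolonizes empty territories
              for i in range(n_packs):
                  if packs[i] == 0 and rng.random() < immigration:
                      packs[i], sterile[i] = 2, False
          return sum(packs)

      for label, imm in (("high immigration", 1.0), ("low immigration", 0.1)):
          print(label,
                "| untreated:", simulate(immigration=imm),
                "| sterilized:", simulate(immigration=imm, sterilize_frac=0.3),
                "| removal:", simulate(immigration=imm, remove_frac=0.3))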

  13. Climate Shocks and Migration: An Agent-Based Modeling Approach.

    PubMed

    Entwisle, Barbara; Williams, Nathalie E; Verdery, Ashton M; Rindfuss, Ronald R; Walsh, Stephen J; Malanson, George P; Mucha, Peter J; Frizzelle, Brian G; McDaniel, Philip M; Yao, Xiaozheng; Heumann, Benjamin W; Prasartkul, Pramote; Sawangdee, Yothin; Jampaklay, Aree

    2016-09-01

    This is a study of migration responses to climate shocks. We construct an agent-based model that incorporates dynamic linkages between demographic behaviors, such as migration, marriage, and births, and agriculture and land use, which depend on rainfall patterns. The rules and parameterization of our model are empirically derived from qualitative and quantitative analyses of a well-studied demographic field site, Nang Rong district, Northeast Thailand. With this model, we simulate patterns of migration under four weather regimes in a rice economy: 1) a reference, 'normal' scenario; 2) seven years of unusually wet weather; 3) seven years of unusually dry weather; and 4) seven years of extremely variable weather. Results show relatively small impacts on migration. Experiments with the model show that existing high migration rates and strong selection factors, which are unaffected by climate change, are likely responsible for the weak migration response.

  14. Validating agent based models through virtual worlds.

    SciTech Connect

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOG), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-To-Action (E2A) problem: determining the changes in social contact that lead individuals/groups to engage in a particular behavior.

  15. Agent-based modeling supporting the migration of registry systems to grid based architectures.

    PubMed

    Cryer, Martin E; Frey, Lewis

    2009-03-01

    As the existing NCI SEER platform's core technologies age and grow more costly to operate, essential resources in the fight against cancer such as these will eventually have to be migrated to Grid-based systems. In order to model this migration, a simulation is proposed based upon an agent modeling technology. This modeling technique allows for simulation of complex and distributed services provided by a large-scale Grid computing platform such as the caBIG™ project's caGRID. In order to investigate such a migration to a Grid-based platform technology, this paper proposes using agent-based modeling simulations to predict the performance of current and Grid configurations of the NCI SEER system integrated with the existing translational opportunities afforded by caGRID. The model illustrates how the use of Grid technology can potentially improve system response time as systems under test are scaled. In modeling SEER nodes accessing multiple registry silos, we show that the performance of SEER applications re-implemented in a Grid-native manner exhibits a nearly constant user response time with increasing numbers of distributed registry silos, compared with the current application architecture, which exhibits a linear increase in response time for increasing numbers of silos.
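
    The contrast between the two architectures can be expressed with a back-of-the-envelope model like the Python sketch below. The latency numbers and function names are hypothetical editorial assumptions, not figures from the study; the point is only that sequential silo queries sum while a Grid-style parallel fan-out is bounded by the slowest silo.

      # Sequential architecture: the client queries each registry silo in turn,
      # so response time grows linearly with the number of silos.
      # Grid-native architecture: silo queries are dispatched in parallel through
      # the middleware, so response time is dominated by the slowest silo.

      SILO_LATENCY = 0.8      # seconds per silo query (assumed)
      GRID_OVERHEAD = 0.3     # fixed dispatch/aggregation cost (assumed)

      def sequential_response(n_silos: int) -> float:
          return n_silos * SILO_LATENCY

      def grid_response(n_silos: int) -> float:
          return GRID_OVERHEAD + SILO_LATENCY   # parallel fan-out: max, not sum

      for n in (1, 2, 4, 8, 16):
          print(f"{n:2d} silos: sequential {sequential_response(n):5.1f} s, "
                f"grid {grid_response(n):4.1f} s")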

  16. Computational simulation of the nonlinear response of suspension bridges

    SciTech Connect

    McCallen, D.B.; Astaneh-Asl, A.

    1997-10-01

    Accurate computational simulation of the dynamic response of long-span bridges presents one of the greatest challenges facing the earthquake engineering community. The size of these structures, in terms of physical dimensions and number of main load bearing members, makes computational simulation of transient response an arduous task. Discretization of a large bridge with general purpose finite element software often results in a computational model of such size that excessive computational effort is required for three dimensional nonlinear analyses. The aim of the current study was the development of efficient, computationally based methodologies for the nonlinear analysis of cable supported bridge systems which would allow accurate characterization of a bridge with a relatively small number of degrees of freedom. This work has led to the development of a special purpose software program for the nonlinear analysis of cable supported bridges, and the methodologies and software are described and illustrated in this paper.

  17. The computer scene generation for star simulator hardware-in-the-loop simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Yu, Hong; Du, Huijie; Lei, Jie

    2011-08-01

    The star sensor simulation system is used to test star sensor performance on the ground; the sensor is designed for star identification and spacecraft attitude determination. A computer star scene based on an astronomical star chart is generated for hardware-in-the-loop simulation of the star sensor simulation system using OpenGL.

  18. Computational Simulations and the Scientific Method

    NASA Technical Reports Server (NTRS)

    Kleb, Bil; Wood, Bill

    2005-01-01

    As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.

  19. Osmosis: a molecular dynamics computer simulation study

    NASA Astrophysics Data System (ADS)

    Lion, Thomas

    Osmosis is a phenomenon of critical importance in a variety of processes ranging from the transport of ions across cell membranes and the regulation of blood salt levels by the kidneys to the desalination of water and the production of clean energy using potential osmotic power plants. However, despite its importance and over one hundred years of study, there is ongoing confusion concerning the nature of the microscopic dynamics of the solvent particles in their transfer across the membrane. In this thesis the microscopic dynamical processes underlying osmotic pressure and concentration gradients are investigated using molecular dynamics (MD) simulations. I first present a new derivation for the local pressure that can be used for determining osmotic pressure gradients. Using this result, the steady-state osmotic pressure is studied in a minimal model for an osmotic system and the steady-state density gradients are explained using a simple mechanistic hopping model for the solvent particles. The simulation setup is then modified, allowing us to explore the timescales involved in the relaxation dynamics of the system in the period preceding the steady state. Further consideration is also given to the relative roles of diffusive and non-diffusive solvent transport in this period. Finally, in a novel modification to the classic osmosis experiment, the solute particles are driven out of equilibrium by the input of energy. The effect of this modification on the osmotic pressure and the osmotic flow is studied, and we find that active solute particles can cause reverse osmosis to occur. The possibility of defining a new "osmotic effective temperature" is also considered and compared to the results of diffusive and kinetic temperatures.

  20. Agent-based modeling in ecological economics.

    PubMed

    Heckbert, Scott; Baynes, Tim; Reeson, Andrew

    2010-01-01

    Interconnected social and environmental systems are the domain of ecological economics, and models can be used to explore feedbacks and adaptations inherent in these systems. Agent-based modeling (ABM) represents autonomous entities, each with dynamic behavior and heterogeneous characteristics. Agents interact with each other and their environment, resulting in emergent outcomes at the macroscale that can be used to quantitatively analyze complex systems. ABM is contributing to research questions in ecological economics in the areas of natural resource management and land-use change, urban systems modeling, market dynamics, changes in consumer attitudes, innovation, and diffusion of technology and management practices, commons dilemmas and self-governance, and psychological aspects to human decision making and behavior change. Frontiers for ABM research in ecological economics involve advancing the empirical calibration and validation of models through mixed methods, including surveys, interviews, participatory modeling, and, notably, experimental economics to test specific decision-making hypotheses. Linking ABM with other modeling techniques at the level of emergent properties will further advance efforts to understand dynamics of social-environmental systems.

  1. Agent Based Model of Livestock Movements

    NASA Astrophysics Data System (ADS)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent-based model for forecasting livestock movements is presented. The model captures livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors, such as the climate forecast, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors: the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model, farms choose either to buy, sell or abstain from the market, thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard, where livestock are auctioned using a second-price sealed bid. The price time series output by the model exhibits properties similar to those found in real livestock markets.
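
    The saleyard mechanism described here is a second-price sealed-bid (Vickrey) auction. The Python sketch below shows one auction round; the bid rule, farm names, and numbers are invented stand-ins for the model's fuzzy decision tree, not the authors' implementation.

      import random

      random.seed(0)

      def second_price_auction(bids):
          """Sealed bids: the highest bidder wins but pays the second-highest bid."""
          ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
          winner = ranked[0][0]
          price = ranked[1][1]
          return winner, price

      def willingness_to_pay(last_sale_price, stock_on_farm, capacity=100):
          """Toy stand-in for the fuzzy decision tree: bids rise with the last sale
          price and with spare capacity on the farm."""
          spare = max(capacity - stock_on_farm, 0) / capacity
          return last_sale_price * (0.8 + 0.4 * spare) * random.uniform(0.95, 1.05)

      bids = {
          "farm_A": willingness_to_pay(last_sale_price=500, stock_on_farm=20),
          "farm_B": willingness_to_pay(last_sale_price=480, stock_on_farm=70),
          "farm_C": willingness_to_pay(last_sale_price=520, stock_on_farm=40),
      }
      winner, price = second_price_auction(bids)
      print(f"{winner} wins the lot and pays the second-highest bid: ${price:.0f} per head")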

  2. Traffic simulations on parallel computers using domain decomposition techniques

    SciTech Connect

    Hanebutte, U.R.; Tentner, A.M.

    1995-12-31

    Large scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow traffic simulations to be performed with the standard simulation package TRAF-NETSIM on a 128-node IBM SPx parallel supercomputer as well as on a cluster of SUN workstations. Whilst this particular parallel implementation is based on NETSIM, a microscopic traffic simulation model, the presented strategy is applicable to a broad class of traffic simulations. An outer iteration loop must be introduced in order to converge to a global solution. A performance study is presented that utilizes a scalable test network consisting of square grids and that addresses the performance penalty introduced by the additional iteration loop.
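
    The outer iteration loop can be sketched generically: each processor advances its own subnetwork, the boundary traffic is exchanged, and the pass is repeated until the global solution stops changing. The Python sketch below is an editorial caricature with an invented relaxation rule, not the TRAF-NETSIM implementation.

      import numpy as np

      def advance_subdomain(state, inflow):
          """Stand-in for one simulation pass over a subnetwork: local traffic relaxes
          partly toward the traffic entering from the neighbouring subdomain."""
          return 0.8 * state + 0.2 * inflow

      # Two subdomains of a square-grid network, represented by their boundary flows.
      left, right = np.full(8, 100.0), np.full(8, 20.0)

      for outer_iter in range(1, 51):
          new_left = advance_subdomain(left, inflow=right)    # exchange boundary traffic
          new_right = advance_subdomain(right, inflow=left)
          change = max(np.abs(new_left - left).max(), np.abs(new_right - right).max())
          left, right = new_left, new_right
          if change < 1e-6:          # global solution has converged
              print(f"outer loop converged after {outer_iter} iterations")
              break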

  3. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use.

  4. Mapping an expanding territory: computer simulations in evolutionary biology.

    PubMed

    Huneman, Philippe

    2014-08-01

    The pervasive use of computer simulations in the sciences brings novel epistemological issues, discussed in the philosophy of science literature for about a decade. Evolutionary biology strongly relies on such simulations, and in relation to it there exists a research program (Artificial Life) that mainly studies simulations themselves. This paper addresses the specificity of computer simulations in evolutionary biology, in the context (described in Sect. 1) of a set of questions about their scope as explanations, the nature of validation processes, and the relation between simulations and true experiments or mathematical models. After making distinctions, especially between a weak use in which simulations test hypotheses about the world and a strong use in which they allow one to explore sets of evolutionary dynamics not necessarily extant in our world, I argue in Sect. 2 that (weak) simulations are likely to represent in virtue of the fact that they instantiate specific features of causal processes that may be isomorphic to features of some causal processes in the world, though the latter are always intertwined with a myriad of different processes and hence unlikely to be directly manipulated and studied. I therefore argue that these simulations are merely able to provide candidate explanations for real patterns. Section 3 concludes by placing strong and weak simulations in Levins' triangle, which conceives of simulations as devices trying to fulfil one or two among three incompatible epistemic values (precision, realism, genericity).

  5. Combining high performance simulation, data acquisition, and graphics display computers

    NASA Technical Reports Server (NTRS)

    Hickman, Robert J.

    1989-01-01

    Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems, non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high speed simulation computing and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high performance FORTRAN program processor to support the complex, by generating numerous large files from programs coded in FORTRAN that are required for the real time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 also is performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

  6. Computer simulation tests of optimized neutron powder diffractometer configurations

    NASA Astrophysics Data System (ADS)

    Cussen, L. D.; Lieutenant, K.

    2016-06-01

    Recent work has developed a new mathematical approach to optimally choose beam elements for constant wavelength neutron powder diffractometers. This article compares Monte Carlo computer simulations of existing instruments with simulations of instruments using configurations chosen using the new approach. The simulations show that large performance improvements over current best practice are possible. The tests here are limited to instruments optimized for samples with a cubic structure which differs from the optimization for triclinic structure samples. A novel primary spectrometer design is discussed and simulation tests show that it performs as expected and allows a single instrument to operate flexibly over a wide range of measurement resolution.

  7. Computer Simulation of Metallo-Supramolecular Networks

    NASA Astrophysics Data System (ADS)

    Wang, Shihu; Chen, Chun-Chung; Dormidontova, Elena

    2009-03-01

    Using Monte Carlo simulation we studied formation of reversible metallo-supramolecular networks based on 3:1 ligand-metal complexes between end-functionalized oligomers and metal ions. The fraction of 1:1, 2:1 and 3:1 ligand-metal complexes was obtained and analyzed using an analytical approach as a function of oligomer concentration, c, and metal-to-oligomer ratio. We found that at low concentration the maximum in the number-average molecular weight is achieved near the stoichiometric composition and it shifts to higher metal-to-oligomer ratios at larger concentrations. Predictions are made regarding the onset of network formation, which occurs in a limited range of metal-to-oligomer ratios at sufficiently large oligomer concentrations. The average molecular weight between effective crosslinks decreases with oligomer concentration and reaches its minimum at the stoichiometric composition, where the high-frequency elastic plateau modulus approaches its maximal value. At high oligomer concentrations the plateau modulus follows a c^1.8 concentration dependence, similar to recent experimental results for metallo-supramolecular networks.
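
    The bookkeeping behind the 1:1, 2:1 and 3:1 complex fractions can be illustrated with a toy stochastic attach/detach process, sketched in Python below. The rates and counts are invented, and the simple sweep is only a caricature of the study's Monte Carlo scheme.

      import random

      random.seed(7)

      N_LIGANDS = 600              # ligand end-groups (two per oligomer)
      N_METALS = 100               # metal ions, each able to bind up to three ligands
      P_ON, P_OFF = 0.05, 0.01     # attach / detach probabilities per attempt (assumed)

      bound_to = [None] * N_LIGANDS        # metal index each ligand end is bound to
      occupancy = [0] * N_METALS           # ligands currently bound to each metal

      for attempt in range(200000):
          lig = random.randrange(N_LIGANDS)
          if bound_to[lig] is None:
              m = random.randrange(N_METALS)
              if occupancy[m] < 3 and random.random() < P_ON:
                  bound_to[lig], occupancy[m] = m, occupancy[m] + 1
          elif random.random() < P_OFF:
              m = bound_to[lig]
              bound_to[lig], occupancy[m] = None, occupancy[m] - 1

      for k in (1, 2, 3):
          print(f"{k}:1 ligand-metal complexes: {occupancy.count(k)}")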

  8. Computer Simulation of Glioma Growth and Morphology

    PubMed Central

    Frieboes, Hermann B.; Lowengrub, John S.; Wise, S.; Zheng, X.; Macklin, Paul; Bearer, Elaine; Cristini, Vittorio

    2007-01-01

    Despite major advances in the study of glioma, the quantitative links between intra-tumor molecular/cellular properties, clinically observable properties such as morphology, and critical tumor behaviors such as growth and invasiveness remain unclear, hampering more effective coupling of tumor physical characteristics with implications for prognosis and therapy. Although molecular biology, histopathology, and radiological imaging are employed in this endeavor, studies are severely challenged by the multitude of different physical scales involved in tumor growth, i.e., from molecular nanoscale to cell microscale and finally to tissue centimeter scale. Consequently, it is often difficult to determine the underlying dynamics across dimensions. New techniques are needed to tackle these issues. Here, we address this multi-scalar problem by employing a novel predictive three-dimensional mathematical and computational model based on first-principle equations (conservation laws of physics) that describe mathematically the diffusion of cell substrates and other processes determining tumor mass growth and invasion. The model uses conserved variables to represent known determinants of glioma behavior, e.g., cell density and oxygen concentration, as well as biological functional relationships and parameters linking phenomena at different scales whose specific forms and values are hypothesized and calculated based on in-vitro and in-vivo experiments and from histopathology of tissue specimens from human gliomas. This model enables correlation of glioma morphology to tumor growth by quantifying interdependence of tumor mass on the microenvironment (e.g., hypoxia, tissue disruption) and on the cellular phenotypes (e.g., mitosis and apoptosis rates, cell adhesion strength). Once functional relationships between variables and associated parameter values have been informed, e.g. from histopathology or intra-operative analysis, this model can be used for disease diagnosis
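
    The conservation-law structure described above can be caricatured in one dimension: a substrate (oxygen) diffuses in and is consumed, and the tumor phenotype switches between proliferation and death depending on the local substrate level. The Python sketch below is an editorial toy with invented parameters, not the authors' three-dimensional multiscale model.

      import numpy as np

      # 1D caricature: oxygen sigma diffuses from the boundary and is consumed by
      # tumor cells; tumor density rho proliferates where oxygen is plentiful and
      # dies where it falls below a hypoxia threshold.
      nx, dx, dt = 100, 0.1, 0.001
      D_SIGMA, CONSUME = 1.0, 5.0          # oxygen diffusivity and uptake rate (assumed)
      GROW, DIE, SIGMA_H = 1.0, 0.5, 0.3   # mitosis rate, apoptosis rate, hypoxia threshold

      rho = np.zeros(nx)
      rho[45:55] = 0.5                     # initial tumor seed
      sigma = np.ones(nx)                  # initially well-oxygenated tissue

      for step in range(20000):
          lap = (np.roll(sigma, 1) - 2 * sigma + np.roll(sigma, -1)) / dx**2
          sigma += dt * (D_SIGMA * lap - CONSUME * rho * sigma)
          sigma[0] = sigma[-1] = 1.0                       # vasculature at the boundary
          growth = np.where(sigma > SIGMA_H, GROW, -DIE)   # phenotype switch on hypoxia
          rho += dt * growth * rho * (1.0 - rho)
          rho = np.clip(rho, 0.0, 1.0)

      print(f"tumor mass: {rho.sum() * dx:.2f}, "
            f"hypoxic fraction of domain: {(sigma < SIGMA_H).mean():.2f}")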

  9. Numerical Problems and Agent-Based Models for a Mass Transfer Course

    ERIC Educational Resources Information Center

    Murthi, Manohar; Shea, Lonnie D.; Snurr, Randall Q.

    2009-01-01

    Problems requiring numerical solutions of differential equations or the use of agent-based modeling are presented for use in a course on mass transfer. These problems were solved using the popular technical computing language MATLAB™. Students were introduced to MATLAB via a problem with an analytical solution. A more complex problem to which no…

  10. Agent-Based Learning Environments as a Research Tool for Investigating Teaching and Learning.

    ERIC Educational Resources Information Center

    Baylor, Amy L.

    2002-01-01

    Discusses intelligent learning environments for computer-based learning, such as agent-based learning environments, and their advantages over human-based instruction. Considers the effects of multiple agents; agents and research design; the use of Multiple Intelligent Mentors Instructing Collaboratively (MIMIC) for instructional design for…

  11. Permutations of Control: Cognitive Considerations for Agent-Based Learning Environments.

    ERIC Educational Resources Information Center

    Baylor, Amy L.

    2001-01-01

    Discussion of intelligent agents and their use in computer learning environments focuses on cognitive considerations. Presents four dimension of control that should be considered in designing agent-based learning environments: learner control, from constructivist to instructivist; feedback; relationship of learner to agent; and learner confidence…

  12. Computer simulations of athermal and glassy systems

    NASA Astrophysics Data System (ADS)

    Xu, Ning

    2005-12-01

    We performed extensive molecular dynamics simulations to better understand athermal and glassy systems near jamming transitions. We focused on four related projects. In the first project, we decomposed the probability distribution P(φ) of finding a collectively jammed state at packing fraction φ into two distinct contributions: the density of CJ states ρ(φ) and their basins of attraction β(φ). In bidisperse systems, it is likely that ρ(φ) controls the shape of P(φ) in the large system size limit, and thus the most likely random jammed state may be used as a protocol-independent definition of random close packing in this system. In the second project, we measured the yield stress in two different ensembles: constant shear rate and constant stress. The yield stress measured in the constant stress ensemble is larger than that measured in the constant shear rate ensemble; however, the difference between these two measurements decreases with increasing system size. In the third project, we investigated under what circumstances nonlinear velocity profiles form in frictionless granular systems undergoing boundary driven planar shear flow. Nonlinear velocity profiles occur at short times, but evolve into linear profiles at long times. Nonlinear velocity profiles can be stabilized by vibrating these systems. The velocity profile can become highly localized when the shear stress of the system is below the constant force yield stress, provided that the granular temperature difference across the system is sufficiently large. In the fourth project, we measured the effective temperature defined from equilibrium fluctuation-dissipation relations in athermal and glassy systems sheared at constant pressure. We found that the effective temperature is strongly controlled by pressure in the slowly sheared regime. Thus, this effective temperature and pressure are not independent variables in this regime.
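
    Written out, the decomposition described in the first project amounts to the following relation (our notation; an editorial sketch of the stated idea, not a quotation from the thesis):

      \[
        P(\phi) \;\propto\; \rho(\phi)\,\beta(\phi),
      \]

    where ρ(φ) is the density of collectively jammed states at packing fraction φ and β(φ) is the average volume of their basins of attraction; in the large-system limit the argument is that ρ(φ) dominates the shape of P(φ).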

  13. A computer simulation of aircraft evacuation with fire

    NASA Technical Reports Server (NTRS)

    Middleton, V. E.

    1983-01-01

    A computer simulation was developed to assess passenger survival during the post-crash evacuation of a transport category aircraft when fire is a major threat. The computer code, FIREVAC, computes individual passenger exit paths and times to exit, taking into account delays and congestion caused by the interaction among the passengers and changing cabin conditions. Simple models for the physiological effects of the toxic cabin atmosphere are included with provision for including more sophisticated models as they become available. Both wide-body and standard-body aircraft may be simulated. Passenger characteristics are assigned stochastically from experimentally derived distributions. Results of simulations of evacuation trials and hypothetical evacuations under fire conditions are presented.

  14. Computational simulation of drug delivery at molecular level.

    PubMed

    Li, Youyong; Hou, Tingjun

    2010-01-01

    The field of drug delivery is advancing rapidly. By controlling the precise level and/or location of a given drug in the body, side effects are reduced, doses are lowered, and new therapies are possible. Nonetheless, substantial challenges remain for delivering specific drugs into specific cells. Computational methods to predict the binding and dynamics between drug molecule and its carrier are increasingly desirable to minimize the investment in drug design and development. Significant progress in computational simulation is making it possible to understand the mechanism of drug delivery. This review summarizes the computational methods and progress of four categories of drug delivery systems: dendrimers, polymer micelle, liposome and carbon nanotubes. Computational simulations are particularly valuable in designing better drug carriers and addressing issues that are difficult to be explored by laboratory experiments, such as diffusion, dynamics, etc.

  15. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    SciTech Connect

    McCoy, Michel; Archer, Bill; Hendrickson, Bruce; Wade, Doug; Hoang, Thuc

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  16. KU-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Griffin, J. W.

    1980-01-01

    The preparation of a real time computer simulation model of the KU band rendezvous radar to be integrated into the shuttle mission simulator (SMS), the shuttle engineering simulator (SES), and the shuttle avionics integration laboratory (SAIL) simulator is described. To meet crew training requirements a radar tracking performance model, and a target modeling method were developed. The parent simulation/radar simulation interface requirements, and the method selected to model target scattering properties, including an application of this method to the SPAS spacecraft are described. The radar search and acquisition mode performance model and the radar track mode signal processor model are examined and analyzed. The angle, angle rate, range, and range rate tracking loops are also discussed.

  17. Towards accurate quantum simulations of large systems with small computers

    NASA Astrophysics Data System (ADS)

    Yang, Yonggang

    2017-01-01

    Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations, which is otherwise prohibitive by conventional methods. The method is easily implementable and general for many systems.

  18. Towards accurate quantum simulations of large systems with small computers.

    PubMed

    Yang, Yonggang

    2017-01-24

    Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations, which is otherwise prohibitive by conventional methods. The method is easily implementable and general for many systems.

  19. Towards accurate quantum simulations of large systems with small computers

    PubMed Central

    Yang, Yonggang

    2017-01-01

    Numerical simulations are important for many systems. In particular, various standard computer programs have been developed for solving the quantum Schrödinger equations. However, the accuracy of these calculations is limited by computer capabilities. In this work, an iterative method is introduced to enhance the accuracy of these numerical calculations, which is otherwise prohibitive by conventional methods. The method is easily implementable and general for many systems. PMID:28117366

  20. Energy Efficient Biomolecular Simulations with FPGA-based Reconfigurable Computing

    SciTech Connect

    Hampton, Scott S; Agarwal, Pratul K

    2010-05-01

    Reconfigurable computing (RC) is being investigated as a hardware solution for improving time-to-solution for biomolecular simulations. A number of popular molecular dynamics (MD) codes are used to study various aspects of biomolecules. These codes are now capable of simulating nanosecond time-scale trajectories per day on conventional microprocessor-based hardware, but biomolecular processes often occur at the microsecond time-scale or longer. A wide gap exists between the desired and achievable simulation capability; therefore, there is considerable interest in alternative algorithms and hardware for improving the time-to-solution of MD codes. The fine-grain parallelism provided by Field Programmable Gate Arrays (FPGA) combined with their low power consumption make them an attractive solution for improving the performance of MD simulations. In this work, we use an FPGA-based coprocessor to accelerate the compute-intensive calculations of LAMMPS, a popular MD code, achieving up to 5.5-fold speed-up on the non-bonded force computations of the particle mesh Ewald method and up to 2.2-fold speed-up in overall time-to-solution, and potentially an increase by a factor of 9 in power-performance efficiencies for the pair-wise computations. The results presented here provide an example of the multi-faceted benefits to an application in a heterogeneous computing environment.

  1. SSBN Tactical Security Exercise Simulator and Tactical Development Computer Program

    DTIC Science & Technology

    1990-05-08

    Final report: SSBN Tactical Security Exercise Simulator and Tactical Development Computer Program, contract N00014-87-C-0063, 8 May 1990, report RES-FR-009-90. Submitted by Physical Dynamics, Inc., RES Operations, P.O. Box 9505, Arlington, VA 22209.

  2. Environments for online maritime simulators with cloud computing capabilities

    NASA Astrophysics Data System (ADS)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents the cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We have to deal with a multiprocessing situation, using advanced technologies and distributed applications that support remote ship scenarios and the automation of ship operations.

  3. Method for simulating paint mixing on computer monitors

    NASA Astrophysics Data System (ADS)

    Carabott, Ferdinand; Lewis, Garth; Piehl, Simon

    2002-06-01

    Computer programs like Adobe Photoshop can generate a mixture of two 'computer' colors by using the Gradient control. However, the resulting colors diverge from the equivalent paint mixtures in both hue and value. This study examines why programs like Photoshop are unable to simulate paint or pigment mixtures, and offers a solution using Photoshop's existing tools. The article discusses how a library of colors simulating paint mixtures is created from 13 artists' colors. The mixtures can be imported into Photoshop as a color swatch palette of 1248 colors and as 78 continuous or stepped gradient files, all accessed in a new software package, Chromafile.

  4. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    PubMed

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel Core 2 Quad Q6600 CPU and a GeForce 8800GT GPU, with software support from OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus one core of the CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies.
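
    The load-prediction dynamic scheduling idea can be illustrated generically: measure the recent throughput of each device and split the next batch of work in that proportion. The Python sketch below uses sleeps as stand-ins for the OpenMP and CUDA kernels and invented per-item costs; it is an editorial illustration, not the authors' implementation.

      import time
      from concurrent.futures import ThreadPoolExecutor

      def run_on_cpu(n_items):
          t0 = time.perf_counter()
          time.sleep(n_items * 0.002)     # stand-in: ~2 ms per cell update on the CPU cores
          return n_items, time.perf_counter() - t0

      def run_on_gpu(n_items):
          t0 = time.perf_counter()
          time.sleep(n_items * 0.0005)    # stand-in: ~0.5 ms per cell update on the GPU
          return n_items, time.perf_counter() - t0

      cpu_rate = gpu_rate = 1.0           # measured throughputs (items/s), start equal
      with ThreadPoolExecutor(max_workers=2) as pool:
          for step in range(5):
              work = 1000                                               # cell updates this step
              gpu_items = int(work * gpu_rate / (gpu_rate + cpu_rate))  # predicted split
              cpu_items = work - gpu_items

              gpu_fut = pool.submit(run_on_gpu, gpu_items)
              cpu_fut = pool.submit(run_on_cpu, cpu_items)
              g_n, g_dt = gpu_fut.result()
              c_n, c_dt = cpu_fut.result()

              # update the throughput estimates used to predict the next split
              gpu_rate = g_n / g_dt if g_n else gpu_rate
              cpu_rate = c_n / c_dt if c_n else cpu_rate
              print(f"step {step}: GPU {gpu_items} items, CPU {cpu_items} items")

    After a few steps the predicted split settles near the ratio of the two devices' measured throughputs, so their wall-clock times per step balance out.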

  5. Improving Agent Based Models and Validation through Data Fusion

    PubMed Central

    Laskowski, Marek; Demianyk, Bryan C.P.; Friesen, Marcia R.; McLeod, Robert D.; Mukhi, Shamir N.

    2011-01-01

    This work is contextualized in research in modeling and simulation of infection spread within a community or population, with the objective to provide a public health and policy tool in assessing the dynamics of infection spread and the qualitative impacts of public health interventions. This work uses the integration of real data sources into an Agent Based Model (ABM) to simulate respiratory infection spread within a small municipality. Novelty is derived in that the data sources are not necessarily obvious within ABM infection spread models. The ABM is a spatial-temporal model inclusive of behavioral and interaction patterns between individual agents on a real topography. The agent behaviours (movements and interactions) are fed by census / demographic data, integrated with real data from a telecommunication service provider (cellular records) and person-person contact data obtained via a custom 3G Smartphone application that logs Bluetooth connectivity between devices. Each source provides data of varying type and granularity, thereby enhancing the robustness of the model. The work demonstrates opportunities in data mining and fusion that can be used by policy and decision makers. The data become real-world inputs into individual SIR disease spread models and variants, thereby building credible and non-intrusive models to qualitatively simulate and assess public health interventions at the population level. PMID:23569606
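
    The core of such an ABM is a contact-network SIR update of the kind sketched below in Python. The hard-coded contact list stands in for the fused census, cellular, and Bluetooth data, and the transmission and recovery probabilities are invented; this is an editorial illustration, not the authors' model.

      import random

      random.seed(42)

      # Daily contacts between agents, as would be derived from the fused data sources
      # (census households, cell-tower co-location, Bluetooth proximity logs).
      contacts = [(0, 1), (1, 2), (2, 3), (3, 4), (1, 4), (4, 5), (5, 6), (2, 6)]
      n_agents = 7

      state = ["S"] * n_agents
      state[0] = "I"                       # index case
      P_TRANSMIT, P_RECOVER = 0.3, 0.1     # per-contact and per-day probabilities (assumed)

      for day in range(1, 61):
          newly_infected = set()
          for a, b in contacts:
              for src, dst in ((a, b), (b, a)):
                  if state[src] == "I" and state[dst] == "S" and random.random() < P_TRANSMIT:
                      newly_infected.add(dst)
          for i in range(n_agents):
              if state[i] == "I" and random.random() < P_RECOVER:
                  state[i] = "R"
          for i in newly_infected:
              state[i] = "I"
          if "I" not in state:
              print(f"outbreak ended on day {day}: "
                    f"{state.count('R')} recovered, {state.count('S')} never infected")
              break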

  6. A Distributed Platform for Global-Scale Agent-Based Models of Disease Transmission

    PubMed Central

    Parker, Jon; Epstein, Joshua M.

    2013-01-01

    The Global-Scale Agent Model (GSAM) is presented. The GSAM is a high-performance distributed platform for agent-based epidemic modeling capable of simulating a disease outbreak in a population of several billion agents. It is unprecedented in its scale, its speed, and its use of Java. Solutions to multiple challenges inherent in distributing massive agent-based models are presented. Communication, synchronization, and memory usage are among the topics covered in detail. The memory usage discussion is Java specific. However, the communication and synchronization discussions apply broadly. We provide benchmarks illustrating the GSAM’s speed and scalability. PMID:24465120

  7. [Computer simulated images of radiopharmaceutical distributions in anthropomorphic phantoms]

    SciTech Connect

    Not Available

    1991-05-17

    We have constructed an anatomically correct human geometry that can be used to store radioisotope concentrations in 51 internal organs. Each organ is associated with an index number that references its attenuating characteristics (composition and density). The initial development of Computer Simulated Images of Radiopharmaceutical Distributions in Anthropomorphic Phantoms (CSIRDAP) over the first 3 years has been very successful. All components of the simulation have been coded, made operational and debugged.

  8. LAWS simulation: Sampling strategies and wind computation algorithms

    NASA Technical Reports Server (NTRS)

    Emmitt, G. D. A.; Wood, S. A.; Houston, S. H.

    1989-01-01

    In general, work has continued on developing and evaluating algorithms designed to manage the Laser Atmospheric Wind Sounder (LAWS) lidar pulses and to compute the horizontal wind vectors from the line-of-sight (LOS) measurements. These efforts fall into three categories: Improvements to the shot management and multi-pair algorithms (SMA/MPA); observing system simulation experiments; and ground-based simulations of LAWS.

  9. Computer simulations of ions in radio-frequency traps

    NASA Technical Reports Server (NTRS)

    Williams, A.; Prestage, J. D.; Maleki, L.; Djomehri, J.; Harabetian, E.

    1990-01-01

    The motion of ions in a trapped-ion frequency standard affects the stability of the standard. In order to study the motion and structures of large ion clouds in a radio-frequency (RF) trap, a computer simulation of the system that incorporates the effect of thermal excitation of the ions was developed. Results are presented from the simulation for cloud sizes up to 512 ions, emphasizing cloud structures in the low-temperature regime.

  10. Sandia Laboratories hybrid computer and motion simulator facilities

    SciTech Connect

    Curry, W. H.; French, R. E.

    1980-05-01

    Hybrid computer and motion simulator facilities at Sandia National Laboratories include an AD/FIVE-AD10-PDP11/60, an AD/FIVE-PDP11/45, an EAI7800-EAI640, an EAI580/TR48-Nova 800, and two Carco S-45OR-3/R-493A three-axis motion simulators. An EAI680 is used in the analog mode only. This report describes the current equipment.

  11. A Computer Simulation of Community Pharmacy Practice for Educational Use

    PubMed Central

    Ling, Tristan; Bereznicki, Luke; Westbury, Juanita; Chalmers, Leanne; Peterson, Gregory; Ollington, Robert

    2014-01-01

    Objective. To provide a computer-based learning method for pharmacy practice that is as effective as paper-based scenarios, but more engaging and less labor-intensive. Design. We developed a flexible and customizable computer simulation of community pharmacy. Using it, the students would be able to work through scenarios which encapsulate the entirety of a patient presentation. We compared the traditional paper-based teaching method to our computer-based approach using equivalent scenarios. The paper-based group had 2 tutors while the computer group had none. Both groups were given a prescenario and postscenario clinical knowledge quiz and survey. Assessment. Students in the computer-based group had generally greater improvements in their clinical knowledge score, and third-year students using the computer-based method also showed more improvements in history taking and counseling competencies. Third-year students also found the simulation fun and engaging. Conclusion. Our simulation of community pharmacy provided an educational experience as effective as the paper-based alternative, despite the lack of a human tutor. PMID:26056406

  12. BSim: an agent-based tool for modeling bacterial populations in systems and synthetic biology.

    PubMed

    Gorochowski, Thomas E; Matyjaszkiewicz, Antoni; Todd, Thomas; Oak, Neeraj; Kowalska, Kira; Reid, Stephen; Tsaneva-Atanasova, Krasimira T; Savery, Nigel J; Grierson, Claire S; di Bernardo, Mario

    2012-01-01

    Large-scale collective behaviors such as synchronization and coordination spontaneously arise in many bacterial populations. With systems biology attempting to understand these phenomena, and synthetic biology opening up the possibility of engineering them for our own benefit, there is growing interest in how bacterial populations are best modeled. Here we introduce BSim, a highly flexible agent-based computational tool for analyzing the relationships between single-cell dynamics and population level features. BSim includes reference implementations of many bacterial traits to enable the quick development of new models partially built from existing ones. Unlike existing modeling tools, BSim fully considers spatial aspects of a model allowing for the description of intricate micro-scale structures, enabling the modeling of bacterial behavior in more realistic three-dimensional, complex environments. The new opportunities that BSim opens are illustrated through several diverse examples covering: spatial multicellular computing, modeling complex environments, population dynamics of the lac operon, and the synchronization of genetic oscillators. BSim is open source software that is freely available from http://bsim-bccs.sf.net and distributed under the Open Source Initiative (OSI) recognized MIT license. Developer documentation and a wide range of example simulations are also available from the website. BSim requires Java version 1.6 or higher.

  13. High Performance Computing for Agent-Based Cognitive Modeling

    DTIC Science & Technology

    2011-02-25

    enterprise applications, is the Common Object Request Broker Architecture (CORBA). CORBA provides methods for programming-language- and platform-independent communication between applications ("CORBA FAQ"). CORBA works by allowing developers to specify an object using a standardized Interface Definition Language (IDL), and mapping the IDL to data types available in each language that implements a CORBA library. This is similar in nature to

  14. Access Control for Agent-based Computing: A Distributed Approach.

    ERIC Educational Resources Information Center

    Antonopoulos, Nick; Koukoumpetsos, Kyriakos; Shafarenko, Alex

    2001-01-01

    Discusses the mobile software agent paradigm that provides a foundation for the development of high performance distributed applications and presents a simple, distributed access control architecture based on the concept of distributed, active authorization entities (lock cells), any combination of which can be referenced by an agent to provide…

  15. Agent-Based Computing in Distributed Adversarial Planning

    DTIC Science & Technology

    2010-08-09

    agents; P3 represents games with 3 agents; the value of BF represents the branching factors for the agents in fixed order (each digit for one agent)...and M. Wooldridge. Cooperation, knowledge, and time: Alternating-time temporal epistemic logic and its applications. Studia Logica, 75(1):125–157.

  16. The advanced computational testing and simulation toolkit (ACTS)

    SciTech Connect

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, reduction of duplicate efforts

  17. Hybrid agent-based model for quantitative in-silico cell-free protein synthesis.

    PubMed

    Semenchenko, Anton; Oliveira, Guilherme; Atman, A P F

    2016-12-01

    An advanced vision of mRNA translation is presented through a hybrid modeling approach. The dynamics of polysome formation was investigated by computer simulation that combined an agent-based model with a fine-grained Markov chain representation of the chemical kinetics. This approach allowed for the investigation of polysome dynamics under non-steady-state and non-continuum conditions. The model is validated by quantitative comparison of the simulation results with Luciferase protein production in a cell-free system, as well as by testing the hypothesis regarding the two possible mechanisms of the Edeine antibiotic. Calculation of the Hurst exponent demonstrated a relationship between the microscopic properties of amino acid elongation and the fractal dimension of the translation duration time series. The temporal properties of the amino acid elongation indicated anti-persistent behavior under low mRNA occupancy and evinced the appearance of long range interactions within the mRNA-ribosome system at high ribosome density. The dynamic and temporal characteristics of the polysomal system presented here can have a direct impact on studies of co-translational protein folding and provide a validated platform for cell-free system studies.
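
    A fine-grained Markov chain of translation kinetics is the kind of thing a Gillespie-style stochastic simulation handles. The Python sketch below follows ribosomes initiating and stepping along a short mRNA with exclusion; the rate constants, mRNA length, and footprint are invented, so this is an editorial toy, not the authors' model.

      import random

      random.seed(3)

      L = 30                          # codons on the toy mRNA
      K_INIT, K_ELONG = 0.5, 10.0     # initiation / per-codon elongation rates in 1/s (assumed)

      ribosomes = []                  # codon positions of bound ribosomes, leading ribosome first
      t, completed = 0.0, 0
      while t < 200.0:
          # enumerate the currently possible reactions and their propensities
          events = []
          if not ribosomes or ribosomes[-1] > 1:               # start codon is free
              events.append(("init", None, K_INIT))
          for idx, pos in enumerate(ribosomes):
              blocked = idx > 0 and ribosomes[idx - 1] == pos + 1
              if not blocked:
                  events.append(("step", idx, K_ELONG))
          total = sum(rate for *_, rate in events)
          t += random.expovariate(total)                       # exponential waiting time
          r, acc = random.random() * total, 0.0
          for kind, idx, rate in events:                       # pick an event proportional to rate
              acc += rate
              if r <= acc:
                  break
          if kind == "init":
              ribosomes.append(1)
          elif ribosomes[idx] == L:                            # ribosome reaches the stop codon
              ribosomes.pop(idx)
              completed += 1
          else:
              ribosomes[idx] += 1

      print(f"proteins completed in 200 s: {completed}; ribosomes still bound: {len(ribosomes)}")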

  18. Computer model to simulate testing at the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Mineck, Raymond E.; Owens, Lewis R., Jr.; Wahls, Richard A.; Hannon, Judith A.

    1995-01-01

    A computer model has been developed to simulate the processes involved in the operation of the National Transonic Facility (NTF), a large cryogenic wind tunnel at the Langley Research Center. The simulation was verified by comparing the simulated results with previously acquired data from three experimental wind tunnel test programs in the NTF. The comparisons suggest that the computer model simulates reasonably well the processes that determine the liquid nitrogen (LN2) consumption, electrical consumption, fan-on time, and the test time required to complete a test plan at the NTF. From these limited comparisons, it appears that the results from the simulation model are generally within about 10 percent of the actual NTF test results. The use of actual data acquisition times in the simulation produced better estimates of the LN2 usage, as expected. Additional comparisons are needed to refine the model constants. The model will typically produce optimistic results since the times and rates included in the model are typically the optimum values. Any deviation from the optimum values will lead to longer times or increased LN2 and electrical consumption for the proposed test plan. Computer code operating instructions and listings of sample input and output files have been included.

  19. Using computer simulations to facilitate conceptual understanding of electromagnetic induction

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Fen

    This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education categorized by three different learning frameworks and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of reviewed literature, I proposed effective approaches to integrate computer simulations in physics education. These approaches are consistent with well established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research based approaches to integrated computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research based computer simulations developed by the physics education research group at University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit

  20. Pattern-oriented modeling of agent-based complex systems: Lessons from ecology

    USGS Publications Warehouse

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-01-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.