Science.gov

Sample records for agent-based computer simulation

  1. Agent-based computer simulation and SIRS: building a bridge between basic science and clinical trials.

    PubMed

    An, G

    2001-10-01

    The management of Systemic Inflammatory Response Syndrome (SIRS)/Multiple Organ Failure (MOF) remains the greatest challenge in the field of critical care. There has been uniform difficulty in translating the results of basic science research into effective therapeutic regimens. We propose that this is due in part to a failure to account for the complex, nonlinear nature of the inflammatory process of which SIRS/MOF represents a disordered state. Attempts to manipulate this process without an understanding of the dynamics of the system may potentially produce unintended consequences. Agent-Based Computer Simulation (ABCS) provides a means to synthesize the information acquired from the linear analysis of basic science into a model that preserves the complexity of the inflammatory system. We have constructed an abstracted version of the inflammatory process using an ABCS that is based at the cellular level. Despite its abstraction, the simulation produces non-linear behavior and reproduces the dynamic structure of the inflammatory response. Furthermore, adjustment of the simulation to model one of the unsuccessful initial anti-inflammatory trials of the 1990s demonstrates the adverse outcome that was observed in those clinical trials. It must be emphasized that the current model is extremely abstract and simplified. However, it is hoped that future ABCSs of sufficient sophistication may eventually provide an important bridging tool to translate basic science discoveries into clinical applications. Creating these simulations will require a large collaborative effort, and it is hoped that this paper will stimulate interest in this form of analysis. PMID:11580108

  2. Thread Group Multithreading: Accelerating the Computation of an Agent-Based Power System Modeling and Simulation Tool -- GridLAB-D

    SciTech Connect

    Jin, Shuangshuang; Chassin, David P.

    2014-01-06

    GridLAB-D™ is an open-source, next-generation agent-based smart-grid simulator that provides unprecedented capability to model the performance of smart-grid technologies. Over the past few years, GridLAB-D has been used to conduct important analyses of smart-grid concepts, but it is still quite limited by its computational performance. In order to break through this performance bottleneck and meet the need for large-scale power grid simulations, we develop a thread-group mechanism to implement highly granular multithreaded computation in GridLAB-D. We achieve close-to-linear speedups for the multithreaded version compared against the single-threaded version of the same code, running on general-purpose multi-core commodity hardware, for a benchmark simple-house model. The multithreaded code shows favorable scalability and resource utilization, and much shorter execution times for large-scale power grid simulations.
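    The thread-group mechanism reads naturally as a partition-update-join pattern. The sketch below is a hedged Python illustration of that pattern only (GridLAB-D itself is implemented in C/C++, and in CPython the GIL limits the speedup for pure-Python updates): agents are split into fixed groups, each group runs on its own thread, and all threads join before the next simulation pass.

```python
from concurrent.futures import ThreadPoolExecutor

def sync_pass(agents, update, n_groups=4):
    """Update agents in fixed thread groups, joining before the
    next pass: the thread-group idea in miniature."""
    # partition agents into n_groups disjoint groups
    groups = [agents[i::n_groups] for i in range(n_groups)]

    def run_group(group):
        for agent in group:
            update(agent)

    with ThreadPoolExecutor(max_workers=n_groups) as pool:
        # map blocks until every group has finished its pass
        list(pool.map(run_group, groups))
```

    Because each agent belongs to exactly one group, per-agent state needs no locking within a pass; only cross-agent reads would require synchronization.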

  3. Tutorial on agent-based modeling and simulation.

    SciTech Connect

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2005-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS is a third way of doing science besides deductive and inductive reasoning. Computational advances have made possible a growing number of agent-based applications in a variety of fields, ranging from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, to modeling consumer behavior and understanding the fall of ancient civilizations. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing ABMS models, and provides some thoughts on the relationship between ABMS and traditional modeling techniques.
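    The foundations the tutorial describes (autonomous agents with local state, a behaviour rule, and repeated interaction with neighbours) can be sketched in a few lines of Python. The majority-adoption rule and ring topology below are illustrative assumptions, not content from the tutorial:

```python
import random

class Agent:
    """An autonomous agent: local state plus a behaviour rule."""
    def __init__(self, state):
        self.state = state

    def step(self, neighbours):
        # toy rule: adopt the neighbourhood majority; keep state on a tie
        ones = sum(n.state for n in neighbours)
        if ones * 2 != len(neighbours):
            self.state = int(ones * 2 > len(neighbours))

def run(n=50, steps=10, seed=0):
    """Sequentially update n agents on a ring for a number of steps."""
    rng = random.Random(seed)
    agents = [Agent(rng.randint(0, 1)) for _ in range(n)]
    for _ in range(steps):
        for i, a in enumerate(agents):  # each agent sees two ring neighbours
            a.step([agents[i - 1], agents[(i + 1) % n]])
    return [a.state for a in agents]
```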

  4. Agent-based modeling and simulation Part 3 : desktop ABMS.

    SciTech Connect

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2007-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of autonomous, interacting agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to support their research. Some have gone so far as to contend that ABMS 'is a third way of doing science,' in addition to traditional deductive and inductive reasoning (Axelrod 1997b). Computational advances have made possible a growing number of agent-based models across a variety of application domains. Applications range from modeling agent behavior in the stock market, supply chains, and consumer markets, to predicting the spread of epidemics, the threat of bio-warfare, and the factors responsible for the fall of ancient civilizations. This tutorial describes the theoretical and practical foundations of ABMS, identifies toolkits and methods for developing agent models, and illustrates the development of a simple agent-based model of shopper behavior using spreadsheets.

  5. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique in use at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.

  6. Agent-based simulation of a financial market

    NASA Astrophysics Data System (ADS)

    Raberto, Marco; Cincotti, Silvano; Focardi, Sergio M.; Marchesi, Michele

    2001-10-01

    This paper introduces an agent-based artificial financial market in which heterogeneous agents trade one single asset through a realistic trading mechanism for price formation. Agents are initially endowed with a finite amount of cash and a given finite portfolio of assets. There is no money-creation process; the total available cash is conserved in time. In each period, agents make random buy and sell decisions that are constrained by available resources, subject to clustering, and dependent on the volatility of previous periods. The model proposed herein is able to reproduce the leptokurtic shape of the probability density of log price returns and the clustering of volatility. Implemented using extreme programming and object-oriented technology, the simulator is a flexible computational experimental facility that can find applications in both academic and industrial research projects.
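    As a rough illustration of the mechanism described, here is a minimal Python sketch of resource-constrained random trading with a conserved cash total; the pairwise matching rule, price-impact factor, and endowments are assumptions of this sketch, not the authors' realistic price-formation mechanism:

```python
import random

class Trader:
    """A trader endowed with a finite amount of cash and assets."""
    def __init__(self, cash, shares):
        self.cash, self.shares = cash, shares

def trading_step(agents, price, rng, impact=0.001):
    """One period: random buy/sell decisions constrained by resources;
    matched orders move cash between agents, so total cash is conserved."""
    buyers = [a for a in agents if rng.random() < 0.5 and a.cash >= price]
    buyer_ids = {id(a) for a in buyers}
    sellers = [a for a in agents if id(a) not in buyer_ids and a.shares > 0]
    for b, s in zip(buyers, sellers):     # execute matched trades at `price`
        b.cash -= price; b.shares += 1
        s.cash += price; s.shares -= 1
    excess = len(buyers) - len(sellers)   # unmatched net demand...
    return price * (1 + impact * excess)  # ...moves the next price

def simulate(n_agents=100, steps=250, price=100.0, seed=1):
    rng = random.Random(seed)
    agents = [Trader(cash=1000.0, shares=10) for _ in range(n_agents)]
    prices = [price]
    for _ in range(steps):
        price = trading_step(agents, price, rng)
        prices.append(price)
    return agents, prices
```

    Because every trade transfers cash between a matched buyer and seller, there is no money-creation process, matching the conservation property the abstract emphasizes.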

  7. Modeling civil violence: An agent-based computational approach

    PubMed Central

    Epstein, Joshua M.

    2002-01-01

    This article presents an agent-based computational model of civil violence. Two variants of the civil violence model are presented. In the first a central authority seeks to suppress decentralized rebellion. In the second a central authority seeks to suppress communal violence between two warring ethnic groups. PMID:11997450
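    The core activation rule of Epstein's model is compact enough to state directly: grievance is G = H(1 - L) for hardship H and perceived legitimacy L, the estimated arrest probability grows with the cop-to-active ratio in an agent's vision, and an agent rebels when grievance minus perceived risk exceeds a threshold T. A simplified Python rendering (omitting the grid, movement, and the floor the paper applies to the cop/active ratio):

```python
import math

def arrest_probability(cops, actives, k=2.3):
    """P = 1 - exp(-k * C/A); k = 2.3 makes P about 0.9 when C/A = 1.
    Simplified: the paper floors the C/A ratio."""
    return 1 - math.exp(-k * cops / max(actives, 1))

def is_active(hardship, legitimacy, risk_aversion, cops, actives,
              threshold=0.1):
    """Rebel when grievance minus perceived net risk exceeds T."""
    grievance = hardship * (1 - legitimacy)
    net_risk = risk_aversion * arrest_probability(cops, actives)
    return grievance - net_risk > threshold
```

    With no police in view the arrest probability is zero, so any sufficiently aggrieved agent activates; a heavy police presence deters all but the least risk-averse.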

  8. Cognitive Modeling for Agent-Based Simulation of Child Maltreatment

    NASA Astrophysics Data System (ADS)

    Hu, Xiaolin; Puddy, Richard

    This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate its features.

  9. Validation techniques of agent based modelling for geospatial simulations

    NASA Astrophysics Data System (ADS)

    Darvishi, M.; Ahmadi, G.

    2014-10-01

    One of the most interesting aspects of modelling and simulation is describing real-world phenomena with specific properties, especially those that occur at large scales and exhibit dynamic, complex behaviours. Studying these phenomena in the laboratory is costly and in most cases impossible. Miniaturizing world phenomena within the framework of a model in order to simulate them is therefore a reasonable and scientific approach to understanding the world. Agent-based modelling and simulation (ABMS) is a new modelling method comprising multiple interacting agents. It has been used in different areas, for instance geographic information systems (GIS), biology, economics, social science, and computer science. The emergence of ABM toolkits in GIS software libraries (e.g., ESRI's ArcGIS, OpenMap, GeoTools) for geospatial modelling indicates growing interest among users in the special capabilities of ABMS. Since ABMS is inherently similar to human cognition, models can be built easily and applied to a wider range of applications than traditional simulation. A key challenge for ABMS, however, is validation and verification: because of frequently emerging patterns, strong dynamics in the system, and the complex nature of ABMS, such models are hard to validate and verify with conventional validation methods. Finding appropriate validation techniques for ABM is therefore necessary. In this paper, after reviewing the principles and concepts of ABM and its applications, the validation techniques and challenges of ABM validation are discussed.

  10. High performance computing for three-dimensional agent-based molecular models.

    PubMed

    Pérez-Rodríguez, G; Pérez-Pérez, M; Fdez-Riverola, F; Lourenço, A

    2016-07-01

    Agent-based simulations are increasingly popular in exploring and understanding cellular systems, but the natural complexity of these systems and the desire to grasp different modelling levels demand cost-effective simulation strategies and tools. In this context, the present paper introduces novel sequential and distributed approaches for the three-dimensional agent-based simulation of individual molecules in cellular events. These approaches are able to describe the dimensions and position of the molecules with high accuracy and thus, study the critical effect of spatial distribution on cellular events. Moreover, two of the approaches allow multi-thread high performance simulations, distributing the three-dimensional model in a platform independent and computationally efficient way. Evaluation addressed the reproduction of molecular scenarios and different scalability aspects of agent creation and agent interaction. The three approaches simulate common biophysical and biochemical laws faithfully. The distributed approaches show improved performance when dealing with large agent populations while the sequential approach is better suited for small to medium size agent populations. Overall, the main new contribution of the approaches is the ability to simulate three-dimensional agent-based models at the molecular level with reduced implementation effort and moderate-level computational capacity. Since these approaches have a generic design, they have the major potential of being used in any event-driven agent-based tool. PMID:27372059

  11. On agent-based modeling and computational social science

    PubMed Central

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed focusing on the role of generative theories, aiming at explaining phenomena by growing them. After a brief analysis of the major strengths of the field some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest for Computational Social Science (CSS) is focused upon, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM, reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of the CSS. PMID:25071642

  12. On agent-based modeling and computational social science.

    PubMed

    Conte, Rosaria; Paolucci, Mario

    2014-01-01

    In the first part of the paper, the field of agent-based modeling (ABM) is discussed focusing on the role of generative theories, aiming at explaining phenomena by growing them. After a brief analysis of the major strengths of the field some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest for Computational Social Science (CSS) is focused upon, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM, reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of the CSS. PMID:25071642

  13. Agent-Based Modeling and Simulation on Emergency Evacuation

    NASA Astrophysics Data System (ADS)

    Ren, Chuanjun; Yang, Chenghui; Jin, Shiyao

    Crowd stampedes and panic-induced evacuations during emergencies often lead to fatalities as people are crushed, injured, or trampled. Such phenomena may be triggered in life-threatening situations such as fires or explosions in crowded buildings. Emergency evacuation simulation has recently attracted the interest of a rapidly increasing number of scientists. This paper presents an agent-based modeling and simulation, built with the Repast toolkit, of crowd evacuation for emergency response from an area under a fire. In contrast to traditional modeling, various types of agents with different attributes are designed. The attributes that govern the characteristics of the people are studied and tested by iterative simulations, and further simulations demonstrate the effect of various agent parameters. Some interesting results were observed, such as the "faster is slower" effect and the ignorance of available exits. Finally, the simulation results suggest practical ways of minimizing the harmful consequences of such events and indicate the existence of an optimal escape strategy.

  14. Using Agent Based Modeling (ABM) to Develop Cultural Interaction Simulations

    NASA Technical Reports Server (NTRS)

    Drucker, Nick; Jones, Phillip N.

    2012-01-01

    Today, most cultural training is based on or built around "cultural engagements" or discrete interactions between the individual learner and one or more cultural "others". Often, success in the engagement is the end objective. In reality, these interactions usually involve secondary and tertiary effects with potentially wide-ranging consequences. The concern is that learning culture within a strict engagement context might lead to "checklist" cultural thinking that will not empower learners to understand the full consequences of their actions. We propose the use of agent-based modeling (ABM) to collect and store engagement effects and, by simulating social networks, propagate them over time, distance, and consequence. The ABM development allows rapid modification to re-create any number of population types, extending the applicability of the model to any requirement for social modeling.

  15. Simulation of convoy of unmanned vehicles using agent based modeling

    NASA Astrophysics Data System (ADS)

    Sharma, Sharad; Singh, Harpreet; Gerhart, G. R.

    2007-10-01

    Interest in unmanned vehicles has been increasing, given their importance to defense and security. A few models for a convoy of unmanned vehicles exist in the literature. The objective of this paper is to apply an agent-based modeling technique to a convoy of unmanned vehicles in which each vehicle is an agent. Using this approach, the convoy of vehicles reaches a specified goal from a starting point. Each agent is associated with a number of sensors, and the agents make intelligent decisions based on sensor inputs while maintaining their group capability and behavior. The simulation is done for a battlefield environment from a single starting point to a single goal; the approach can be extended to multiple starting points and multiple goals. The simulation gives the time taken by the convoy to reach a goal from its initial position. In the battlefield environment, commanders make various tactical decisions depending upon the location of an enemy outpost, minefields, the number of soldiers in platoons, and barriers. The simulation can help the commander make effective decisions about the battlefield, the convoy, and obstacles to reach a particular goal. The paper describes the proposed approach, gives the simulation results, and outlines problems for future research in this area.

  16. Agent-based modeling to simulate the dengue spread

    NASA Astrophysics Data System (ADS)

    Deng, Chengbin; Tao, Haiyan; Ye, Zhiwei

    2008-10-01

    In this paper, we introduce agent-based modeling (ABM) as a novel method for simulating the process of dengue spread. Dengue is an acute infectious disease with a history of over 200 years. Unlike diseases that can be transmitted directly from person to person, dengue spreads only through a mosquito vector. There is still no specific effective medicine or vaccine for dengue, so the best way to prevent its spread is to take precautions beforehand. It is therefore crucial to detect and study the dynamic process of dengue spread, which closely relates to human-environment interactions where agent-based modeling works effectively. The model attempts to simulate dengue spread in a more realistic, bottom-up way and to overcome a common limitation of ABM, namely overlooking the influence of geographic and environmental factors. By considering the influence of the environment, Aedes aegypti ecology, and other epidemiological characteristics of dengue spread, ABM can be regarded as a useful way to simulate the whole process and so disclose the essence of the evolution of dengue spread.

  17. Patient-centered appointment scheduling using agent-based simulation.

    PubMed

    Turkcan, Ayten; Toscos, Tammy; Doebbeling, Brad N

    2014-01-01

    Enhanced access and continuity are key components of patient-centered care. Existing studies show that several interventions, such as providing same-day appointments, walk-in services, after-hours care, and group appointments, have been used to redesign healthcare systems for improved access to primary care. However, an intervention focusing on a single component of care delivery (e.g., improving access to acute care) might have a negative impact on other components of the system (e.g., reduced continuity of care for chronic patients). Therefore, primary care clinics should consider implementing multiple interventions tailored to their patient population's needs. We used rapid ethnography and observation to better understand clinic workflow and key constraints. We then developed an agent-based simulation model that includes all access modalities (appointments, walk-ins, and after-hours access), incorporates resources and key constraints, and determines the appointment scheduling method that best improves access and continuity of care. This paper demonstrates the value of simulation models for testing a variety of alternative strategies to improve access to care through scheduling. PMID:25954423

  18. Serious games experiment toward agent-based simulation

    USGS Publications Warehouse

    Wein, Anne; Labiosa, William

    2013-01-01

    We evaluate the potential for serious games to be used as a scientifically based decision-support product that supports the United States Geological Survey’s (USGS) mission--to provide integrated, unbiased scientific information that can make a substantial contribution to societal well-being for a wide variety of complex environmental challenges. Serious or pedagogical games are an engaging way to educate decisionmakers and stakeholders about environmental challenges that are usefully informed by natural and social scientific information and knowledge and can be designed to promote interactive learning and exploration in the face of large uncertainties, divergent values, and complex situations. We developed two serious games that use challenging environmental-planning issues to demonstrate and investigate the potential contributions of serious games to inform regional-planning decisions. Delta Skelta is a game emulating long-term integrated environmental planning in the Sacramento-San Joaquin Delta, California, that incorporates natural hazards (flooding and earthquakes) and consequences for California water supplies amidst conflicting water interests. Age of Ecology is a game that simulates interactions between economic and ecologic processes, as well as natural hazards while implementing agent-based modeling. The content of these games spans the USGS science mission areas related to water, ecosystems, natural hazards, land use, and climate change. We describe the games, reflect on design and informational aspects, and comment on their potential usefulness. During the process of developing these games, we identified various design trade-offs involving factual information, strategic thinking, game-winning criteria, elements of fun, number and type of players, time horizon, and uncertainty. We evaluate the two games in terms of accomplishments and limitations. Overall, we demonstrated the potential for these games to usefully represent scientific information

  19. Tutorial on agent-based modeling and simulation. Part 2 : how to model with agents.

    SciTech Connect

    Macal, C. M.; North, M. J.; Decision and Information Sciences

    2006-01-01

    Agent-based modeling and simulation (ABMS) is a new approach to modeling systems comprised of interacting autonomous agents. ABMS promises to have far-reaching effects on the way that businesses use computers to support decision-making and researchers use electronic laboratories to do research. Some have gone so far as to contend that ABMS is a new way of doing science. Computational advances make possible a growing number of agent-based applications across many fields. Applications range from modeling agent behavior in the stock market and supply chains, to predicting the spread of epidemics and the threat of bio-warfare, from modeling the growth and decline of ancient civilizations to modeling the complexities of the human immune system, and many more. This tutorial describes the foundations of ABMS, identifies ABMS toolkits and development methods illustrated through a supply chain example, and provides thoughts on the appropriate contexts for ABMS versus conventional modeling techniques.

  20. Comparing large-scale computational approaches to epidemic modeling: Agent-based versus structured metapopulation models

    PubMed Central

    2010-01-01

    Background In recent years large-scale computational models for the realistic simulation of epidemic outbreaks have been used with increased frequency. Methodologies adapt to the scale of interest and range from very detailed agent-based models to spatially-structured metapopulation models. One major issue thus concerns to what extent the geotemporal spreading pattern found by different modeling approaches may differ and depend on the different approximations and assumptions used. Methods We provide for the first time a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the progression of a baseline pandemic event in Italy, a large and geographically heterogeneous European country. The agent-based model is based on the explicit representation of the Italian population through highly detailed data on the socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM) model, based on high-resolution census data worldwide, and integrating airline travel flow data with short-range human mobility patterns at the global scale. The model also considers age structure data for Italy. GLEaM and the agent-based models are synchronized in their initial conditions by using the same disease parameterization, and by defining the same importation of infected cases from international travels. Results The results obtained show that both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing on the order of a few days. The relative difference of the epidemic size depends on the basic reproductive ratio, R0, and on the fact that the metapopulation model consistently yields a larger incidence than the agent-based model, as expected due to the differences in the structure in the intra-population contact pattern of the approaches. The age breakdown analysis shows

  1. Model reduction for agent-based social simulation: coarse-graining a civil violence model.

    PubMed

    Zou, Yu; Fonoberov, Vladimir A; Fonoberova, Maria; Mezic, Igor; Kevrekidis, Ioannis G

    2012-06-01

    Agent-based modeling (ABM) constitutes a powerful computational tool for the exploration of phenomena involving emergent dynamic behavior in the social sciences. This paper demonstrates a computer-assisted approach that bridges the significant gap between the single-agent microscopic level and the macroscopic (coarse-grained population) level, where fundamental questions must be rationally answered and policies guiding the emergent dynamics devised. Our approach will be illustrated through an agent-based model of civil violence. This spatiotemporally varying ABM incorporates interactions between a heterogeneous population of citizens [active (insurgent), inactive, or jailed] and a population of police officers. Detailed simulations exhibit an equilibrium punctuated by periods of social upheavals. We show how to effectively reduce the agent-based dynamics to a stochastic model with only two coarse-grained degrees of freedom: the number of jailed citizens and the number of active ones. The coarse-grained model captures the ABM dynamics while drastically reducing the computation time (by a factor of approximately 20). PMID:23005161
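    The flavour of such a two-variable reduction can be illustrated with a schematic birth-death process over the coarse observables (active, jailed); the transition probabilities below are placeholders, not the rates identified by the authors' coarse-graining analysis:

```python
import random

def coarse_step(active, jailed, n_total, rng,
                p_activate=0.02, p_arrest=0.05, p_release=0.01):
    """One step of a schematic stochastic model with only two coarse
    degrees of freedom: quiet citizens activate, active citizens are
    arrested, and jailed citizens are released."""
    quiet = n_total - active - jailed
    new_active = sum(rng.random() < p_activate for _ in range(quiet))
    arrested   = sum(rng.random() < p_arrest   for _ in range(active))
    released   = sum(rng.random() < p_release  for _ in range(jailed))
    return active + new_active - arrested, jailed + arrested - released
```

    Tracking only these two counts instead of every agent is what yields the large speedup the paper reports; the hard part, which the paper addresses, is identifying transition rates that faithfully reproduce the full ABM dynamics.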

  2. Model reduction for agent-based social simulation: Coarse-graining a civil violence model

    NASA Astrophysics Data System (ADS)

    Zou, Yu; Fonoberov, Vladimir A.; Fonoberova, Maria; Mezic, Igor; Kevrekidis, Ioannis G.

    2012-06-01

    Agent-based modeling (ABM) constitutes a powerful computational tool for the exploration of phenomena involving emergent dynamic behavior in the social sciences. This paper demonstrates a computer-assisted approach that bridges the significant gap between the single-agent microscopic level and the macroscopic (coarse-grained population) level, where fundamental questions must be rationally answered and policies guiding the emergent dynamics devised. Our approach will be illustrated through an agent-based model of civil violence. This spatiotemporally varying ABM incorporates interactions between a heterogeneous population of citizens [active (insurgent), inactive, or jailed] and a population of police officers. Detailed simulations exhibit an equilibrium punctuated by periods of social upheavals. We show how to effectively reduce the agent-based dynamics to a stochastic model with only two coarse-grained degrees of freedom: the number of jailed citizens and the number of active ones. The coarse-grained model captures the ABM dynamics while drastically reducing the computation time (by a factor of approximately 20).

  3. Identifying Evacuees' Demand of Tsunami Shelters using Agent Based Simulation

    NASA Astrophysics Data System (ADS)

    Mas, E.; Adriano, B.; Koshimura, S.; Imamura, F.; Kuroiwa, J.; Yamazaki, F.; Zavala, C.; Estrada, M.

    2012-12-01

    Amongst the lessons learned from tsunami events such as the 2004 Indian Ocean and 2011 Great Tohoku, Japan, earthquakes is that nature sometimes exceeds structural countermeasures like seawalls, breakwaters, or tsunami gates. In such situations, finding sheltering places is a challenging task for people in flat, low-lying areas. Vertical evacuation to multistory buildings is one alternative for providing sheltering areas in a complex evacuation environment. However, if the spatial distribution and available capacity of these structures are not well communicated, over-demand or under-demand by evacuees may be observed at several structures. In this study, we present the integration of tsunami numerical modeling and agent-based evacuation simulation as a method to estimate the sheltering demand of evacuees under an emergent-behavior approach. The case study is set in the La Punta district in Peru. In the tsunami simulation we used a slip-distribution seismic source model (Pulido et al., 2011; Chlieh et al., 2011) for a possible future tsunami scenario in the central Andes. We modeled three evacuation alternatives: first, the horizontal evacuation scenario, analyzed to support the necessity of the sheltering-in-place option for the district; second, the vertical evacuation scenario; and third, the combination of vertical and horizontal evacuation of pedestrians and vehicles. In the last two alternatives, the demand of evacuees was measured at each official tsunami evacuation building and compared to the sheltering capacity of the structure. Results showed that, out of twenty tsunami evacuation buildings, thirteen were over-demanded and seven still had available space. It was also confirmed that in this case horizontal evacuation might lead to a high number of casualties due to traffic congestion at the neck of the district. Finally, vertical evacuation would be a suitable solution for this area.
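    The over/under-demand comparison at the heart of this analysis can be sketched simply: assign each evacuee to the nearest shelter and subtract capacity from demand. The function below is a toy version with hypothetical names and coordinates; the actual study drives assignment with tsunami inundation modeling and emergent evacuation behaviour rather than plain distance:

```python
import math

def shelter_surplus(evacuees, shelters):
    """Nearest-shelter assignment; returns demand minus capacity per
    shelter (positive = over-demand).  `shelters` maps a name to
    (x, y, capacity); `evacuees` is a list of (x, y) positions."""
    demand = {name: 0 for name in shelters}
    for ex, ey in evacuees:
        nearest = min(shelters,
                      key=lambda s: math.hypot(ex - shelters[s][0],
                                               ey - shelters[s][1]))
        demand[nearest] += 1
    return {s: demand[s] - shelters[s][2] for s in shelters}
```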

  4. An agent-based computational model of the spread of tuberculosis

    NASA Astrophysics Data System (ADS)

    de Espíndola, Aquino L.; Bauch, Chris T.; Troca Cabella, Brenno C.; Souto Martinez, Alexandre

    2011-05-01

    In this work we propose an alternative model of the spread of tuberculosis (TB) and the emergence of drug resistance due to treatment with antibiotics. We implement the simulations with an agent-based computational approach in which the spatial structure is taken into account. The spread of tuberculosis occurs according to probabilities defined by the interactions among individuals. The model was validated by reproducing results already known from the literature, in which different treatment regimes yield the emergence of drug resistance. The different patterns of TB spread can be visualized at any point in the system's evolution. The implementation details, as well as some results of this alternative approach, are discussed.
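    A minimal version of such a spatial, probability-driven spread model can be written as a synchronous lattice update; the states, transition probabilities, and torus topology here are illustrative assumptions, not the paper's calibrated model:

```python
import random

S, I, R = "S", "I", "R"  # susceptible, infected, drug-resistant

def neighbours(grid, x, y):
    """Four nearest neighbours on a torus (wrap-around lattice)."""
    n = len(grid)
    return [grid[(x + dx) % n][(y + dy) % n]
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

def tb_step(grid, rng, p_infect=0.25, p_cure=0.10, p_resist=0.02):
    """Synchronous update: each sick neighbour infects a susceptible
    agent with probability p_infect; treatment cures an infected agent
    with probability p_cure, but with probability p_resist selects a
    resistant strain instead."""
    n = len(grid)
    new = [row[:] for row in grid]
    for x in range(n):
        for y in range(n):
            if grid[x][y] == S:
                sick = sum(v in (I, R) for v in neighbours(grid, x, y))
                if any(rng.random() < p_infect for _ in range(sick)):
                    new[x][y] = I
            elif grid[x][y] == I:
                r = rng.random()
                if r < p_resist:
                    new[x][y] = R  # resistance emerges under treatment
                elif r < p_resist + p_cure:
                    new[x][y] = S  # successful cure
    return new
```

    Raising the treatment rate p_cure while keeping p_resist fixed mimics the treatment regimes the paper varies: more treatment suppresses the sensitive strain but gives the resistant one more opportunities to emerge.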

  5. Graceful Failure and Societal Resilience Analysis Via Agent-Based Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Schopf, P. S.; Cioffi-Revilla, C.; Rogers, J. D.; Bassett, J.; Hailegiorgis, A. B.

    2014-12-01

    Agent-based social modeling is opening up new methodologies for the study of societal response to weather and climate hazards, and providing measures of resiliency that can be studied in many contexts, particularly in coupled human and natural-technological systems (CHANTS). Since CHANTS are complex adaptive systems, societal resiliency may or may not occur, depending on dynamics that lack closed-form solutions. Agent-based modeling has been shown to provide a viable theoretical and methodological approach for analyzing and understanding disasters and societal resiliency in CHANTS. Our approach advances the science of societal resilience through computational modeling and simulation methods that complement earlier statistical and mathematical approaches. We present three case studies of social dynamics modeling that demonstrate the use of these agent-based models. In Central Asia, we examine multiple ensemble simulations with varying climate statistics to see how droughts and zuds affect populations, the transmission of wealth across generations, and the overall structure of the social system. In Eastern Africa, we explore how successive episodes of drought affect the adaptive capacity of rural households. Human displacement, mainly rural-to-urban migration, and livelihood transitions, particularly from pastoralism to farming, are observed as rural households interact dynamically with the biophysical environment and continually adjust their behavior to accommodate changes in climate. In the far-north case, we demonstrate one of the first successful attempts to model the complete climate-permafrost-infrastructure-societal interaction network as a complex adaptive system/CHANTS, implemented as a ``federated'' agent-based model using evolutionary computation. Analysis of population changes resulting from extreme weather across these and other cases provides evidence for the emergence of new steady states and shifting patterns of resilience.

  6. Agent-based modeling: Methods and techniques for simulating human systems

    PubMed Central

    Bonabeau, Eric

    2002-01-01

    Agent-based modeling is a powerful simulation modeling technique that has seen a number of applications in the last few years, including applications to real-world business problems. After the basic principles of agent-based simulation are briefly introduced, its four areas of application are discussed by using real-world applications: flow simulation, organizational simulation, market simulation, and diffusion simulation. For each category, one or several business applications are described and analyzed. PMID:12011407

  7. A Systematic Review of Agent-Based Modelling and Simulation Applications in the Higher Education Domain

    ERIC Educational Resources Information Center

    Gu, X.; Blackmore, K. L.

    2015-01-01

    This paper presents the results of a systematic review of agent-based modelling and simulation (ABMS) applications in the higher education (HE) domain. Agent-based modelling is a "bottom-up" modelling paradigm in which system-level behaviour (macro) is modelled through the behaviour of individual local-level agent interactions (micro).…

  8. AN AGENT-BASED SIMULATION STUDY OF A COMPLEX ADAPTIVE COLLABORATION NETWORK

    SciTech Connect

    Ozmen, Ozgur; Smith, Jeffrey; Yilmaz, Levent

    2013-01-01

    One of the most significant problems in organizational scholarship is to discern how social collectives govern, organize, and coordinate the actions of individuals to achieve collective outcomes. These collectives are usually interpreted as complex adaptive systems (CAS). An understanding of CAS is more likely to arise with the help of computer-based simulations. In this tutorial, using an agent-based modeling approach, a complex adaptive social communication network model is introduced. The objective is to present the underlying dynamics of the system in the form of a computer simulation that enables analyzing the impacts of various mechanisms on network topologies and emergent behaviors. The ultimate goal is to further our understanding of the dynamics of the system and facilitate developing informed policies for decision-makers.

  9. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery

    PubMed Central

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level. PMID:26963526

  10. Computational Research on Mobile Pastoralism Using Agent-Based Modeling and Satellite Imagery.

    PubMed

    Sakamoto, Takuto

    2016-01-01

    Dryland pastoralism has long attracted considerable attention from researchers in diverse fields. However, rigorous formal study is made difficult by the high level of mobility of pastoralists as well as by the sizable spatio-temporal variability of their environment. This article presents a new computational approach for studying mobile pastoralism that overcomes these issues. Combining multi-temporal satellite images and agent-based modeling allows a comprehensive examination of pastoral resource access over a realistic dryland landscape with unpredictable ecological dynamics. The article demonstrates the analytical potential of this approach through its application to mobile pastoralism in northeast Nigeria. Employing more than 100 satellite images of the area, extensive simulations are conducted under a wide array of circumstances, including different land-use constraints. The simulation results reveal complex dependencies of pastoral resource access on these circumstances along with persistent patterns of seasonal land use observed at the macro level. PMID:26963526

  11. Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks.

    PubMed

    Walpole, J; Chappell, J C; Cluceru, J G; Mac Gabhann, F; Bautch, V L; Peirce, S M

    2015-09-01

    Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes, a computational approach that incorporates both influences may afford additional insight into the underlying biological mechanisms that give rise to emergent system properties. We apply such a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with the selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches over purely stochastic methods for modeling the complex mechanisms underlying sprouting angiogenesis. PMID:26158406
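
The comparison made here (a rule-informed agent model versus a purely stochastic one) can be illustrated with a toy sprout-initiation experiment; the stimulus profile, noise level, and "correct" site below are invented for illustration and are not from the paper:

```python
import random

random.seed(0)
# Hypothetical VEGF-like stimulus along a 10-cell parent vessel; cell 7 is
# taken to be the biologically "correct" sprout initiation site.
signal = [0.1, 0.1, 0.2, 0.2, 0.3, 0.3, 0.4, 1.0, 0.4, 0.2]
TRUE_SITE, TRIALS = 7, 2000

def rule_based_pick():
    # Rule-informed agent: respond to the stimulus, perturbed by cell-level noise.
    noisy = [s + random.gauss(0, 0.1) for s in signal]
    return max(range(len(signal)), key=noisy.__getitem__)

def stochastic_pick():
    # Purely stochastic baseline: any cell sprouts with equal probability.
    return random.randrange(len(signal))

rule_hits = sum(rule_based_pick() == TRUE_SITE for _ in range(TRIALS)) / TRIALS
mc_hits = sum(stochastic_pick() == TRUE_SITE for _ in range(TRIALS)) / TRIALS
```

As in the paper's comparison, the rule-informed picker recovers the correct location far more often than the uniform stochastic baseline.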

  12. Combining Computational Fluid Dynamics and Agent-Based Modeling: A New Approach to Evacuation Planning

    PubMed Central

    Epstein, Joshua M.; Pankajakshan, Ramesh; Hammond, Ross A.

    2011-01-01

    We introduce a novel hybrid of two fields—Computational Fluid Dynamics (CFD) and Agent-Based Modeling (ABM)—as a powerful new technique for urban evacuation planning. CFD is a predominant technique for modeling airborne transport of contaminants, while ABM is a powerful approach for modeling social dynamics in populations of adaptive individuals. The hybrid CFD-ABM method is capable of simulating how large, spatially-distributed populations might respond to a physically realistic contaminant plume. We demonstrate the overall feasibility of CFD-ABM evacuation design, using the case of a hypothetical aerosol release in Los Angeles to explore potential effectiveness of various policy regimes. We conclude by arguing that this new approach can be powerfully applied to arbitrary population centers, offering an unprecedented preparedness and catastrophic event response tool. PMID:21687788

  13. Learning Natural Selection in 4th Grade with Multi-Agent-Based Computational Models

    ERIC Educational Resources Information Center

    Dickes, Amanda Catherine; Sengupta, Pratim

    2013-01-01

    In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem, through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these…

  14. Learning Natural Selection in 4th Grade with Multi-Agent-Based Computational Models

    NASA Astrophysics Data System (ADS)

    Dickes, Amanda Catherine; Sengupta, Pratim

    2013-06-01

    In this paper, we investigate how elementary school students develop multi-level explanations of population dynamics in a simple predator-prey ecosystem through scaffolded interactions with a multi-agent-based computational model (MABM). The term "agent" in an MABM indicates individual computational objects or actors (e.g., cars), and these agents obey simple rules assigned or manipulated by the user (e.g., speeding up, slowing down, etc.). It is the interactions between these agents, based on the rules assigned by the user, that give rise to emergent, aggregate-level behavior (e.g., the formation and movement of a traffic jam). Natural selection is such an emergent phenomenon, which has been shown to be challenging for novices (K-16 students) to understand. Whereas prior research on learning evolutionary phenomena with MABMs has typically focused on high school students and beyond, we investigate how elementary students (4th graders) develop multi-level explanations of some introductory aspects of natural selection—species differentiation and population change—through scaffolded interactions with an MABM that simulates predator-prey dynamics in a simple birds-butterflies ecosystem. We conducted a semi-clinical, interview-based study with ten participants, in which we focused on the following: a) identifying the nature of learners' initial interpretations of salient events or elements of the represented phenomena, b) identifying the roles these interpretations play in the development of their multi-level explanations, and c) identifying how attending to different levels of the relevant phenomena can make different mechanisms explicit to the learners. In addition, our analysis also shows that although there were differences between high- and low-performing students (in terms of being able to explain population-level behaviors) in the pre-test, these differences disappeared in the post-test.

  15. An agent-based computational model for tuberculosis spreading on age-structured populations

    NASA Astrophysics Data System (ADS)

    Graciani Rodrigues, C. C.; Espíndola, Aquino L.; Penna, T. J. P.

    2015-06-01

    In this work we present an agent-based computational model to study the spreading of the tuberculosis (TB) disease in age-structured populations. The proposed model is a merger of two previous models: an agent-based computational model for the spreading of tuberculosis and a bit-string model of biological aging. Combining TB with population aging reproduces the coexistence of health states seen in real populations. In addition, the universal exponential behavior of mortality curves is still preserved. Finally, the population distribution as a function of age shows the prevalence of TB mostly in elders, for high-efficacy treatments.
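
The bit-string (Penna-type) aging component can be sketched as follows; all parameter values, and the Verhulst crowding term used to bound the population, are illustrative rather than the authors':

```python
import random

random.seed(2)
GENOME_BITS, THRESHOLD, REPRO_AGE = 32, 3, 8
K = 1000  # carrying capacity; a Verhulst factor keeps the population bounded

def child_genome(parent):
    """Copy the parent's bit-string, switching on one random deleterious bit."""
    return parent | (1 << random.randrange(GENOME_BITS))

population = [{"age": 0, "genome": 0} for _ in range(200)]

for _ in range(50):  # simulated years
    n = len(population)
    survivors = []
    for ind in population:
        if random.random() < n / K:
            continue                      # death from crowding (Verhulst)
        ind["age"] += 1
        # Deleterious mutations whose activation age has been reached.
        active = bin(ind["genome"] & ((1 << ind["age"]) - 1)).count("1")
        if ind["age"] < GENOME_BITS and active < THRESHOLD:
            survivors.append(ind)
            if ind["age"] >= REPRO_AGE:   # reproduce with one new mutation
                survivors.append({"age": 0, "genome": child_genome(ind["genome"])})
    population = survivors

oldest = max(ind["age"] for ind in population)
```

Because an individual dies once THRESHOLD active mutations accumulate, the age distribution naturally produces the steep old-age mortality the abstract refers to; coupling a disease state to each agent is then a straightforward extension.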

  16. Method for distributed agent-based non-expert simulation of manufacturing process behavior

    DOEpatents

    Ivezic, Nenad; Potok, Thomas E.

    2004-11-30

    A method for distributed agent-based non-expert simulation of manufacturing process behavior on a single-processor computer comprises the steps of: object modeling a manufacturing technique having a plurality of processes; associating a distributed agent with each process; and programming each agent to respond to discrete events corresponding to the manufacturing technique, wherein each discrete event triggers a programmed response. The method can further comprise the step of transmitting the discrete events to each agent in a message loop. In addition, the programming step comprises the step of conditioning each agent to respond to a discrete event selected from the group consisting of a clock tick message, a resources received message, and a request for output production message.
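
A sketch of this message-loop design, with one agent per process conditioned to respond to the three event types named in the claim; the process names, cycle times, and quantities are made up for illustration:

```python
# Each process in the modeled manufacturing technique gets its own agent, and
# a message loop delivers discrete events to every agent in turn.
class ProcessAgent:
    def __init__(self, name, cycle_ticks):
        self.name, self.cycle_ticks = name, cycle_ticks
        self.ticks, self.stock, self.output = 0, 0, 0

    def handle(self, event, payload=0):
        if event == "clock_tick":
            self.ticks += 1
            if self.ticks % self.cycle_ticks == 0 and self.stock > 0:
                self.stock -= 1
                self.output += 1          # one unit finished this cycle
        elif event == "resources_received":
            self.stock += payload         # raw material delivered
        elif event == "request_output":
            produced, self.output = self.output, 0
            return produced               # ship finished units downstream

agents = [ProcessAgent("cutting", 2), ProcessAgent("welding", 3)]
for a in agents:
    a.handle("resources_received", 5)
for tick in range(12):                    # the message loop
    for a in agents:
        a.handle("clock_tick")
shipped = {a.name: a.handle("request_output") for a in agents}
```

With 5 units of stock, the 2-tick "cutting" agent finishes 5 units in 12 ticks while the 3-tick "welding" agent finishes 4, showing how each discrete event triggers its programmed response.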

  17. Agent-Based Knowledge Discovery for Modeling and Simulation

    SciTech Connect

    Haack, Jereme N.; Cowell, Andrew J.; Marshall, Eric J.; Fligg, Alan K.; Gregory, Michelle L.; McGrath, Liam R.

    2009-09-15

    This paper describes an approach to using agent technology to extend the automated discovery mechanism of the Knowledge Encapsulation Framework (KEF). KEF is a suite of tools to enable the linking of knowledge inputs (relevant, domain-specific evidence) to modeling and simulation projects, as well as other domains that require an effective collaborative workspace for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a semantic wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  18. Parallel Agent-Based Simulations on Clusters of GPUs and Multi-Core Processors

    SciTech Connect

    Aaby, Brandon G; Perumalla, Kalyan S; Seal, Sudip K

    2010-01-01

    An effective latency-hiding mechanism is presented for the parallelization of agent-based model simulations (ABMS) with millions of agents. The mechanism is designed to accommodate the hierarchical organization as well as the heterogeneity of current state-of-the-art parallel computing platforms. We use it to explore the computation-vs.-communication trade-off continuum available with the deep computational and memory hierarchies of extant platforms, and present a novel analytical model of the trade-off. We describe our implementation and report preliminary performance results on two distinct parallel platforms suitable for ABMS: CUDA threads on multiple, networked graphical processing units (GPUs), and pthreads on multi-core processors. The Message Passing Interface (MPI) is used for inter-GPU as well as inter-socket communication on a cluster of multiple GPUs and multi-core processors. Results indicate the benefits of our latency-hiding scheme, delivering more than 100-fold improvement in runtime for certain benchmark ABMS application scenarios with several million agents. This speed improvement is obtained on a system that is already two to three orders of magnitude faster on one GPU than an equivalent CPU-based execution in a popular Java simulator. Thus, the overall execution of our current work is over four orders of magnitude faster when executed on multiple GPUs.

  19. A Scaffolding Framework to Support Learning of Emergent Phenomena Using Multi-Agent-Based Simulation Environments

    NASA Astrophysics Data System (ADS)

    Basu, Satabdi; Sengupta, Pratim; Biswas, Gautam

    2015-04-01

    Students from middle school to college have difficulties in interpreting and understanding complex systems such as ecological phenomena. Researchers have suggested that students experience difficulties in reconciling the relationships between individuals, populations, and species, as well as the interactions between organisms and their environment in the ecosystem. Multi-agent-based computational models (MABMs) can explicitly capture agents and their interactions by representing individual actors as computational objects with assigned rules. As a result, the collective aggregate-level behavior of the population dynamically emerges from simulations that aggregate these interactions. Past studies have used a variety of scaffolds to help students learn ecological phenomena. Yet, there is no theoretical framework that supports the systematic design of scaffolds to aid students' learning in MABMs. Our paper addresses this issue by proposing a comprehensive framework for the design, analysis, and evaluation of scaffolding to support students' learning of ecology in an MABM. We present a study in which middle school students used an MABM to investigate and learn about a desert ecosystem. We identify the different types of scaffolds needed to support inquiry learning activities in this simulation environment and use our theoretical framework to demonstrate the effectiveness of our scaffolds in helping students develop a deep understanding of the complex ecological behaviors represented in the simulation.

  20. Efficient Allocation of Resources for Defense of Spatially Distributed Networks Using Agent-Based Simulation.

    PubMed

    Kroshl, William M; Sarkani, Shahram; Mazzuchi, Thomas A

    2015-09-01

    This article presents ongoing research that focuses on the efficient allocation of defense resources to minimize the damage inflicted on a spatially distributed physical network, such as a pipeline, water system, or power distribution system, by an attack from an active adversary. This framing recognizes the fundamental difference between preparing for natural disasters, such as hurricanes, earthquakes, or even accidental system failures, and allocating resources to defend against an opponent who is aware of, and anticipating, the defender's efforts to mitigate the threat. Our approach is to utilize a combination of integer programming and agent-based modeling to allocate the defensive resources. We conceptualize the problem as a Stackelberg "leader-follower" game, where the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, which then informs an evolutionary agent-based simulation. The evolutionary agent-based simulation is used to determine the allocation of resources for attackers and defenders that results in evolutionarily stable strategies, where actions by either side alone cannot increase its share of victories. We demonstrate these techniques on an example network, comparing the evolutionary agent-based results to a more traditional probabilistic risk analysis (PRA) approach. Our results show that the agent-based approach results in a greater percentage of defender victories than does the PRA-based approach. PMID:25683347
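
The leader-follower structure can be illustrated with a toy minimax version that enumerates defender allocations against an attacker best response; the arc damage values are hypothetical, and the actual paper uses integer programming plus evolutionary agent-based simulation rather than brute-force enumeration:

```python
from itertools import combinations

# Hypothetical network arcs with the damage incurred if each is successfully hit.
damage = {"a": 10, "b": 8, "c": 5, "d": 2}
DEFENDER_BUDGET = 2   # arcs the defender can protect
ATTACKER_BUDGET = 1   # arcs the attacker can strike

def attacker_best_response(defended):
    """Follower: hit the highest-damage undefended arcs."""
    open_arcs = sorted((d for a, d in damage.items() if a not in defended),
                       reverse=True)
    return sum(open_arcs[:ATTACKER_BUDGET])

# Leader: enumerate defense allocations, keep the one minimizing worst-case loss.
best = min(combinations(damage, DEFENDER_BUDGET), key=attacker_best_response)
worst_case = attacker_best_response(best)
```

Defending the two most critical arcs leaves the attacker only the third-ranked target, which is exactly the interdiction logic that seeds the evolutionary simulation described above.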

  1. An agent-based simulation of extirpation of Ceratitis capitata applied to invasions in California

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We describe and validate an Agent-Based Simulation (ABS) of invasive insects and use it to investigate the time to extirpation of Ceratitis capitata using data from seven outbreaks that occurred in California from 2008-2010. Results are compared with the length of intervention and quarantine imposed ...

  2. Simulating tissue mechanics with agent-based models: concepts, perspectives and some novel results

    NASA Astrophysics Data System (ADS)

    Van Liedekerke, P.; Palm, M. M.; Jagiella, N.; Drasdo, D.

    2015-12-01

    In this paper we present an overview of agent-based models that are used to simulate mechanical and physiological phenomena in cells and tissues, and we discuss underlying concepts, limitations, and future perspectives of these models. As the interest in cell and tissue mechanics increases, agent-based models are becoming more common in the modeling community. We overview the physical aspects, complexity, shortcomings, and capabilities of the major agent-based model categories: lattice-based models (cellular automata, lattice gas cellular automata, cellular Potts models), off-lattice models (center-based models, deformable cell models, vertex models), and hybrid discrete-continuum models. In this way, we hope to assist future researchers in choosing a model for the phenomenon they want to model and understand. The article also contains some novel results.
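
As an illustration of the center-based category mentioned above, here is a toy two-cell model in which overlapping cells repel through a linearized contact spring under overdamped dynamics; all parameters are arbitrary:

```python
import math

# Center-based model sketch: cells are points with a radius, interacting via a
# linear repulsive spring whenever they overlap (overdamped dynamics assumed).
R, K, MOBILITY, DT = 1.0, 5.0, 1.0, 0.01
cells = [[0.0, 0.0], [1.2, 0.0]]   # centres start overlapping (distance < 2R)

def step(cells):
    forces = [[0.0, 0.0] for _ in cells]
    for i in range(len(cells)):
        for j in range(i + 1, len(cells)):
            dx = cells[j][0] - cells[i][0]
            dy = cells[j][1] - cells[i][1]
            dist = math.hypot(dx, dy)
            overlap = 2 * R - dist
            if overlap > 0:            # contact force, linearized here
                fx, fy = K * overlap * dx / dist, K * overlap * dy / dist
                forces[i][0] -= fx; forces[i][1] -= fy
                forces[j][0] += fx; forces[j][1] += fy
    for c, f in zip(cells, forces):
        c[0] += MOBILITY * DT * f[0]   # overdamped: velocity proportional to force
        c[1] += MOBILITY * DT * f[1]

for _ in range(2000):
    step(cells)
separation = cells[1][0] - cells[0][0]
```

The two centres relax until the overlap vanishes, i.e. the separation approaches the contact distance 2R, which is the basic mechanism center-based models use to resolve cell-cell mechanics.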

  3. Applying GIS and high performance agent-based simulation for managing an Old World Screwworm fly invasion of Australia.

    PubMed

    Welch, M C; Kwan, P W; Sajeev, A S M

    2014-10-01

    Agent-based modelling has proven to be a promising approach for developing rich simulations for complex phenomena that provide decision-support functions across a broad range of areas, including the biological, social and agricultural sciences. This paper demonstrates how high performance computing technologies, namely General-Purpose Computing on Graphics Processing Units (GPGPU), and commercial Geographic Information Systems (GIS) can be applied to develop a national-scale, agent-based simulation of an incursion of Old World Screwworm fly (OWS fly) into the Australian mainland. The development of this simulation model leverages the combination of massively data-parallel processing capabilities supported by NVidia's Compute Unified Device Architecture (CUDA) and the advanced spatial visualisation capabilities of GIS. These technologies have enabled the implementation of an individual-based, stochastic lifecycle and dispersal algorithm for the OWS fly invasion. The simulation model draws upon a wide range of biological data as input to stochastically determine the reproduction and survival of the OWS fly through the different stages of its lifecycle and the dispersal of gravid females. Through this model, a highly efficient computational platform has been developed through which studies of the effectiveness of control and mitigation strategies, and of their associated economic impact on livestock industries, can be materialised. PMID:24705073

  4. A Participatory Agent-Based Simulation for Indoor Evacuation Supported by Google Glass.

    PubMed

    Sánchez, Jesús M; Carrera, Álvaro; Iglesias, Carlos Á; Serrano, Emilio

    2016-01-01

    Indoor evacuation systems are needed for rescue and safety management. One of the challenges is to provide users with personalized evacuation routes in real time. To this end, this project aims at exploring the possibilities of Google Glass technology for participatory multiagent indoor evacuation simulations. Participatory multiagent simulation combines scenario-guided agents and humans equipped with Google Glass that coexist in a shared virtual space and jointly perform simulations. The paper proposes an architecture for participatory multiagent simulation in order to combine devices (Google Glass and/or smartphones) with an agent-based social simulator and indoor tracking services. PMID:27563911

  5. Use of agent-based simulations to design and interpret HIV clinical trials.

    PubMed

    Cuadros, Diego F; Abu-Raddad, Laith J; Awad, Susanne F; García-Ramos, Gisela

    2014-07-01

    In this study, we illustrate the utility of an agent-based simulation to inform a trial design and how this supports outcome interpretation of randomized controlled trials (RCTs). We developed agent-based Monte Carlo models to simulate existing landmark HIV RCTs, such as the Partners in Prevention HSV/HIV Transmission Study. We simulated a variation of this study using valacyclovir therapy as the intervention, and we used a male circumcision RCT based on the Rakai Male Circumcision Trial. Our results indicate that only a small fraction (20%) of the simulated Partners in Prevention HSV/HIV Transmission Study realizations rejected the null hypothesis of no effect from the intervention. Our results also suggest that an RCT designed to evaluate the effectiveness of a more potent drug regimen for HSV-2 suppression (valacyclovir therapy) is more likely to identify the efficacy of the intervention. For the male circumcision RCT simulation, the greater biological effect of male circumcision yielded a major fraction (81%) of RCT realizations that rejected the null hypothesis of no effect from the intervention. Our study highlights how agent-based simulations synthesize individual variation in the epidemiological context of the RCT. This methodology will be particularly useful for designing RCTs aimed at evaluating combination prevention interventions in community-based RCTs, wherein an intervention's effectiveness is challenging to predict. PMID:24792492
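
The core idea (simulating many trial realizations and counting how often the null hypothesis is rejected) can be sketched with a simplified Bernoulli-outcome RCT; the sample sizes, incidence, and efficacy values below are illustrative, not those of the cited trials:

```python
import math
import random

random.seed(3)

def simulate_trial(n_per_arm, p_control, efficacy):
    """One RCT realization: compare infection counts with a two-proportion z-test."""
    p_treat = p_control * (1 - efficacy)
    c = sum(random.random() < p_control for _ in range(n_per_arm))
    t = sum(random.random() < p_treat for _ in range(n_per_arm))
    p_pool = (c + t) / (2 * n_per_arm)
    se = math.sqrt(2 * p_pool * (1 - p_pool) / n_per_arm)
    z = (c - t) / (n_per_arm * se) if se else 0.0
    return z > 1.645  # one-sided test at alpha = 0.05 rejects the null

def power(efficacy, trials=500):
    """Fraction of simulated realizations that reject the null hypothesis."""
    return sum(simulate_trial(500, 0.10, efficacy) for _ in range(trials)) / trials

weak, strong = power(0.2), power(0.5)
```

A weak intervention rejects the null in only a minority of realizations while a strong one does so in most, mirroring the 20% versus 81% contrast reported above.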

  6. Comparing stochastic differential equations and agent-based modelling and simulation for early-stage cancer.

    PubMed

    Figueredo, Grazziela P; Siebers, Peer-Olaf; Owen, Markus R; Reps, Jenna; Aickelin, Uwe

    2014-01-01

    There is great potential to be explored regarding the use of agent-based modelling and simulation as an alternative paradigm to investigate early-stage cancer interactions with the immune system. It does not suffer from some limitations of ordinary differential equation models, such as the lack of stochasticity, and it represents individual behaviours rather than aggregates, as well as individual memory. In this paper we investigate the potential contribution of agent-based modelling and simulation when contrasted with stochastic versions of ODE models using early-stage cancer examples. We seek answers to the following questions: (1) Does this new stochastic formulation produce similar results to the agent-based version? (2) Can these methods be used interchangeably? (3) Do agent-based model outcomes reveal any benefit when compared to the Gillespie results? To answer these research questions we investigate three well-established mathematical models describing interactions between tumour cells and immune elements. These case studies were re-conceptualised under an agent-based perspective and also converted to the Gillespie algorithm formulation. Our interest in this work, therefore, is to establish a methodological discussion regarding the usability of different simulation approaches, rather than to provide further biological insights into the investigated case studies. Our results show that it is possible to obtain equivalent models that implement the same mechanisms; however, the incapacity of the Gillespie algorithm to retain individual memory of past events affects the similarity of some results. Furthermore, the emergent behaviour of ABMS produces extra patterns of behaviour in the system, which were not obtained by the Gillespie algorithm. PMID:24752131
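
For reference, the Gillespie algorithm used as the comparison baseline above can be sketched for a simple SIS epidemic; the rates and population size are illustrative, and the paper's tumour-immune models are of course more involved:

```python
import math
import random

random.seed(4)

def gillespie_sis(n_infected, n_total, beta, gamma, t_end):
    """Gillespie SSA for an SIS epidemic: exact stochastic trajectories."""
    t, i = 0.0, n_infected
    while t < t_end and 0 < i:
        s = n_total - i
        rates = [beta * s * i / n_total, gamma * i]  # infection, recovery
        total = sum(rates)
        t += -math.log(random.random()) / total      # exponential waiting time
        if random.random() * total < rates[0]:
            i += 1                                   # infection event
        else:
            i -= 1                                   # recovery event
    return i

finals = [gillespie_sis(5, 200, beta=1.0, gamma=0.5, t_end=50) for _ in range(100)]
```

Each call produces one exact stochastic trajectory from the reaction rates alone; note that the state is only the aggregate count `i`, which is precisely why per-individual memory of past events cannot be represented in this formulation.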

  7. Agent-based simulation of building evacuation using a grid graph-based model

    NASA Astrophysics Data System (ADS)

    Tan, L.; Lin, H.; Hu, M.; Che, W.

    2014-02-01

    Shifting from macroscopic to microscopic models, the agent-based approach has been widely used to model crowd evacuation as more attention is paid to individualized behaviour. Since indoor evacuation behaviour is closely related to the spatial features of the building, effective representation of indoor space is essential for the simulation of building evacuation. The traditional cell-based representation has limitations in reflecting spatial structure and is not suitable for topology analysis. Aiming at incorporating the powerful topology analysis functions of GIS to facilitate agent-based simulation of building evacuation, we used a grid graph-based model in this study to represent the indoor space. Such a model allows us to establish an evacuation network at a micro level. Potential escape routes from each node can thus be analysed through GIS network analysis functions, considering both the spatial structure and route capacity. This better supports agent-based modelling of evacuees' behaviour, including route choice and local movements. As a case study, we conducted a simulation of emergency evacuation from the second floor of an office building using Agent Analyst as the simulation platform. The results demonstrate the feasibility of the proposed method, as well as the potential of GIS in visualizing and analysing simulation results.
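
The grid graph idea can be sketched with a multi-source breadth-first search that assigns every walkable cell a shortest route to an exit; the floor plan below is invented, and a GIS network analysis would add route capacities on top of this:

```python
from collections import deque

# Grid graph sketch: '#' walls, '.' walkable cells, 'E' exits. BFS from all
# exits gives each cell its distance and next hop along a shortest route.
plan = ["#########",
        "#.......#",
        "#.###.#.#",
        "E.....#.#",
        "#.###...E",
        "#########"]

dist, parent = {}, {}
queue = deque()
for r, row in enumerate(plan):
    for c, ch in enumerate(row):
        if ch == "E":
            dist[(r, c)] = 0
            queue.append((r, c))

while queue:                       # multi-source BFS over walkable cells
    r, c = queue.popleft()
    for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
        if 0 <= nr < len(plan) and 0 <= nc < len(plan[0]) \
                and plan[nr][nc] != "#" and (nr, nc) not in dist:
            dist[(nr, nc)] = dist[(r, c)] + 1
            parent[(nr, nc)] = (r, c)  # next hop toward the nearest exit
            queue.append((nr, nc))

route_len = dist[(1, 1)]           # evacuee agent standing at cell (1, 1)
```

An agent at any cell simply follows the `parent` pointers to reach its nearest exit, which is the per-node escape-route information the grid graph model provides to the behaviour simulation.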

  8. Biophysically Realistic Filament Bending Dynamics in Agent-Based Biological Simulation

    PubMed Central

    Alberts, Jonathan B.

    2009-01-01

    An appealing tool for the study of the complex biological behaviors that can emerge from networks of simple molecular interactions is an agent-based computational simulation that explicitly tracks small-scale local interactions, following thousands to millions of states through time. For many critical cell processes (e.g. cytokinetic furrow specification, nuclear centration, cytokinesis), the flexible nature of cytoskeletal filaments is likely to be critical. Any computer model that hopes to explain the complex emergent behaviors in these processes therefore needs to encode filament flexibility in a realistic manner. Here I present a numerically convenient and biophysically realistic method for modeling cytoskeletal filament flexibility in silico. Each cytoskeletal filament is represented by a series of rigid segments linked end-to-end in series, with a variable attachment point for the translational elastic element. This connection scheme allows empirical tuning, for a wide range of segment sizes, viscosities, and time-steps, that endows any filament species with the experimentally observed (or theoretically expected) static force deflection, relaxation time-constant, and thermal writhing motions. I additionally employ a unique pair of elastic elements (one representing the axial and the other the bending rigidity) that formulates the restoring force in terms of single time-step constraint resolution. This method is highly local (adjacent rigid segments of a filament interact with one another only through constraint forces) and is thus well suited to simulations in which arbitrary additional forces (e.g. those representing interactions of a filament with other bodies, or cross-links/entanglements between filaments) may be present. Implementation in code is straightforward; Java source code is available at www.celldynamics.org. PMID:19283085
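
A much-simplified bead-and-spring analogue of such a filament (not the author's constraint-resolution scheme) pairs an axial spring between neighbours with a bending restoring force on the interior bead, integrated with overdamped dynamics; all constants are arbitrary:

```python
import math

# Three-bead filament sketch: stretch springs hold segment length near L0,
# a bending term pulls the middle bead toward the chord midpoint.
K_STRETCH, K_BEND, L0, DT = 100.0, 5.0, 1.0, 1e-3
beads = [[0.0, 0.0], [1.0, 0.3], [2.0, 0.0]]   # middle bead displaced (bent)

def step(beads):
    f = [[0.0, 0.0] for _ in beads]
    # Axial (stretch) elastic elements between adjacent beads.
    for i in range(len(beads) - 1):
        dx = beads[i + 1][0] - beads[i][0]
        dy = beads[i + 1][1] - beads[i][1]
        d = math.hypot(dx, dy)
        fs = K_STRETCH * (d - L0)
        f[i][0] += fs * dx / d;     f[i][1] += fs * dy / d
        f[i + 1][0] -= fs * dx / d; f[i + 1][1] -= fs * dy / d
    # Bending elastic element: restoring force toward the straight configuration.
    mx = (beads[0][0] + beads[2][0]) / 2
    my = (beads[0][1] + beads[2][1]) / 2
    f[1][0] += K_BEND * (mx - beads[1][0])
    f[1][1] += K_BEND * (my - beads[1][1])
    for b, fb in zip(beads, f):
        b[0] += DT * fb[0]          # overdamped (velocity proportional to force)
        b[1] += DT * fb[1]

for _ in range(20000):
    step(beads)
sag = beads[1][1] - (beads[0][1] + beads[2][1]) / 2
```

The bent filament relaxes back to a straight chain of unit-length segments, with a relaxation rate set by the bending stiffness; adding thermal kicks to each bead would reproduce writhing motions in the same framework.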

  9. Comparing large-scale computational approaches to epidemic modeling: agent based versus structured metapopulation models

    NASA Astrophysics Data System (ADS)

    Gonçalves, Bruno; Ajelli, Marco; Balcan, Duygu; Colizza, Vittoria; Hu, Hao; Ramasco, José; Merler, Stefano; Vespignani, Alessandro

    2010-03-01

    We provide, for the first time, a side-by-side comparison of the results obtained with a stochastic agent-based model and a structured metapopulation stochastic model for the evolution of a baseline pandemic event in Italy. The agent-based model is based on an explicit representation of the Italian population through highly detailed data on its socio-demographic structure. The metapopulation simulations use the GLobal Epidemic and Mobility (GLEaM) model, based on high-resolution census data worldwide, and integrate airline travel flow data with short-range human mobility patterns at the global scale. Both models provide epidemic patterns that are in very good agreement at the granularity levels accessible by both approaches, with differences in peak timing of the order of a few days. The age breakdown analysis shows that similar attack rates are obtained for the younger age classes.

  10. An Agent-Based Model of New Venture Creation: Conceptual Design for Simulating Entrepreneurship

    NASA Technical Reports Server (NTRS)

    Provance, Mike; Collins, Andrew; Carayannis, Elias

    2012-01-01

    There is a growing debate over the means by which regions can foster the growth of entrepreneurial activity in order to stimulate recovery and growth of their economies. On one side, agglomeration theory suggests that regions grow because of strong clusters that foster knowledge spillover locally; on the other side, the entrepreneurial action camp argues that innovative business models are generated by entrepreneurs with unique market perspectives who draw on knowledge from more distant domains. We present the design of a novel agent-based model of new venture creation that demonstrates the relationship between agglomeration and action. The primary focus of this model is information exchange as the medium for agent interactions. Our modeling and simulation study proposes to reveal interesting relationships in these perspectives, offer a foundation on which these disparate theories from economics and sociology can find common ground, and expand the use of agent-based modeling into entrepreneurship research.

  11. Collaborative Multi-Agent Based Simulations: Stakeholder-Focused Innovation in Water Resources Management and Decision-Support Modeling

    NASA Astrophysics Data System (ADS)

    Kock, B. E.

    2006-12-01

    The combined use of multi-agent based simulations and collaborative modeling approaches is emerging as a highly effective tool for representing complex coupled social-biophysical water resource systems. A collaboratively-designed, multi-agent based simulation can be used both as a decision-support tool and as a didactic method for improving stakeholder understanding and engagement with water resources policymaking and management. Major technical and non-technical obstacles remain to the efficient and effective development of multi-agent models of human society, to integrating these models with GIS and other numerical models, and to building a process for engaging stakeholders with model design, implementation and use. It is proposed here to tackle some of these obstacles through a collaborative multi-agent based simulation process framework, intended for practical use in resolving disputes and environmental challenges over sustainable irrigated agriculture in the Western United States. A practical implementation of this framework will be conducted in collaboration with a diverse stakeholder group representing farmers and local, state and federal water managers. Through the use of simulation gaming, interviewing and computer-based knowledge elicitation, a multi-agent model representing local and regional social dynamics will be developed to support the acceptable and sustainable implementation of management alternatives for reducing regional problems of salinization and high selenium concentrations in soils and irrigation water. The development of a socially and scientifically credible simulation platform in this setting can make a significant contribution to ensuring the non-adversarial use of high quality science, enhance the engagement of stakeholders with policymaking, and help meet the challenges of integrating dynamic models of human society with more traditional biophysical systems models.

  12. Using an agent-based model to simulate children’s active travel to school

    PubMed Central

    2013-01-01

    Background Despite the multiple advantages of active travel to school, only a small percentage of US children and adolescents walk or bicycle to school. Intervention studies are at a relatively early stage, and evidence of their effectiveness over long periods is limited. The purpose of this study was to illustrate the utility of agent-based models in exploring how various policies may influence children’s active travel to school. Methods An agent-based model was developed to simulate children’s school travel behavior within a hypothetical city. The model was used to explore the plausible implications of policies targeting two established barriers to active school travel: long distance to school and traffic safety. The percent of children who walk to school was compared across various scenarios. Results To maximize the percent of children who walk to school, school locations should be evenly distributed over space and children should be assigned to the closest school. In the case of interventions to improve traffic safety, targeting a smaller area around the school with greater intensity may be more effective than targeting a larger area with less intensity. Conclusions Despite the challenges they present, agent-based models are a useful complement to other analytical strategies in studying the plausible impact of various policies on active travel to school. PMID:23705953
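
    The nearest-school assignment and distance barrier described in this abstract can be sketched as a simple decision rule. This is a hedged illustration only: the paper's actual decision model and parameters are not given here, and the 1.6 km walking threshold is an assumption of the sketch.

```python
import math

def nearest_school(home, schools):
    """Assign a child to the closest school (straight-line distance)."""
    return min(schools, key=lambda s: math.dist(home, s))

def walk_share(homes, schools, max_walk_km=1.6):
    """Fraction of children who walk: those whose nearest school lies
    within an assumed maximum walking distance (in km)."""
    walkers = sum(
        1 for home in homes
        if math.dist(home, nearest_school(home, schools)) <= max_walk_km
    )
    return walkers / len(homes)
```

    Under this rule, evenly spreading school locations shortens the distance to each child's nearest school and so raises the walking share, which is the qualitative result the abstract reports.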

  13. An operational epidemiological model for calibrating agent-based simulations of pandemic influenza outbreaks.

    PubMed

    Prieto, D; Das, T K

    2016-03-01

    Uncertainty about pandemic influenza viruses continues to cause major preparedness challenges for public health policymakers. Decisions to mitigate influenza outbreaks often involve a tradeoff between the social costs of interventions (e.g., school closure) and the cost of uncontrolled spread of the virus. To achieve a balance, policymakers must assess the impact of mitigation strategies once an outbreak begins and the virus characteristics are known. Agent-based (AB) simulation is a useful tool for building highly granular disease spread models that incorporate the epidemiological features of the virus as well as the demographic and social behavioral attributes of tens of millions of affected people. Such disease spread models provide an excellent basis on which various mitigation strategies can be tested before they are adopted and implemented by policymakers. However, to serve as a testbed for mitigation strategies, AB simulation models must be operational. A critical requirement for operational AB models is that they be amenable to quick and simple calibration. The calibration process works as follows: the AB model accepts information available from the field and uses it to update its parameters such that some of its outputs in turn replicate the field data. In this paper, we present our epidemiological-model-based calibration methodology, which has low computational complexity and is easy to interpret. Our model accepts a field estimate of the basic reproduction number and then uses it to update (calibrate) the infection probabilities such that their effect, combined with the effects of the given virus epidemiology, demographics, and social behavior, results in an infection pattern yielding a similar value of the basic reproduction number. We evaluate the accuracy of the calibration methodology by applying it to an AB simulation model mimicking a regional outbreak in the US. The calibrated model is shown to yield infection patterns closely replicating
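
    The calibration loop this abstract describes (accept a field R0 estimate, adjust infection probabilities until the simulated R0 matches) can be sketched generically. Assuming, as the abstract implies, that the simulated reproduction number increases with the infection probability, a bisection search suffices; the surrogate `toy_r0` below stands in for a full simulation run and is purely illustrative, not the authors' epidemiological model.

```python
def calibrate_infection_probability(target_r0, simulated_r0, lo=0.0, hi=1.0):
    """Bisection search for the infection probability p such that the
    simulated basic reproduction number matches the field estimate.
    Assumes simulated_r0(p) is monotonically increasing in p."""
    for _ in range(60):  # 60 halvings shrink the interval below 1e-18
        mid = 0.5 * (lo + hi)
        if simulated_r0(mid) < target_r0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative surrogate: R0 = contacts/day * transmission prob. * infectious days
def toy_r0(p):
    return 10 * p * 5

p = calibrate_infection_probability(1.4, toy_r0)  # ~0.028
```

    In an operational setting each evaluation of `simulated_r0` would be a (costly) simulation run, which is why the paper favors a low-complexity, model-based calibration over brute-force search.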

  14. GridLAB-D: An Agent-Based Simulation Framework for Smart Grids

    DOE PAGES

    Chassin, David P.; Fuller, Jason C.; Djilali, Ned

    2014-01-01

    Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.

  15. Quantitative agent-based firm dynamics simulation with parameters estimated by financial and transaction data analysis

    NASA Astrophysics Data System (ADS)

    Ikeda, Yuichi; Souma, Wataru; Aoyama, Hideaki; Iyetomi, Hiroshi; Fujiwara, Yoshi; Kaizoji, Taisei

    2007-03-01

    Firm dynamics on a transaction network is considered from the standpoint of econophysics, agent-based simulations, and game theory. In this model, interacting firms rationally invest in a production facility to maximize net present value. We estimate the parameters used in the model through empirical analysis of financial and transaction data. We propose two different methods (an analytical method and a regression method) to obtain an interaction matrix of firms. On a subset of a real transaction network, we simulate each firm's revenue, cost, and fixed assets, where fixed assets are the accumulated investment in the production facility. The simulation reproduces the quantitative behavior of past revenues and costs within a standard error when we use the interaction matrix estimated by the regression method, in which only transaction pairs are taken into account. Furthermore, the simulation qualitatively reproduces past data on fixed assets.

  17. Agent-Based Simulation for Interconnection-Scale Renewable Integration and Demand Response Studies

    SciTech Connect

    Chassin, David P.; Behboodi, Sahand; Crawford, Curran; Djilali, Ned

    2015-12-23

    This paper collects and synthesizes the technical requirements, implementation, and validation methods for quasi-steady agent-based simulations of interconnection-scale models, with particular attention to the integration of renewable generation and controllable loads. Approaches for modeling aggregated controllable loads are presented and placed in the same control and economic modeling framework as generation resources for interconnection planning studies. Model performance is examined with system parameters that are typical for an interconnection approximately the size of the Western Electricity Coordinating Council (WECC) and a control area about 1/100 the size of the system. These results are used to demonstrate and validate the methods presented.

  18. An Agent-Based Labor Market Simulation with Endogenous Skill-Demand

    NASA Astrophysics Data System (ADS)

    Gemkow, S.

    This paper considers an agent-based labor market simulation to examine the influence of skills on wages and unemployment rates. To this end, less-skilled and highly skilled workers, as well as less-productive and highly productive vacancies, are implemented. The skill distribution is exogenous, whereas the distribution of less- and highly productive vacancies is endogenous. The different opportunities of the skill groups on the labor market are established by skill requirements: a highly productive vacancy can only be filled by a highly skilled unemployed worker. Different skill distributions, which can also be interpreted as skill-biased technological change, are simulated by exogenously incrementing the skill level of highly skilled persons. This simulation also provides a microeconomic foundation for the matching function often used in theoretical approaches.
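
    The skill-requirement rule in this abstract (a highly productive vacancy can only be filled by a highly skilled worker, while a less productive vacancy accepts either) can be sketched as a one-round matching. The greedy order and function shape are illustrative assumptions, not Gemkow's implementation.

```python
def match_round(workers, vacancies):
    """One matching round. workers: list of 'high'/'low' skill levels;
    vacancies: list of 'high'/'low' productivity levels. A 'high'
    vacancy requires a 'high'-skilled worker; a 'low' vacancy takes anyone."""
    open_vacancies = list(vacancies)
    matches, unemployed = [], []
    for skill in workers:
        # First vacancy this worker is eligible for, if any.
        hire = next(
            (v for v in open_vacancies if v == "low" or skill == "high"),
            None,
        )
        if hire is None:
            unemployed.append(skill)
        else:
            open_vacancies.remove(hire)
            matches.append((skill, hire))
    return matches, unemployed
```

    Even this toy rule reproduces the qualitative asymmetry the paper studies: low-skilled workers can be rationed out of the market when the remaining vacancies are all highly productive.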

  19. Security Analysis of Selected AMI Failure Scenarios Using Agent Based Game Theoretic Simulation

    SciTech Connect

    Abercrombie, Robert K; Schlicher, Bob G; Sheldon, Frederick T

    2014-01-01

    Information security analysis can be performed using game theory implemented in dynamic Agent-Based Game Theoretic (ABGT) simulations. Such simulations can be verified against the results of game-theoretic analysis and further used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. We concentrated our analysis on the Advanced Metering Infrastructure (AMI) functional domain, for which the National Electric Sector Cybersecurity Organization Resource (NESCOR) working group has currently documented 29 failure scenarios. The strategy for the game was developed by analyzing five representative electric sector failure scenarios contained in the AMI functional domain. We characterize these five scenarios into three threat categories affecting confidentiality, integrity, and availability (CIA). The analysis using our ABGT simulation demonstrates how to model the AMI functional domain using a set of rationalized game-theoretic rules decomposed from the failure scenarios, in terms of how those scenarios might impact the AMI network with respect to CIA.

  20. Recent Advances in Agent-Based Tsunami Evacuation Simulations: Case Studies in Indonesia, Thailand, Japan and Peru

    NASA Astrophysics Data System (ADS)

    Mas, Erick; Koshimura, Shunichi; Imamura, Fumihiko; Suppasri, Anawat; Muhari, Abdul; Adriano, Bruno

    2015-12-01

    As confirmed by the extreme tsunami events over the last decade (the 2004 Indian Ocean, 2010 Chile and 2011 Japan tsunami events), mitigation measures and effective evacuation planning are needed to reduce disaster risks. Modeling tsunami evacuations is an alternative means to analyze evacuation plans and possible scenarios of evacuees' behaviors. In this paper, practical applications of an agent-based tsunami evacuation model are presented to demonstrate the contributions that agent-based modeling has added to tsunami evacuation simulations and tsunami mitigation efforts. A brief review of previous agent-based evacuation models in the literature is given to highlight recent progress in agent-based methods. Finally, challenges are noted for bridging gaps between geoscience and social science within the agent-based approach for modeling tsunami evacuations.

  1. Bridging the gap: From computational agent-based models to analytical tractability

    NASA Astrophysics Data System (ADS)

    Dyson, Louise; Lafuerza, Luis F.; McKane, Alan J.; Edmonds, Bruce

    2014-03-01

    In order to investigate complex interdependent systems such as those found in the biological and social sciences, one is often left trying to examine complicated, descriptive models. To aid in understanding these, it would be helpful to develop tools for examining how they relate to simpler models with understandable and analysable mechanisms. We describe a way of analysing the formation of a social network in a complex computational model that represents voting patterns in a population of agents who may live, work and form friendships together. Once the network is formed, we examine the spread of "intention to vote" and compare our findings with those found in the descriptive, agent-based model.

  2. Modeling the Information Age Combat Model: An Agent-Based Simulation of Network Centric Operations

    NASA Technical Reports Server (NTRS)

    Deller, Sean; Rabadi, Ghaith A.; Bell, Michael I.; Bowling, Shannon R.; Tolk, Andreas

    2010-01-01

    The Information Age Combat Model (IACM) was introduced by Cares in 2005 to contribute to the development of an understanding of the influence of connectivity on force effectiveness that can eventually lead to quantitative prediction and guidelines for design and employment. The structure of the IACM makes it clear that the Perron-Frobenius eigenvalue is a quantifiable metric with which to measure the organization of a networked force. The results of recent experiments presented in Deller et al. (2009) indicate that the value of the Perron-Frobenius eigenvalue is a significant measurement of the performance of an Information Age combat force. This was accomplished through the innovative use of an agent-based simulation to model the IACM and represents an initial contribution towards a new generation of combat models that are net-centric instead of using the current platform-centric approach. This paper describes the intent, challenges, design, and initial results of this agent-based simulation model.

  3. A Multi Agent-Based Framework for Simulating Household PHEV Distribution and Electric Distribution Network Impact

    SciTech Connect

    Cui, Xiaohui; Liu, Cheng; Kim, Hoe Kyoung; Kao, Shih-Chieh; Tuttle, Mark A; Bhaduri, Budhendra L

    2011-01-01

    The variation of household attributes such as income, travel distance, age, household members, and education for different residential areas may generate different market penetration rates for plug-in hybrid electric vehicles (PHEVs). Residential areas with higher PHEV ownership could increase peak electric demand locally and require utilities to upgrade the electric distribution infrastructure even though the capacity of the regional power grid is under-utilized. Estimating the future PHEV ownership distribution at the residential household level can help us understand the impact of the PHEV fleet on power line congestion, transformer overload and other unforeseen problems at the local residential distribution network level. It can also help utilities manage the timing of recharging demand to maximize load factors and utilization of existing distribution resources. This paper presents a multi agent-based simulation framework for 1) modeling spatial distribution of PHEV ownership at local residential household level, 2) discovering PHEV hot zones where PHEV ownership may quickly increase in the near future, and 3) estimating the impacts of the increasing PHEV ownership on the local electric distribution network with different charging strategies. In this paper, we use Knox County, TN as a case study to show the simulation results of the agent-based model (ABM) framework. However, the framework can be easily applied to other local areas in the US.

  4. Multi-Agent-Based Simulation of a Complex Ecosystem of Mental Health Care.

    PubMed

    Kalton, Alan; Falconer, Erin; Docherty, John; Alevras, Dimitris; Brann, David; Johnson, Kyle

    2016-02-01

    This paper discusses the creation of an Agent-Based Simulation that modeled the introduction of care coordination capabilities into a complex system of care for patients with Serious and Persistent Mental Illness. The model describes the engagement between patients and the medical, social and criminal justice services they interact with in a complex ecosystem of care. We outline the challenges involved in developing the model, including process mapping and the collection and synthesis of data to support parametric estimates, and describe the controls built into the model to support analysis of potential changes to the system. We also describe the approach taken to calibrate the model to an observable level of system performance. Preliminary results from application of the simulation are provided to demonstrate how it can provide insights into potential improvements deriving from introduction of care coordination technology. PMID:26590977

  5. Promoting Conceptual Change for Complex Systems Understanding: Outcomes of an Agent-Based Participatory Simulation

    NASA Astrophysics Data System (ADS)

    Rates, Christopher A.; Mulvey, Bridget K.; Feldon, David F.

    2016-08-01

    Components of complex systems apply across multiple subject areas, and teaching these components may help students build unifying conceptual links. Students, however, often have difficulty learning these components, and limited research exists to understand what types of interventions may best help improve understanding. We investigated 32 high school students' understandings of complex systems components and whether an agent-based simulation could improve their understandings. Pretest and posttest essays were coded for changes in six components to determine whether students showed more expert thinking about the complex system of the Chesapeake Bay watershed. Results showed significant improvement for the components Emergence (r = .26, p = .03), Order (r = .37, p = .002), and Tradeoffs (r = .44, p = .001). Implications include that the experiential nature of the simulation has the potential to support conceptual change for some complex systems components, presenting a promising option for complex systems instruction.

  7. From Agents to Continuous Change via Aesthetics: Learning Mechanics with Visual Agent-Based Computational Modeling

    ERIC Educational Resources Information Center

    Sengupta, Pratim; Farris, Amy Voss; Wright, Mason

    2012-01-01

    Novice learners find motion as a continuous process of change challenging to understand. In this paper, we present a pedagogical approach based on agent-based, visual programming to address this issue. Integrating agent-based programming, in particular, Logo programming, with curricular science has been shown to be challenging in previous research…

  8. The Agent-based Approach: A New Direction for Computational Models of Development.

    ERIC Educational Resources Information Center

    Schlesinger, Matthew; Parisi, Domenico

    2001-01-01

    Introduces the concepts of online and offline sampling and highlights the role of online sampling in agent-based models of learning and development. Compares the strengths of each approach for modeling particular developmental phenomena and research questions. Describes a recent agent-based model of infant causal perception. Discusses limitations…

  9. Agent-Based Crowd Simulation Considering Emotion Contagion for Emergency Evacuation Problem

    NASA Astrophysics Data System (ADS)

    Faroqi, H.; Mesgari, M.-S.

    2015-12-01

    During emergencies, emotions greatly affect human behaviour. For more realistic multi-agent systems in simulations of emergency evacuations, it is important to incorporate emotions and their effects on the agents. In short, emotional contagion is a process in which a person or group influences the emotions or behavior of another person or group through the conscious or unconscious induction of emotion states and behavioral attitudes. In this study, we simulate an emergency situation in an open square area with three exits, considering Adult and Child agents with different behaviors. Security agents are also included to guide the Adults and Children to the exits and keep them calm. Six emotion levels are considered for each agent in different scenarios and situations. The agent-based model is initialized by randomly scattering the agent populations; when an alarm occurs, each agent reacts to the situation based on its own and its neighbors' current circumstances. The main goal of each agent is first to find the exit, and then to help other agents find their way. The numbers of exited and injured agents, along with their emotion levels, are compared across scenarios with different initializations in order to evaluate the simulated model. NetLogo 5.2 is used as the multi-agent simulation framework, with R as the development language.
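
    The contagion mechanism described here, in which an agent's emotion depends on its neighbors' states, can be sketched as a synchronous update on a bounded emotion scale. The linear averaging rule and the `susceptibility` weight are assumptions of this sketch, not the authors' NetLogo implementation.

```python
def contagion_step(levels, neighbors, susceptibility=0.5):
    """One synchronous emotion-contagion update. levels[i] is agent i's
    emotion on a 0..5 scale; neighbors[i] lists the indices of agents
    that influence agent i. Each agent moves part-way toward the mean
    emotion of its neighbors, clamped to the scale; isolated agents
    keep their current level."""
    updated = []
    for i, level in enumerate(levels):
        if neighbors[i]:
            mean_nbr = sum(levels[j] for j in neighbors[i]) / len(neighbors[i])
            level += susceptibility * (mean_nbr - level)
        updated.append(max(0.0, min(5.0, level)))
    return updated
```

    Iterating this step lets panic spread outward from highly agitated agents, while calm agents (such as the Security agents above, which could be given a fixed low level) damp their neighbors' emotion.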

  10. Changing crops in response to climate: virtual Nang Rong, Thailand in an agent based simulation

    PubMed Central

    Malanson, George P.; Verdery, Ashton M.; Walsh, Stephen J.; Sawangdee, Yothin; Heumann, Benjamin W.; McDaniel, Philip M.; Frizzelle, Brian G.; Williams, Nathalie E.; Yao, Xiaozheng; Entwisle, Barbara; Rindfuss, Ronald R.

    2014-01-01

    The effects of extended climatic variability on agricultural land use were explored for the type of system found in villages of northeastern Thailand. An agent-based model developed for the Nang Rong district was used to simulate land allotted to jasmine rice, heavy rice, cassava, and sugar cane. The land use choices in the model depended on likely economic outcomes, but included elements of bounded rationality in dependence on household demography. The socioeconomic dynamics are endogenous in the system, and climate changes were added as exogenous drivers. Villages changed their agricultural effort in many different ways. Most villages reduced the amount of land under cultivation, primarily with reduction in jasmine rice, but others did not. The variation in responses to climate change indicates potential sensitivity to initial conditions and path dependence for this type of system. The differences between our virtual villages and the real villages of the region indicate effects of bounded rationality and limits on model applications. PMID:25061240

  12. Prediction Markets and Beliefs about Climate: Results from Agent-Based Simulations

    NASA Astrophysics Data System (ADS)

    Gilligan, J. M.; John, N. J.; van der Linden, M.

    2015-12-01

    Climate scientists have long been frustrated by the persistent doubt a large portion of the public expresses toward the scientific consensus about anthropogenic global warming. The political and ideological polarization of this doubt led Vandenbergh, Raimi, and Gilligan [1] to propose that prediction markets for climate change might influence the opinions of those who mistrust the scientific community but do trust the power of markets. We have developed an agent-based simulation of a climate prediction market in which traders buy and sell futures contracts that will pay off at some future year with a value that depends on the global average temperature at that time. The traders form a heterogeneous population with different ideological positions, different beliefs about anthropogenic global warming, and different degrees of risk aversion. We also vary characteristics of the market, including the topology of social networks among the traders, the number of traders, and the completeness of the market. Traders adjust their beliefs about climate according to the gains and losses they and other traders in their social network experience. This model predicts that if global temperature is predominantly driven by greenhouse gas concentrations, prediction markets will cause traders' beliefs to converge toward correctly accepting anthropogenic warming as real. This convergence is largely independent of the structure of the market and the characteristics of the population of traders. However, it may take considerable time for beliefs to converge. Conversely, if temperature does not depend on greenhouse gases, the model predicts that traders' beliefs will not converge. We will discuss the policy relevance of these results and, more generally, the use of agent-based market simulations for policy analysis regarding climate change, seasonal agricultural weather forecasts, and other applications. [1] MP Vandenbergh, KT Raimi, & JM Gilligan. UCLA Law Rev. 61, 1962 (2014).
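
    The belief-adjustment mechanism in this abstract (traders updating beliefs from the gains and losses observed in their social network) might, under an imitate-the-most-profitable assumption that is ours rather than the authors', look like:

```python
def update_belief(own_belief, peer_beliefs, peer_payoffs, rate=0.1):
    """Nudge a trader's belief (subjective probability that warming is
    real) toward the belief held by the most profitable trader it can
    observe. The imitation rule and learning rate are illustrative,
    not taken from the paper."""
    # Pair each peer's payoff with its belief and pick the belief of
    # the highest-payoff peer; ties break on the belief value itself.
    best_belief = max(zip(peer_payoffs, peer_beliefs))[1]
    return own_belief + rate * (best_belief - own_belief)
```

    Under a rule like this, if warming-is-real contracts consistently pay off, the most profitable peers are those whose beliefs track the warming signal, so repeated updates pull the population's beliefs toward that signal, which is the convergence behavior the abstract reports.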

  13. An agent-based simulation model to study accountable care organizations.

    PubMed

    Liu, Pai; Wu, Shinyi

    2016-03-01

    Creating accountable care organizations (ACOs) has been widely discussed as a strategy to control rapidly rising healthcare costs and improve quality of care; however, building an effective ACO is a complex process involving multiple stakeholders (payers, providers, patients) with their own interests. Implementation of an ACO is also costly in terms of time and money, and an immature design could create safety hazards. Therefore, there is a need for analytical model-based decision-support tools that can predict the outcomes of different strategies to facilitate ACO design and implementation. In this study, an agent-based simulation model was developed to study ACOs that considers payers, healthcare providers, and patients as agents under the shared savings payment model of care for congestive heart failure (CHF), one of the most expensive causes of sometimes preventable hospitalizations. The agent-based simulation model has identified the critical determinants of payment model design that can motivate provider behavior changes to achieve maximum financial and quality outcomes of an ACO. The results show nonlinear provider behavior change patterns corresponding to changes in payment model designs. The outcomes vary by providers with different quality or financial priorities, and are most sensitive to the cost-effectiveness of CHF interventions that an ACO implements. This study demonstrates an increasingly important method to construct a healthcare system analytics model that can help inform health policy and healthcare management decisions. The study also points out that the likely success of an ACO is interdependent with payment model design, provider characteristics, and the cost and effectiveness of healthcare interventions. PMID:24715674

  14. Design of a Mobile Agent-Based Adaptive Communication Middleware for Federations of Critical Infrastructure Simulations

    NASA Astrophysics Data System (ADS)

    Görbil, Gökçe; Gelenbe, Erol

    The simulation of critical infrastructures (CI) can involve the use of diverse domain-specific simulators that run on geographically distant sites. These diverse simulators must then be coordinated to run concurrently in order to evaluate the performance of critical infrastructures which influence each other, especially in emergency or resource-critical situations. We therefore describe the design of an adaptive communication middleware that provides reliable and real-time one-to-one and group communications for federations of CI simulators over a wide-area network (WAN). The proposed middleware is composed of mobile agent-based peer-to-peer (P2P) overlays, called virtual networks (VNets), to enable resilient, adaptive and real-time communications over unreliable and dynamic physical networks (PNets). The autonomous software agents comprising the communication middleware monitor their performance and the underlying PNet, and dynamically adapt the P2P overlay and migrate over the PNet in order to optimize communications according to the requirements of the federation and the current conditions of the PNet. Reliable communication is provided via redundancy within the communication middleware and intelligent migration of agents over the PNet. The proposed middleware integrates security methods in order to protect the communication infrastructure against attacks and provide privacy and anonymity to the participants of the federation. Experiments with an initial version of the communication middleware over a real-life networking testbed show that promising improvements can be obtained for unicast and group communications via the agent migration capability of our middleware.

  15. Investigating the role of water in the Diffusion of Cholera using Agent-Based simulation

    NASA Astrophysics Data System (ADS)

    Augustijn, Ellen-Wien; Doldersum, Tom; Augustijn, Denie

    2014-05-01

    Traditionally, cholera was considered to be a waterborne disease. Currently we know that many other factors can contribute to the spread of this disease, including human mobility and human behavior. However, the hydrological component in cholera diffusion is significant. The interplay between cholera and water includes bacteria (V. cholerae) that survive in the aquatic environment, the possibility that run-off water from dumpsites carries the bacteria to surface water (rivers and lakes), and the fact that once the bacteria reach streams they can be carried downstream to infect new locations. Modelling is a very important tool to build theory on the interplay between the different types of transmission mechanisms that together are responsible for the spread of cholera. Agent-based simulation models are very suitable to incorporate behavior at the individual level and to reproduce emergence. However, it is more difficult to incorporate the hydrological components in this type of model. In this research we present the hydrological component of an agent-based cholera model developed to study a cholera epidemic in Kumasi (Ghana) in 2005. The model was calibrated on the relative contribution of each community to the distributed pattern of cholera rather than the absolute number of cases. Analysis of the results shows that water plays an important role in the diffusion of cholera: 75% of the cholera cases were infected via river water that was contaminated by runoff from the dumpsites. To initiate infections upstream, the probability of environment-to-human transmission seemed to be overestimated compared to what may be expected from literature. Scenario analyses show that there is a strong relation between the epidemic curve and the rainfall. Removing dumpsites that are situated close to the river resulted in a strong decrease in the number of cholera cases. Results are sensitive to the scheduling of the daily activities and the survival time of the cholera bacteria.
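
A toy sketch of the hydrological mechanism described above: bacteria enter the river from dumpsite runoff, advect downstream segment by segment, and decay, with an assumed dose-response rule for environment-to-human transmission. All rates and the segment structure are illustrative assumptions:

```python
def step_river(concentrations, runoff, decay=0.2):
    """Advect V. cholerae one river segment downstream per time step,
    add dumpsite runoff, and apply die-off. Rates are illustrative."""
    moved = [0.0] + concentrations[:-1]  # water carries bacteria downstream
    return [(c + r) * (1 - decay) for c, r in zip(moved, runoff)]

def infection_probability(concentration, beta=0.01):
    """Environment-to-human transmission as a simple dose-response curve."""
    return 1 - (1 - beta) ** concentration

conc = [0.0, 0.0, 0.0, 0.0]    # four river segments, upstream first
runoff = [5.0, 0.0, 0.0, 0.0]  # a dumpsite drains into the top segment
for _ in range(3):
    conc = step_river(conc, runoff)
risk = infection_probability(conc[2])  # downstream segments become infectious
```

This captures the key qualitative result: contamination introduced upstream reaches and infects communities downstream after a lag set by flow and bacterial survival time.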

  16. Real-Time Agent-Based Modeling Simulation with in-situ Visualization of Complex Biological Systems

    PubMed Central

    Seekhao, Nuttiiya; Shung, Caroline; JaJa, Joseph; Mongeau, Luc; Li-Jessen, Nicole Y. K.

    2016-01-01

    We present an efficient and scalable scheme for implementing agent-based modeling (ABM) simulation with In Situ visualization of large complex systems on heterogeneous computing platforms. The scheme is designed to make optimal use of the resources available on a heterogeneous platform consisting of a multicore CPU and a GPU, resulting in minimal to no resource idle time. Furthermore, the scheme was implemented under a client-server paradigm that enables remote users to visualize and analyze simulation data as it is being generated at each time step of the model. Performance of a simulation case study of vocal fold inflammation and wound healing with 3.8 million agents shows 35× and 7× speedup in execution time over single-core and multi-core CPU respectively. Each iteration of the model took less than 200 ms to simulate, visualize and send the results to the client. This enables users to monitor the simulation in real-time and modify its course as needed. PMID:27547508

  17. Agent-Based Spatiotemporal Simulation of Biomolecular Systems within the Open Source MASON Framework

    PubMed Central

    Pérez-Rodríguez, Gael; Pérez-Pérez, Martín; Glez-Peña, Daniel; Azevedo, Nuno F.; Lourenço, Anália

    2015-01-01

    Agent-based modelling is being used to represent biological systems with increasing frequency and success. This paper presents the implementation of a new tool for biomolecular reaction modelling in the open source Multiagent Simulator of Neighborhoods framework. The rationale behind this new tool is the necessity to describe interactions at the molecular level to be able to grasp emergent and meaningful biological behaviour. We are particularly interested in characterising and quantifying the various effects that facilitate biocatalysis. Enzymes may display high specificity for their substrates and this information is crucial to the engineering and optimisation of bioprocesses. Simulation results demonstrate that molecule distributions, reaction rate parameters, and structural parameters can be adjusted separately in the simulation, allowing a comprehensive study of individual effects in the context of realistic cell environments. While a higher percentage of collisions that result in reaction reflects a higher affinity of the enzyme for its substrate, a faster reaction (i.e., a higher turnover number) completes in a smaller number of time steps. Slower diffusion rates and molecular crowding (physical hurdles) decrease the collision rate of reactants, hence reducing the reaction rate, as expected. Also, the random distribution of molecules affects the results significantly. PMID:25874228
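
The collision-reaction mechanism can be illustrated with a toy stochastic simulation in which the per-collision reaction probability stands in for enzyme-substrate affinity. The parameters and structure are assumptions for illustration, not MASON code:

```python
import random

def simulate_reactions(n_substrate, p_react, n_steps,
                       collision_rate=0.3, seed=1):
    """Count product formed when each free substrate molecule collides with
    an enzyme at a fixed per-step rate and each collision reacts with
    probability p_react (a stand-in for affinity). Illustrative only."""
    rng = random.Random(seed)
    product = 0
    free = n_substrate
    for _ in range(n_steps):
        for _ in range(free):
            # a molecule must first collide, then the collision must react
            if rng.random() < collision_rate and rng.random() < p_react:
                product += 1
        free = n_substrate - product
        if free <= 0:
            break
    return product

low = simulate_reactions(200, 0.05, 50)   # low-affinity enzyme
high = simulate_reactions(200, 0.50, 50)  # high-affinity enzyme
```

As the abstract states, raising the fraction of collisions that react (affinity) increases throughput, while crowding or slower diffusion would act by lowering `collision_rate`.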

  18. Evaluation of wholesale electric power market rules and financial risk management by agent-based simulations

    NASA Astrophysics Data System (ADS)

    Yu, Nanpeng

    As U.S. regional electricity markets continue to refine their market structures, designs and rules of operation in various ways, two critical issues are emerging. First, although much experience has been gained and costly and valuable lessons have been learned, there is still a lack of a systematic platform for evaluation of the impact of a new market design from both engineering and economic points of view. Second, the transition from a monopoly paradigm characterized by a guaranteed rate of return to a competitive market created various unfamiliar financial risks for various market participants, especially for the Investor Owned Utilities (IOUs) and Independent Power Producers (IPPs). This dissertation uses agent-based simulation methods to tackle the market rules evaluation and financial risk management problems. The California energy crisis in 2000-01 showed what could happen to an electricity market if it did not undergo comprehensive and rigorous testing before implementation. Due to the complexity of the market structure, strategic interaction between the participants, and the underlying physics, it is difficult to fully evaluate the implications of potential changes to market rules. This dissertation presents a flexible and integrative method to assess market designs through agent-based simulations. Realistic simulation scenarios on a 225-bus system are constructed for evaluation of the proposed PJM-like market power mitigation rules of the California electricity market. Simulation results show that in the absence of market power mitigation, generation company (GenCo) agents facilitated by Q-learning are able to exploit the market flaws and make significantly higher profits relative to the competitive benchmark. The incorporation of PJM-like local market power mitigation rules is shown to be effective in suppressing the exercise of market power. The importance of financial risk management is exemplified by the recent financial crisis. In this
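
The Q-learning that facilitates the GenCo agents can be sketched in a minimal stateless form: the agent learns which bid markup maximizes profit. The markup actions and cleared quantities below are invented for illustration, not taken from the 225-bus simulations:

```python
import random

def train_genco(n_rounds=2000, seed=0):
    """Minimal Q-learning sketch of a generation company choosing a bid
    markup above marginal cost; a higher markup clears less quantity.
    Payoffs are illustrative, not the dissertation's market model."""
    rng = random.Random(seed)
    markups = [0.0, 10.0, 30.0]                      # candidate $/MWh markups
    quantity = {0.0: 100.0, 10.0: 90.0, 30.0: 40.0}  # cleared MW per markup
    q = {m: 0.0 for m in markups}
    alpha, epsilon = 0.1, 0.1
    for _ in range(n_rounds):
        # epsilon-greedy action selection
        m = rng.choice(markups) if rng.random() < epsilon else max(q, key=q.get)
        reward = m * quantity[m]        # profit above marginal cost
        q[m] += alpha * (reward - q[m]) # stateless Q-value update
    return max(q, key=q.get)

best_markup = train_genco()
```

Under these illustrative payoffs the learned strategy is the high markup: absent mitigation, withholding-style bidding is more profitable than competitive bidding, which is the flaw the PJM-like mitigation rules are designed to suppress.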

  19. Simulation of avascular tumor growth by agent-based game model involving phenotype-phenotype interactions.

    PubMed

    Chen, Yong; Wang, Hengtong; Zhang, Jiangang; Chen, Ke; Li, Yumin

    2015-01-01

    All tumors, both benign and metastatic, undergo an avascular growth stage with nutrients supplied by the surrounding tissue. This avascular growth process is much easier to carry out in more qualitative and quantitative experiments starting from tumor spheroids in vitro with reliable reproducibility. Essentially, this tumor progression would be described as a sequence of phenotypes. Using agent-based simulation in a two-dimensional spatial lattice, we constructed a composite growth model in which the phenotypic behavior of tumor cells depends on not only the local nutrient concentration and cell count but also the game among cells. Our simulation results demonstrated that in silico tumors are qualitatively similar to those observed in tumor spheroid experiments. We also found that the payoffs in the game between two living cell phenotypes can influence the growth velocity and surface roughness of tumors at the same time. Finally, this current model is flexible and can be easily extended to discuss other situations, such as environmental heterogeneity and mutation. PMID:26648395
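
A hedged sketch of the "game among cells" element: each cell scores its phenotype against its living neighbors through a payoff matrix, and that score would feed into proliferation decisions alongside nutrient and cell-count checks. The phenotype labels and matrix values are assumptions for illustration:

```python
def phenotype_payoff(neighbors, me, payoff_matrix):
    """Average game payoff of a cell's phenotype against its living
    lattice neighbors."""
    if not neighbors:
        return 0.0
    return sum(payoff_matrix[me][n] for n in neighbors) / len(neighbors)

# Illustrative 2-phenotype payoff matrix (values are assumptions):
# 'P' = proliferative, 'Q' = quiescent.
payoffs = {'P': {'P': 0.4, 'Q': 0.8},
           'Q': {'P': 0.6, 'Q': 0.5}}

# A proliferative cell surrounded mostly by quiescent cells scores well:
score = phenotype_payoff(['Q', 'Q', 'P'], 'P', payoffs)
```

Varying the off-diagonal payoffs is what lets the model tune growth velocity and surface roughness simultaneously, as the abstract reports.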

  20. Simulation of avascular tumor growth by agent-based game model involving phenotype-phenotype interactions

    PubMed Central

    Chen, Yong; Wang, Hengtong; Zhang, Jiangang; Chen, Ke; Li, Yumin

    2015-01-01

    All tumors, both benign and metastatic, undergo an avascular growth stage with nutrients supplied by the surrounding tissue. This avascular growth process is much easier to carry out in more qualitative and quantitative experiments starting from tumor spheroids in vitro with reliable reproducibility. Essentially, this tumor progression would be described as a sequence of phenotypes. Using agent-based simulation in a two-dimensional spatial lattice, we constructed a composite growth model in which the phenotypic behavior of tumor cells depends on not only the local nutrient concentration and cell count but also the game among cells. Our simulation results demonstrated that in silico tumors are qualitatively similar to those observed in tumor spheroid experiments. We also found that the payoffs in the game between two living cell phenotypes can influence the growth velocity and surface roughness of tumors at the same time. Finally, this current model is flexible and can be easily extended to discuss other situations, such as environmental heterogeneity and mutation. PMID:26648395

  1. An agent-based simulation of extirpation of Ceratitis capitata applied to invasions in California.

    PubMed

    Manoukis, Nicholas C; Hoffman, Kevin

    2014-01-01

    We present an agent-based simulation (ABS) of Ceratitis capitata ("Medfly") developed for estimating the time to extirpation of this pest in areas where quarantines and eradication treatments were immediately imposed. We use the ABS, implemented in the program MED-FOES, to study seven different outbreaks that occurred in Southern California from 2008 to 2010. Results are compared with the length of intervention and quarantine imposed by the State, based on a linear developmental model (thermal unit accumulation, or "degree-day"). MED-FOES is a useful tool for invasive species managers as it incorporates more information from the known biology of the Medfly, and includes the important feature of being demographically explicit, providing significant improvements over simple degree-day calculations. While there was general agreement between the length of quarantine by degree-day and the time to extirpation indicated by MED-FOES, the ABS suggests that the margin of safety varies among cases and that in two cases the quarantine may have been excessively long. We also examined changes in the number of individuals over time in MED-FOES and conducted a sensitivity analysis for one of the outbreaks to explore the role of various input parameters on simulation outcomes. While our implementation of the ABS in this work is motivated by C. capitata and takes extirpation as a postulate, the simulation is very flexible and can be used to study a variety of questions on the invasion biology of pest insects and methods proposed to manage or eradicate such species. PMID:24563646
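
For comparison, the degree-day calculation underlying the State's linear developmental model can be sketched as a simple-average accumulation above a lower developmental threshold; the threshold value shown is illustrative, not taken from the paper:

```python
def degree_days(daily_min, daily_max, lower_threshold=9.7):
    """Accumulate thermal units: each day contributes the amount by which
    the mean temperature (degC) exceeds the developmental threshold.
    The 9.7 degC threshold is an illustrative value for C. capitata."""
    total = 0.0
    for lo, hi in zip(daily_min, daily_max):
        mean = (lo + hi) / 2
        total += max(0.0, mean - lower_threshold)  # cold days add nothing
    return total

# Two warm days accumulate development; a cold day adds nothing:
dd = degree_days([15, 16, 8], [25, 27, 11])
```

A quarantine length based only on this accumulation ignores population demography, which is exactly the information the demographically explicit MED-FOES simulation adds.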

  2. A spatial agent-based model for the simulation of adults' daily walking within a city.

    PubMed

    Yang, Yong; Diez Roux, Ana V; Auchincloss, Amy H; Rodriguez, Daniel A; Brown, Daniel G

    2011-03-01

    Environmental effects on walking behavior have received attention in recent years because of the potential for policy interventions to increase population levels of walking. Most epidemiologic studies describe associations of walking behavior with environmental features. These analyses ignore the dynamic processes that shape walking behaviors. A spatial agent-based model (ABM) was developed to simulate people's walking behaviors within a city. Each individual was assigned properties such as age, SES, walking ability, attitude toward walking and a home location. Individuals perform different activities on a regular basis such as traveling for work, for basic needs, and for leisure. Whether an individual walks and the amount she or he walks is a function of distance to different activities and her/his walking ability and attitude toward walking. An individual's attitude toward walking evolves over time as a function of past experiences, walking of others along the walking route, limits on distances walked per day, and attitudes toward walking of the other individuals within her/his social network. The model was calibrated and used to examine the contributions of land use and safety to socioeconomic differences in walking. With further refinement and validation, ABMs may help to better understand the determinants of walking and identify the most promising interventions to increase walking. PMID:21335269
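
The attitude-evolution rule described above might be sketched as a weighted blend of personal walking experience and social-network influence; the weights and adjustment speed are illustrative assumptions, not the paper's calibrated values:

```python
def update_attitude(attitude, experiences, network_attitudes,
                    w_exp=0.6, w_soc=0.4, rate=0.2):
    """Move an individual's attitude toward walking (0-1) toward a target
    blending own past experiences and the mean attitude of the social
    network. Weights and rate are illustrative."""
    own = sum(experiences) / len(experiences)
    social = sum(network_attitudes) / len(network_attitudes)
    target = w_exp * own + w_soc * social
    return attitude + rate * (target - attitude)

# A reluctant walker with good recent experiences and a pro-walking network:
a = 0.3
for _ in range(30):
    a = update_attitude(a, [0.9, 0.8], [0.7, 0.6])
```

This feedback between experience, attitude, and behavior is the dynamic process the authors argue cross-sectional association studies miss.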

  3. A Spatial Agent-Based Model for the Simulation of Adults’ Daily Walking Within a City

    PubMed Central

    Yang, Yong; Roux, Ana V. Diez; Auchincloss, Amy H.; Rodriguez, Daniel A.; Brown, Daniel G.

    2012-01-01

    Environmental effects on walking behavior have received attention in recent years because of the potential for policy interventions to increase population levels of walking. Most epidemiologic studies describe associations of walking behavior with environmental features. These analyses ignore the dynamic processes that shape walking behaviors. A spatial agent-based model (ABM) was developed to simulate people’s walking behaviors within a city. Each individual was assigned properties such as age, SES, walking ability, attitude toward walking and a home location. Individuals perform different activities on a regular basis such as traveling for work, for shopping, and for recreation. Whether an individual walks and the amount she or he walks is a function of distance to different activities and her or his walking ability and attitude toward walking. An individual’s attitude toward walking evolves over time as a function of past experiences, walking of others along the walking route, limits on distances walked per day, and attitudes toward walking of the other individuals within her/his social network. The model was calibrated and used to examine the contributions of land use and safety to socioeconomic differences in walking. With further refinement and validation, ABMs may help to better understand the determinants of walking and identify the most promising interventions to increase walking. PMID:21335269

  4. An Agent-based Simulation Model for C. difficile Infection Control

    PubMed Central

    Codella, James; Safdar, Nasia; Heffernan, Rick; Alagoz, Oguzhan

    2014-01-01

    Background. Control of C. difficile infection (CDI) is an increasingly difficult problem for healthcare institutions. There are commonly recommended strategies to combat CDI transmission such as oral vancomycin for CDI treatment, increased hand hygiene with soap and water for healthcare workers, daily environmental disinfection of infected patient rooms, and contact isolation of diseased patients. However, the efficacy of these strategies, particularly for endemic CDI, has not been well studied. The objective of this research is to develop a valid agent-based simulation model (ABM) to study C. difficile transmission and control in a mid-sized hospital. Methods. We develop an ABM of a mid-sized hospital with agents such as patients, healthcare workers, and visitors. We model the natural progression of CDI in a patient using a Markov chain and the transmission of CDI through agent and environmental interactions. We derive input parameters from aggregate patient data from the 2007-2010 Wisconsin Hospital Association and published medical literature. We define a calibration process, which we use to estimate transition probabilities of the Markov model by comparing simulation results to benchmark values found in published literature. Results. Comparing CDI control strategies implemented individually, routine bleach disinfection of CDI+ patient rooms provides the largest reduction in nosocomial asymptomatic colonizations (21.8%) and nosocomial CDIs (42.8%). Additionally, vancomycin treatment provides the largest reduction in relapse CDIs (41.9%), CDI-related mortalities (68.5%), and total patient LOS (21.6%). Conclusion. We develop a generalized ABM for CDI control that can be customized and further expanded to specific institutions and/or scenarios. Additionally, we estimate transition probabilities for a Markov model of natural CDI progression in a patient through calibration. PMID:25112595
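
The Markov-chain progression of CDI within a patient can be sketched as below; the states follow the abstract's colonization/disease framing, but the transition probabilities are illustrative placeholders, not the study's calibrated values:

```python
import random

# Illustrative (not calibrated) daily transition probabilities between
# C. difficile states for a hospitalized patient.
STATES = ["susceptible", "colonized", "diseased", "recovered"]
P = {
    "susceptible": {"susceptible": 0.95, "colonized": 0.05},
    "colonized":   {"colonized": 0.80, "diseased": 0.15, "susceptible": 0.05},
    "diseased":    {"diseased": 0.70, "recovered": 0.25, "colonized": 0.05},
    "recovered":   {"recovered": 1.0},
}

def step(state, rng):
    """Sample the next CDI state from the current state's transition row."""
    r, cum = rng.random(), 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return state  # guard against floating-point rounding in the row sum

rng = random.Random(42)
trajectory = ["susceptible"]
for _ in range(60):  # a 60-day stay
    trajectory.append(step(trajectory[-1], rng))
```

Calibration in the paper amounts to tuning entries of `P` until simulated colonization and CDI rates match published benchmarks; control strategies then act by modifying transmission into the `colonized` state.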

  5. Agent-based evacuation simulation for spatial allocation assessment of urban shelters

    NASA Astrophysics Data System (ADS)

    Yu, Jia; Wen, Jiahong; Jiang, Yong

    2015-12-01

    The construction of urban shelters is one of the most important tasks in urban planning and disaster prevention. Spatial allocation assessment is a fundamental pre-step for the spatial location-allocation of urban shelters. This paper introduces a new method that uses agent-based technology to implement evacuation simulation and thereby conduct dynamic spatial allocation assessment of urban shelters. The method can not only accomplish traditional geospatial evaluation of urban shelters, but also simulate the evacuation process of residents to shelters. The advantages of this method lie in three aspects: (1) the evacuation time of each citizen from a residential building to the shelter can be estimated more reasonably; (2) the total evacuation time of all the residents in a region can be obtained; (3) road congestion during evacuation can be detected, so that precautionary measures can be taken to prevent potential risks. In this study, three types of agents are designed: shelter agents, government agents and resident agents. Shelter agents select specified land uses as shelter candidates for different disasters. Government agents delimit the service area of each shelter, in other words, regulate which shelter a person should use, in accordance with the administrative boundaries and the road distance between the person's position and the location of the shelter. Resident agents have a series of attributes, such as age, position, walking speed, and so on. They also have several behaviors, such as reducing speed when walking in a crowd, helping old people and children, and so on. Integrating these three types of agents, which are correlated with each other, evacuation procedures can be simulated and dynamic allocation assessment of shelters achieved. A case study in Jing'an District, Shanghai, China, was conducted to demonstrate the feasibility of the method. A scenario of earthquake disaster which occurs in nighttime

  6. An extensible simulation environment and movement metrics for testing walking behavior in agent-based models

    SciTech Connect

    Paul M. Torrens; Atsushi Nara; Xun Li; Haojie Zhu; William A. Griffin; Scott B. Brown

    2012-01-01

    Human movement is a significant ingredient of many social, environmental, and technical systems, yet the importance of movement is often discounted in considering systems complexity. Movement is commonly abstracted in agent-based modeling (which is perhaps the methodological vehicle for modeling complex systems), despite the influence of movement upon information exchange and adaptation in a system. In particular, agent-based models of urban pedestrians often treat movement in proxy form at the expense of faithfully treating movement behavior with realistic agency. There exists little consensus about which method is appropriate for representing movement in agent-based schemes. In this paper, we examine popularly used methods to drive movement in agent-based models, first by introducing a methodology that can flexibly handle many representations of movement at many different scales, and second by introducing a suite of tools to benchmark agent movement between models and against real-world trajectory data. We find that most popular movement schemes do a relatively poor job of representing movement, but that some schemes may well be 'good enough' for some applications. We also discuss potential avenues for improving the representation of movement in agent-based frameworks.

  7. Age-correlated stress resistance improves fitness of yeast: support from agent-based simulations

    PubMed Central

    2014-01-01

    Background Resistance to stress is often heterogeneous among individuals within a population, which helps protect against intermittent stress (bet hedging). This is also the case for heat shock resistance in the budding yeast Saccharomyces cerevisiae. Interestingly, the resistance appears to be continuously distributed (vs. binary, switch-like) and correlated with replicative age (vs. random). Older, slower-growing cells are more resistant than younger, faster-growing ones. Is there a fitness benefit to age-correlated stress resistance? Results Here this hypothesis is explored using a simple agent-based model, which simulates a population of individual cells that grow and replicate. Cells age by accumulating damage, which lowers their growth rate. They synthesize trehalose at a metabolic cost, which helps protect against heat shock. Proteins Tsl1 and Tps3 (trehalose synthase complex regulatory subunit TSL1 and TPS3) represent the trehalose synthesis complex and they are expressed using constant, age-dependent and stochastic terms. The model was constrained by calibration and comparison to data from the literature, including individual-based observations obtained using high-throughput microscopy and flow cytometry. A heterogeneity network was developed, which highlights the predominant sources and pathways of resistance heterogeneity. To determine the best trehalose synthesis strategy, model strains with different Tsl1/Tps3 expression parameters were placed in competition in an environment with intermittent heat shocks. Conclusions For high severities and low frequencies of heat shock, the winning strain used an age-dependent bet hedging strategy, which shows that there can be a benefit to age-correlated stress resistance. The study also illustrates the utility of combining individual-based observations and modeling to understand mechanisms underlying population heterogeneity, and the effect on fitness. PMID:24529069
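
A toy version of the age-correlated protection mechanism: trehalose investment grows with replicative age, so under a severe shock only older, more protected (and slower-growing) cells survive. Both functions and all constants are assumptions for illustration, not the calibrated model:

```python
def trehalose_level(age, basal=0.2, age_coeff=0.08):
    """Age-dependent trehalose synthesis: older cells invest more in
    protection (at a metabolic cost to growth). Illustrative values."""
    return basal + age_coeff * age

def survives_heat_shock(trehalose, severity):
    """Toy deterministic rule: survive if protection meets shock severity."""
    return trehalose >= severity

# Under a severe shock, only the older part of the age distribution survives:
ages = range(10)
survivors = [a for a in ages if survives_heat_shock(trehalose_level(a), 0.6)]
```

The bet-hedging result arises because the young, unprotected cells grow faster between shocks while the old, protected cells carry the lineage through shocks; which strategy wins depends on shock severity and frequency, as the abstract concludes.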

  8. Proposal of Classification Method of Time Series Data in International Emissions Trading Market Using Agent-based Simulation

    NASA Astrophysics Data System (ADS)

    Nakada, Tomohiro; Takadama, Keiki; Watanabe, Shigeyoshi

    This paper proposes a classification method based on Bayesian analysis for time series data from an international emissions trading market generated by agent-based simulation, and compares it with a discrete Fourier transform analysis. The purpose is to demonstrate analytical methods for mapping time series data such as market prices. These analyses revealed the following results: (1) the classification methods express distances between mappings of the time series data, which are easier to understand and draw inferences from than the raw time series; (2) the methods can analyze uncertain time series data, including both stationary and non-stationary processes, using distances obtained via agent-based simulation; and (3) the Bayesian method can distinguish differences as small as 1% in the agents' emission reduction targets.

  9. Investigation of Simulated Trading — A multi agent based trading system for optimization purposes

    NASA Astrophysics Data System (ADS)

    Schneider, Johannes J.

    2010-07-01

    Some years ago, Bachem, Hochstättler, and Malich proposed a heuristic algorithm called Simulated Trading for the optimization of vehicle routing problems. Computational agents place buy-orders and sell-orders for customers to be handled at a virtual financial market, the prices of the orders depending on the cost of inserting the customer into the tour or of removing him. According to a proposed rule set, the financial market creates a buy-and-sell graph for the various orders in the order book, intending to optimize the overall system. Here I present a thorough investigation of the application of this algorithm to the traveling salesman problem.
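
In Simulated Trading, the price of a buy-order reflects the cost of inserting a customer into a tour. For the traveling salesman case, that price can be sketched as the cheapest-insertion distance (the coordinates below are arbitrary illustrations):

```python
import math

def insertion_cost(tour, cities, k):
    """Cheapest extra distance to insert city k into a closed tour; this is
    the 'price' a tour agent would quote for a buy-order in Simulated
    Trading. cities maps an id to (x, y) coordinates."""
    d = lambda a, b: math.dist(cities[a], cities[b])
    best = float("inf")
    for i in range(len(tour)):
        a, b = tour[i], tour[(i + 1) % len(tour)]
        # extra length if k is spliced between consecutive cities a and b
        best = min(best, d(a, k) + d(k, b) - d(a, b))
    return best

cities = {0: (0, 0), 1: (4, 0), 2: (4, 3), 3: (2, 0.5)}
cost = insertion_cost([0, 1, 2], cities, 3)
```

A sell-order would be priced symmetrically as the distance saved by removing the customer; the market then matches orders whose combined price change improves the overall tour length.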

  10. Simulating Land-Use Change using an Agent-Based Land Transaction Model

    NASA Astrophysics Data System (ADS)

    Bakker, M. M.; van Dijk, J.; Alam, S. J.

    2013-12-01

    In the densely populated cultural landscapes of Europe, the vast majority of all land is owned by private parties, be it farmers (the majority), nature organizations, property developers, or citizens. Consequently, the vast majority of all land-use change arises from land transactions between different owner types: successful farms expand at the expense of less successful farms, and meanwhile property developers, individual citizens, and nature organizations also actively purchase land. These land transactions are driven by specific properties of the land, by governmental policies, and by the (economic) motives of both buyers and sellers. Climate/global change can affect these drivers at various scales: at the local scale changes in hydrology can make certain land less or more desirable; at the global scale the agricultural markets will affect motives of farmers to buy or sell land; while at intermediate (e.g. provincial) scales property developers and nature conservationists may be encouraged or discouraged to purchase land. The cumulative result of all these transactions becomes manifest in changing land-use patterns, and consequent environmental responses. Within the project Climate Adaptation for Rural Areas, an agent-based land-use model was developed that explores the future response of individual land users to climate change, within the context of wider global change (i.e. policy and market change). It simulates the exchange of land among farmers and between farmers and nature organizations and property developers, for a specific case study area in the east of the Netherlands. Results show that local impacts of climate change can result in a relative stagnation in the land market in waterlogged areas. Furthermore, the increase in dairying at the expense of arable cultivation - as has been observed in the area in the past - is slowing down as arable produce shows a favourable trend in the agricultural world market. Furthermore, budgets for nature managers are

  11. Impact of Different Policies on Unhealthy Dietary Behaviors in an Urban Adult Population: An Agent-Based Simulation Model

    PubMed Central

    Giabbanelli, Philippe J.; Arah, Onyebuchi A.; Zimmerman, Frederick J.

    2014-01-01

    Objectives. Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. Methods. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Results. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Conclusions. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems. PMID:24832414
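
The reported tax effect can be reproduced, in order of magnitude, with a simple constant-elasticity sketch; the elasticity value and baseline consumption share are assumptions chosen only to illustrate how a 20% tax maps to roughly a 3-percentage-point drop:

```python
def consumption_drop(base_share, tax_pct, elasticity=-0.75):
    """Percentage-point change in fast-food consumption probability for a
    given tax, using a constant own-price elasticity. The elasticity and
    baseline share are illustrative assumptions, not the model's values."""
    relative_change = elasticity * tax_pct / 100
    return base_share * relative_change

# A 20% tax on a 20% baseline fast-food consumption probability:
delta = consumption_drop(0.20, 20)
```

In the agent-based model this price response competes with the social-norm channel, which the simulations found to be roughly twice as effective per unit of intervention.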

  12. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    SciTech Connect

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of diabetes type 2 was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. This model hierarchy, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. Moreover, the four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, obesity factor and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translate complex system dynamics models into agent-based model alternatives that are both conceptually simpler and capable of capturing main effects of complex local agent-agent interactions.
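
The reduced Theory of Planned Behavior used in these agent-based models can be sketched as a weighted blend of an agent's own attitude and the social norm observed on its network; the weighting is an illustrative assumption, not the estimated parameterization:

```python
def behavior_intention(attitude, neighbor_behaviors, w_attitude=0.6):
    """Theory of Planned Behavior, reduced as in the abstract: intention to
    adopt a health behavior is a joint function of individual attitude
    (0-1) and the norm diffusing over the agent's social network.
    The weight is illustrative."""
    norm = sum(neighbor_behaviors) / len(neighbor_behaviors)
    return w_attitude * attitude + (1 - w_attitude) * norm

# An agent with a weak personal attitude but a mostly health-conscious
# network (1 = neighbor practices the behavior, 0 = does not):
intent = behavior_intention(0.3, [1, 1, 0, 1])
```

Thresholding this intention per agent and per time step is what replaces the aggregated behavioral stocks of the original system dynamics model.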

  13. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE PAGES Beta

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to the progression of type 2 diabetes was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. In this model hierarchy, the system dynamics model, which mimics diabetes progression over an aggregated U.S. population, was disaggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. The four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating the impacts of an elderliness factor, an obesity factor, and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and the diffusion of social norms spreading over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time-series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performance of the baseline agent-based model and its extensions illustrates a promising approach for translating complex system dynamics models into agent-based alternatives that are both conceptually simpler and capable of capturing the main effects of complex local agent-agent interactions.

  14. Multi-Agent Based Simulation of Optimal Urban Land Use Allocation in the Middle Reaches of the Yangtze River, China

    NASA Astrophysics Data System (ADS)

    Zeng, Y.; Huang, W.; Jin, W.; Li, S.

    2016-06-01

    The optimization of land-use allocation is an important approach to achieving regional sustainable development. This study selects the Chang-Zhu-Tan agglomeration as the study area and proposes a new land-use optimization allocation model. Using a multi-agent based simulation model, future urban land-use allocation was simulated for 2020 and 2030 under three different scenarios. This kind of quantitative information about urban land-use allocation and future urban expansion would be of great interest for urban planning, water and land resource management, and climate change research.

  15. Evaluating environmental strategies in a textile printing and dyeing enterprise by an agent-based simulation model

    NASA Astrophysics Data System (ADS)

    Gao, Lei; Ding, Yongsheng; Li, Fang

    2013-05-01

    To improve the capabilities of saving energy and reducing pollutant emission of textile printing and dyeing (PD) industry, this article presents a novel agent-based simulation model for assessing the impacts of environmental strategies on a PD enterprise. Two typical PD enterprises in China are simulated with different modelling granularities: one is at a module level, while the other is at an enterprise level. The module-level simulation model depicts detailed production processes in a PD enterprise and evaluates five candidate strategies on their capabilities of improving energy usage and waste emission. The enterprise-level simulation model views a PD enterprise as an agent and assesses three tax strategies for waste discharge. The simulation results show that the proposed general model could be a valuable tool to explore potential solutions to saving energy and reducing waste emission in PD enterprises, after being calibrated to a real case.

  16. Using the Integration of Discrete Event and Agent-Based Simulation to Enhance Outpatient Service Quality in an Orthopedic Department.

    PubMed

    Kittipittayakorn, Cholada; Ying, Kuo-Ching

    2016-01-01

    Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department. PMID:27195606
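
    The discrete-event side of the mixed walk-in/scheduled registration problem can be sketched as a single-server queue. This is only a toy illustration of a DES waiting-time calculation, not the orthopedic department model from the study:

```python
def simulate_clinic(arrivals, service_time=10):
    """Minimal discrete-event sketch of a single-server clinic queue.

    `arrivals` is a list of (arrival_minute, kind) pairs, where kind is
    'scheduled' or 'walk-in'; everyone is served FIFO with a fixed
    (assumed) service time. Returns the average waiting time in minutes.
    """
    waits = []
    server_free_at = 0
    for arrival, _kind in sorted(arrivals):
        start = max(arrival, server_free_at)   # wait if the server is busy
        waits.append(start - arrival)
        server_free_at = start + service_time
    return sum(waits) / len(waits)
```

    In a full DES+ABS integration, the agent-based layer would replace the fixed arrival list with behavior-driven arrivals and no-shows.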

  17. Linking Bayesian and Agent-Based Models to Simulate Complex Social-Ecological Systems in the Sonoran Desert

    NASA Astrophysics Data System (ADS)

    Pope, A.; Gimblett, R.

    2013-12-01

    Interdependencies of ecologic, hydrologic, and social systems challenge traditional approaches to natural resource management in semi-arid regions. As a complex social-ecological system, water demands in the Sonoran Desert from agricultural and urban users often conflict with water needs for its ecologically-significant riparian corridors. To explore this system, we developed an agent-based model to simulate complex feedbacks between human decisions and environmental conditions. Cognitive mapping in conjunction with stakeholder participation produced a Bayesian model of conditional probabilities of local human decision-making processes resulting in changes in water demand. Probabilities created in the Bayesian model were incorporated into the agent-based model, so that each agent had a unique probability of making a positive decision based on its perceived environment at each point in time and space. By using a Bayesian approach, uncertainty in the human decision-making process could be incorporated. The spatially-explicit agent-based model simulated changes in depth-to-groundwater caused by well pumping based on an agent's water demand. Depth-to-groundwater was then used as an indicator of unique vegetation guilds within the riparian corridor. Each vegetation guild provides varying levels of ecosystem services, changes in which, along with changes in depth-to-groundwater, feed back to influence agent behavior. This modeling approach allowed us to examine the resilience of semi-arid riparian corridors and agent behavior under various scenarios. The insight provided by the model contributes to understanding how specific interventions may alter this complex social-ecological system in the future.
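
    The coupling of a Bayesian decision layer to agent behavior can be sketched as a conditional-probability-table lookup. The table entries below are invented for illustration; the study's probabilities were elicited from stakeholders via cognitive mapping:

```python
import random

# Toy conditional-probability table in the spirit of the paper: the chance
# that a water-user agent reduces pumping, conditioned on the observed
# drought state and on its neighbors' behavior. Probabilities are invented.
CPT = {
    ("drought", "neighbors_reduce"): 0.8,
    ("drought", "neighbors_pump"):   0.5,
    ("normal",  "neighbors_reduce"): 0.3,
    ("normal",  "neighbors_pump"):   0.1,
}

def agent_decides_to_reduce(climate, social, rng):
    """Each agent makes a stochastic decision from its perceived environment."""
    return rng.random() < CPT[(climate, social)]
```

    Because each agent samples from the table independently, uncertainty in human decision-making propagates naturally into the simulated groundwater dynamics.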

  18. Can human-like Bots control collective mood: agent-based simulations of online chats

    NASA Astrophysics Data System (ADS)

    Tadić, Bosiljka; Šuvakov, Milovan

    2013-10-01

    Using an agent-based modeling approach, in this paper, we study self-organized dynamics of interacting agents in the presence of chat Bots. Different Bots with tunable ‘human-like’ attributes, which exchange emotional messages with agents, are considered, and the collective emotional behavior of agents is quantitatively analyzed. In particular, using detrended fractal analysis we determine persistent fluctuations and temporal correlations in time series of agent activity and statistics of avalanches carrying emotional messages of agents when Bots favoring positive/negative affects are active. We determine the impact of Bots and identify parameters that can modulate that impact. Our analysis suggests that, by these measures, the emotional Bots induce collective emotion among interacting agents by suitably altering the fractal characteristics of the underlying stochastic process. Positive emotion Bots are slightly more effective than negative emotion Bots. Moreover, Bots which periodically alternate between positive and negative emotion can enhance fluctuations in the system, leading to avalanches of agent messages that are reminiscent of self-organized critical states.
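
    The avalanche statistic used above can be sketched as run detection over an activity time series. The thresholding convention is an assumption for illustration:

```python
def avalanches(activity, threshold=0):
    """Extract avalanche sizes from an agent-activity time series: an
    avalanche is a maximal run of consecutive time steps with activity
    above `threshold`, and its size is the total activity in that run."""
    sizes, current = [], 0
    for a in activity:
        if a > threshold:
            current += a        # extend the ongoing avalanche
        elif current:
            sizes.append(current)
            current = 0
    if current:                 # close an avalanche running to the end
        sizes.append(current)
    return sizes
```

    In the study, the distribution of such sizes (and the fractal character of the underlying series) is what the emotional Bots measurably alter.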

  19. Acceptability of an Embodied Conversational Agent-based Computer Application for Hispanic Women

    PubMed Central

    Wells, Kristen J.; Vázquez-Otero, Coralia; Bredice, Marissa; Meade, Cathy D.; Chaet, Alexis; Rivera, Maria I.; Arroyo, Gloria; Proctor, Sara K.; Barnes, Laura E.

    2015-01-01

    There are few Spanish-language interactive, technology-driven health education programs. The objectives of this feasibility study were to: 1) learn more about computer and technology usage among Hispanic women living in a rural community; and 2) evaluate the acceptability of the concept of using an embodied conversational agent (ECA) computer application among this population. A survey about computer usage history and interest in computers was administered to a convenience sample of 26 women. A sample video prototype of a hospital discharge ECA was shown, followed by questions to gauge opinions about the ECA. The data indicate the women exhibited both a high level of computer experience and enthusiasm for the ECA. Feedback from the community is essential to ensure equity in state-of-the-art dissemination of health information. PMID:26671558

  20. An Economic Analysis of Strategies to Control Clostridium Difficile Transmission and Infection Using an Agent-Based Simulation Model

    PubMed Central

    Nelson, Richard E.; Jones, Makoto; Leecaster, Molly; Samore, Matthew H.; Ray, William; Huttner, Angela; Huttner, Benedikt; Khader, Karim; Stevens, Vanessa W.; Gerding, Dale; Schweizer, Marin L.; Rubin, Michael A.

    2016-01-01

    Background A number of strategies exist to reduce Clostridium difficile (C. difficile) transmission. We conducted an economic evaluation of “bundling” these strategies together. Methods We constructed an agent-based computer simulation of nosocomial C. difficile transmission and infection in a hospital setting. This model included the following components: interactions between patients and health care workers; room contamination via C. difficile shedding; C. difficile hand carriage and removal via hand hygiene; patient acquisition of C. difficile via contact with contaminated rooms or health care workers; and patient antimicrobial use. Six interventions were introduced alone and "bundled" together: (a) aggressive C. difficile testing; (b) empiric isolation and treatment of symptomatic patients; (c) improved adherence to hand hygiene and (d) contact precautions; (e) improved use of soap and water for hand hygiene; and (f) improved environmental cleaning. Our analysis compared these interventions using values representing 3 different scenarios: (1) base-case (BASE) values that reflect typical hospital practice, (2) intervention (INT) values that represent implementation of hospital-wide efforts to reduce C. difficile transmission, and (3) optimal (OPT) values representing the highest expected results from strong adherence to the interventions. Cost parameters for each intervention were obtained from published literature. We performed our analyses assuming low, normal, and high C. difficile importation prevalence and transmissibility of C. difficile. Results INT levels of the “bundled” intervention were cost-effective at a willingness-to-pay threshold of $100,000/quality-adjusted life-year in all importation prevalence and transmissibility scenarios. OPT levels of intervention were cost-effective for normal and high importation prevalence and transmissibility scenarios. When analyzed separately, hand hygiene compliance, environmental decontamination, and empiric
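
    The cost-effectiveness criterion used above reduces to comparing an incremental cost-effectiveness ratio against the willingness-to-pay threshold. The dollar and QALY figures below are invented for illustration, not the study's results:

```python
def icer(cost_new, qaly_new, cost_base, qaly_base):
    """Incremental cost-effectiveness ratio: extra dollars per extra QALY."""
    return (cost_new - cost_base) / (qaly_new - qaly_base)

def cost_effective(cost_new, qaly_new, cost_base, qaly_base, wtp=100_000):
    """An intervention is deemed cost-effective when its ICER falls at or
    below the willingness-to-pay threshold (here $100,000/QALY, as in the
    abstract)."""
    return icer(cost_new, qaly_new, cost_base, qaly_base) <= wtp
```

    The agent-based simulation's role in such an analysis is to supply the expected costs and QALYs for each intervention bundle and scenario.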

  1. [Research on multi-agent based modeling and simulation of hospital system].

    PubMed

    Zhao, Junping; Yang, Hongqiao; Guo, Huayuan; Li, Yi; Zhang, Zhenjiang; Li, Shuzhang

    2010-12-01

    In this paper, the theory of complex adaptive systems (CAS) and its modeling method are introduced, and the complex characteristics of the hospital system are analyzed. Agile manufacturing and cell reconstruction technologies are used to reconstruct the hospital system. We then set forth an approach to simulating the hospital system based on Multi-Agent technology and the High Level Architecture (HLA). Finally, a simulation framework based on HLA for the hospital system is presented. PMID:21374992

  2. An agent-based framework for fuel cycle simulation with recycling

    SciTech Connect

    Gidden, M.J.; Wilson, P.P.H.; Huff, K.D.; Carlsen, R.W.

    2013-07-01

    Simulation of the nuclear fuel cycle is an established field with multiple players. Prior development work has utilized techniques such as system dynamics to provide a solution structure for the matching of supply and demand in these simulations. In general, however, simulation infrastructure development has occurred in relatively closed circles, each effort having unique considerations as to the cases which are desired to be modeled. Accordingly, individual simulators tend to have their design decisions driven by specific use cases. Presented in this work is a proposed supply and demand matching algorithm that leverages the techniques of the well-studied field of mathematical programming. A generic approach is achieved by treating facilities as individual entities and actors in the supply-demand market which denote preferences amongst commodities. Using such a framework allows for varying levels of interaction fidelity, ranging from low-fidelity, quick solutions to high-fidelity solutions that model individual transactions (e.g. at the fuel-assembly level). The power of the technique is that it allows such flexibility while still treating the problem in a generic manner, encapsulating simulation engine design decisions in such a way that future simulation requirements can be relatively easily added when needed. (authors)
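
    The preference-driven supply-demand matching idea can be sketched greedily. A real implementation would pose this as a mathematical program (as the paper proposes); the facility and commodity names below are invented, and one unit is exchanged per request:

```python
def match(requests, offers):
    """Greedy sketch of preference-based supply-demand matching.

    requests: list of (facility, {commodity: preference}) pairs
    offers:   {commodity: available_quantity}
    Each request is matched to its highest-preference commodity that still
    has remaining supply. Returns {facility: commodity}.
    """
    matches = {}
    for facility, prefs in requests:
        for commodity in sorted(prefs, key=prefs.get, reverse=True):
            if offers.get(commodity, 0) > 0:
                offers[commodity] -= 1
                matches[facility] = commodity
                break
    return matches
```

    Treating facilities as individual market actors this way is what lets the simulator vary interaction fidelity down to the fuel-assembly level.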

  3. Agent-Based Simulations of Malaria Transmissions with Applications to a Study Site in Thailand

    NASA Technical Reports Server (NTRS)

    Kiang, Richard K.; Adimi, Farida; Zollner, Gabriela E.; Coleman, Russell E.

    2006-01-01

    The dynamics of malaria transmission are driven by environmental, biotic and socioeconomic factors. Because of the geographic dependency of these factors and the complex interactions among them, it is difficult to generalize the key factors that perpetuate or intensify malaria transmission. Methods: Discrete event simulations were used for modeling the detailed interactions among the vector life cycle, sporogonic cycle and human infection cycle, under the explicit influences of selected extrinsic and intrinsic factors. Meteorological and environmental parameters may be derived from satellite data. The output of the model includes the individual infection status and the quantities normally observed in field studies, such as mosquito biting rates, sporozoite infection rates, gametocyte prevalence and incidence. Results were compared with mosquito vector and human malaria data acquired over 4.5 years (June 1999 - January 2004) in Kong Mong Tha, a remote village in Kanchanaburi Province, western Thailand. Results: Three years of transmissions of vivax and falciparum malaria were simulated for a hypothetical hamlet with approximately 1/7 of the study site population. The model generated results for a number of scenarios, including applications of larvicide and insecticide, asymptomatic cases receiving or not receiving treatment, blocking malaria transmission in mosquito vectors, and increasing the density of farm (host) animals in the hamlet. Transmission characteristics and trends in the simulated results are comparable to actual data collected at the study site.

  4. ActivitySim: large-scale agent based activity generation for infrastructure simulation

    SciTech Connect

    Gali, Emmanuel; Eidenbenz, Stephan; Mniszewski, Sue; Cuellar, Leticia; Teuscher, Christof

    2008-01-01

    The United States' Department of Homeland Security aims to model, simulate, and analyze critical infrastructure and its interdependencies across multiple sectors such as electric power, telecommunications, water distribution, and transportation. We introduce ActivitySim, an activity simulator for a population of millions of individual agents, each characterized by a set of demographic attributes based on US census data. ActivitySim generates daily schedules for each agent consisting of a sequence of activities, such as sleeping, shopping, and working, each scheduled at a geographic location, such as a business or private residence, that is appropriate for the activity type and for the personal situation of the agent. ActivitySim has been developed as part of a larger effort to understand the interdependencies among national infrastructure networks and the demand profiles that emerge from the different activities of individuals in baseline scenarios as well as emergency scenarios, such as hurricane evacuations. We present the scalable software engineering principles underlying ActivitySim, the socio-technical modeling paradigms that drive the activity generation, and proof-of-principle results for a scenario covering 2.6 M agents in the Twin Cities, MN area.
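
    Demographically conditioned schedule generation can be sketched as below. The attribute names, activity set, and durations are assumptions for illustration, not ActivitySim's actual schema:

```python
import random

def daily_schedule(agent, rng):
    """Toy ActivitySim-style schedule: a fixed sleep block, a work block for
    employed agents, an optional shopping trip, and home time filling the
    rest of the 24-hour day. Returns a list of (activity, hours) pairs."""
    schedule = [("sleep", 8)]
    if agent.get("employed"):
        schedule.append(("work", 8))
    if rng.random() < 0.3:          # assumed probability of a shopping trip
        schedule.append(("shop", 1))
    used = sum(hours for _, hours in schedule)
    schedule.append(("home", 24 - used))
    return schedule
```

    In the real system each activity is additionally assigned to a geographic location appropriate for the agent, which is what produces the infrastructure demand profiles.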

  5. The contribution of agent-based simulations to conservation management on a Natura 2000 site.

    PubMed

    Dupont, Hélène; Gourmelon, Françoise; Rouan, Mathias; Le Viol, Isabelle; Kerbiriou, Christian

    2016-03-01

    The conservation of biodiversity today must include the participation and support of local stakeholders. Natura 2000 can be considered a conservation system that, as applied in most EU countries, relies on the participation of local stakeholders. Our study proposes a scientific method for participatory modelling, with the aim of contributing to the conservation management of habitats and species at a Natura 2000 site (Crozon Peninsula, Bretagne, France) that is representative of land-use changes in coastal areas. We make use of companion modelling and its associated tools (scenario planning, GIS, multi-agent modelling and simulations) to consider possible futures through the co-construction of management scenarios and an understanding of their consequences for different indicators of biodiversity status (habitats, avifauna, flora). Maintaining human activities as they have been carried out since the creation of the Natura 2000 zone allows the biodiversity values to remain stable. Extensive agricultural activities have been shown to be essential to this maintenance, whereas management sustained by the multiplication of conservation actions brings about variable results according to the indicators. None of the scenarios has a positive impact on the full set of indicators. However, an understanding of the modelling system and the results of the simulations allows the selection of conservation actions to be refined in relation to the species to be preserved. PMID:26696603

  6. Evolutionary Agent-Based Simulation of the Introduction of New Technologies in Air Traffic Management

    NASA Technical Reports Server (NTRS)

    Yliniemi, Logan; Agogino, Adrian K.; Tumer, Kagan

    2014-01-01

    Accurate simulation of the effects of integrating new technologies into a complex system is critical to the modernization of our antiquated air traffic system, where there exist many layers of interacting procedures, controls, and automation all designed to cooperate with human operators. Additions of even simple new technologies may result in unexpected emergent behavior due to complex human/machine interactions. One approach is to create high-fidelity human models from the field of human factors that can simulate a rich set of behaviors. However, such models are difficult to produce, especially when trying to show unexpected emergent behavior arising from many human operators interacting simultaneously within a complex system. Instead of engineering complex human models, we directly model the emergent behavior by evolving goal-directed agents representing human users. Using evolution, we can predict how the agent representing the human user reacts given his/her goals. In this paradigm, each autonomous agent in a system pursues individual goals, and the behavior of the system emerges from the interactions, foreseen or unforeseen, between the agents/actors. We show that this method reflects the integration of new technologies in a historical case, and we apply the same methodology to a possible future technology.

  7. An Agent-Based Simulation for Investigating the Impact of Stereotypes on Task-Oriented Group Formation

    NASA Astrophysics Data System (ADS)

    Maghami, Mahsa; Sukthankar, Gita

    In this paper, we introduce an agent-based simulation for investigating the impact of social factors on the formation and evolution of task-oriented groups. Task-oriented groups are created explicitly to perform a task, and all members derive benefits from task completion. However, even in cases when all group members act in a way that is locally optimal for task completion, social forces that have mild effects on choice of associates can have a measurable impact on task completion performance. In this paper, we show how our simulation can be used to model the impact of stereotypes on group formation. In our simulation, stereotypes are based on observable features, learned from prior experience, and only affect an agent's link formation preferences. Even without assuming stereotypes affect the agents' willingness or ability to complete tasks, the long-term modifications that stereotypes have on the agents' social network impair the agents' ability to form groups with sufficient diversity of skills, as compared to agents who form links randomly. An interesting finding is that this effect holds even in cases where stereotype preference and skill existence are completely uncorrelated.
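
    The key mechanism above, stereotypes biasing only link formation and never task ability, can be sketched as weighted partner selection over observable features. The feature labels and weights are invented for illustration:

```python
import random

def pick_partner(candidates, stereotype, rng):
    """Draw a collaboration partner with probability proportional to the
    stereotype score the chooser holds about each candidate's observable
    feature. Note the stereotype never touches skills or task completion."""
    weights = [stereotype[c["feature"]] for c in candidates]
    r = rng.random() * sum(weights)
    acc = 0.0
    for candidate, w in zip(candidates, weights):
        acc += w
        if r <= acc:
            return candidate
    return candidates[-1]
```

    Repeated over many group-formation rounds, such biased link choices skew the social network and, as the paper reports, starve groups of skill diversity even though ability itself is untouched.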

  8. Exploring Tradeoffs in Demand-side and Supply-side Management of Urban Water Resources using Agent-based Modeling and Evolutionary Computation

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Berglund, E. Z.

    2015-12-01

    Urban water supply systems may be managed through supply-side and demand-side strategies, which focus on water source expansion and demand reductions, respectively. Supply-side strategies bear infrastructure and energy costs, while demand-side strategies bear costs of implementation and inconvenience to consumers. To evaluate the performance of demand-side strategies, the participation and water use adaptations of consumers should be simulated. In this study, a Complex Adaptive Systems (CAS) framework is developed to simulate consumer agents that change their consumption to affect the withdrawal from the water supply system, which, in turn influences operational policies and long-term resource planning. Agent-based models are encoded to represent consumers and a policy maker agent and are coupled with water resources system simulation models. The CAS framework is coupled with an evolutionary computation-based multi-objective methodology to explore tradeoffs in cost, inconvenience to consumers, and environmental impacts for both supply-side and demand-side strategies. Decisions are identified to specify storage levels in a reservoir that trigger (1) increases in the volume of water pumped through inter-basin transfers from an external reservoir and (2) drought stages, which restrict the volume of water that is allowed for residential outdoor uses. The proposed methodology is demonstrated for Arlington, Texas, water supply system to identify non-dominated strategies for an historic drought decade. Results demonstrate that pumping costs associated with maximizing environmental reliability exceed pumping costs associated with minimizing restrictions on consumer water use.

  9. Modeling the Internet of Things, Self-Organizing and Other Complex Adaptive Communication Networks: A Cognitive Agent-Based Computing Approach

    PubMed Central

    2016-01-01

    Background Computer networks have a tendency to grow at an unprecedented scale. Modern networks involve not only computers but also a wide variety of other interconnected devices ranging from mobile phones to other household items fitted with sensors. This vision of the "Internet of Things" (IoT) implies an inherent difficulty in modeling problems. Purpose It is practically impossible to implement and test all scenarios for large-scale and complex adaptive communication networks as part of Complex Adaptive Communication Networks and Environments (CACOONS). The goal of this study is to explore the use of agent-based modeling as part of the Cognitive Agent-based Computing (CABC) framework to model a complex communication network problem. Method We use Exploratory Agent-based Modeling (EABM), as part of the CABC framework, to develop an autonomous multi-agent architecture for managing carbon footprint in a corporate network. To evaluate the application of complexity in practical scenarios, we have also introduced a company-defined computer usage policy. Results The experiments demonstrated two important results: first, a CABC-based modeling approach such as agent-based modeling can be an effective way to model complex problems in the domain of IoT; second, the specific problem of managing the carbon footprint can be solved using a multi-agent system approach. PMID:26812235

  10. An Agent-Based Model of a Hepatic Inflammatory Response to Salmonella: A Computational Study under a Large Set of Experimental Data

    PubMed Central

    Shi, Zhenzhen; Chapes, Stephen K.; Ben-Arieh, David; Wu, Chih-Hang

    2016-01-01

    We present an agent-based model (ABM) to simulate a hepatic inflammatory response (HIR) in a mouse infected by Salmonella that sometimes progressed to problematic proportions, known as “sepsis”. Based on over 200 published studies, this ABM describes interactions among 21 cells or cytokines and incorporates 226 experimental data sets and/or data estimates from those reports to simulate a mouse HIR in silico. Our simulated results reproduced dynamic patterns of HIR reported in the literature. As shown in vivo, our model also demonstrated that sepsis was highly related to the initial Salmonella dose and the presence of components of the adaptive immune system. We determined that high mobility group box-1, C-reactive protein, the interleukin-10: tumor necrosis factor-α ratio, and the CD4+ T cell: CD8+ T cell ratio, all recognized as biomarkers during HIR, significantly correlated with outcomes of HIR. During therapy-directed in silico simulations, our results demonstrated that anti-agent intervention impacted the survival rates of septic individuals in a time-dependent manner. By specifying the infected species, source of infection, and site of infection, this ABM enabled us to reproduce the kinetics of several essential indicators during a HIR, observe distinct dynamic patterns that are manifested during HIR, and test proposed therapy-directed treatments. Although limitations still exist, this ABM is a step forward because it links underlying biological processes to computational simulation and was validated through a series of comparisons between the simulated results and experimental studies. PMID:27556404

  11. An Agent-Based Model of a Hepatic Inflammatory Response to Salmonella: A Computational Study under a Large Set of Experimental Data.

    PubMed

    Shi, Zhenzhen; Chapes, Stephen K; Ben-Arieh, David; Wu, Chih-Hang

    2016-01-01

    We present an agent-based model (ABM) to simulate a hepatic inflammatory response (HIR) in a mouse infected by Salmonella that sometimes progressed to problematic proportions, known as "sepsis". Based on over 200 published studies, this ABM describes interactions among 21 cells or cytokines and incorporates 226 experimental data sets and/or data estimates from those reports to simulate a mouse HIR in silico. Our simulated results reproduced dynamic patterns of HIR reported in the literature. As shown in vivo, our model also demonstrated that sepsis was highly related to the initial Salmonella dose and the presence of components of the adaptive immune system. We determined that high mobility group box-1, C-reactive protein, the interleukin-10: tumor necrosis factor-α ratio, and the CD4+ T cell: CD8+ T cell ratio, all recognized as biomarkers during HIR, significantly correlated with outcomes of HIR. During therapy-directed in silico simulations, our results demonstrated that anti-agent intervention impacted the survival rates of septic individuals in a time-dependent manner. By specifying the infected species, source of infection, and site of infection, this ABM enabled us to reproduce the kinetics of several essential indicators during a HIR, observe distinct dynamic patterns that are manifested during HIR, and test proposed therapy-directed treatments. Although limitations still exist, this ABM is a step forward because it links underlying biological processes to computational simulation and was validated through a series of comparisons between the simulated results and experimental studies. PMID:27556404

  12. Agent-based Modeling to Simulate the Diffusion of Water-Efficient Innovations and the Emergence of Urban Water Sustainability

    NASA Astrophysics Data System (ADS)

    Kanta, L.; Giacomoni, M.; Shafiee, M. E.; Berglund, E.

    2014-12-01

    The sustainability of water resources is threatened by urbanization, as increasing demands deplete water availability, and changes to the landscape alter runoff and the flow regime of receiving water bodies. Utility managers typically manage urban water resources through the use of centralized solutions, such as large reservoirs, which may be limited in their ability to balance the needs of urbanization and ecological systems. Decentralized technologies, on the other hand, may improve the health of the water resources system and deliver urban water services. For example, low impact development technologies, such as rainwater harvesting, and water-efficient technologies, such as low-flow faucets and toilets, may be adopted by households to retain rainwater and reduce demands, offsetting the need for new centralized infrastructure. Decentralized technologies may create new complexities in infrastructure and water management, as decentralization depends on community behavior and participation beyond traditional water resources planning. Messages about water shortages and water quality from peers and from water utility managers can influence the adoption of new technologies. As a result, feedbacks between consumers and water resources emerge, creating a complex system. This research develops a framework to simulate the diffusion of water-efficient innovations and the sustainability of urban water resources by coupling models of households in a community, hydrologic models of a water resources system, and a cellular automata model of land use change. Agent-based models are developed to simulate the land use and water demand decisions of individual households, and behavioral rules are encoded to simulate communication with other agents and adoption of decentralized technologies, using a model of the diffusion of innovation. The framework is applied to an illustrative case study to simulate water resources sustainability over a long-term planning horizon.
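
    The diffusion-of-innovation rule referenced above can be sketched as a simple threshold model: an agent adopts once enough of its neighbors have. The network shape and threshold below are illustrative assumptions, not the study's encoded behavioral rules:

```python
def diffuse(network, seeds, influence=0.5, steps=10):
    """Threshold-style diffusion of a water-efficient technology.

    network: {agent: [neighbor, ...]}; seeds: initially adopting agents.
    An agent adopts once the adopting fraction of its neighbors reaches
    `influence`. Returns the final adopter set.
    """
    adopters = set(seeds)
    for _ in range(steps):
        new = {i for i, nbrs in network.items()
               if i not in adopters and nbrs
               and sum(j in adopters for j in nbrs) / len(nbrs) >= influence}
        if not new:          # diffusion has stalled
            break
        adopters |= new
    return adopters
```

    Coupling the adopter set back into a hydrologic demand model is what closes the consumer-resource feedback loop the abstract describes.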

  13. The Basic Immune Simulator: An agent-based model to study the interactions between innate and adaptive immunity

    PubMed Central

    Folcik, Virginia A; An, Gary C; Orosz, Charles G

    2007-01-01

    Background We introduce the Basic Immune Simulator (BIS), an agent-based model created to study the interactions between the cells of the innate and adaptive immune system. Innate immunity, the initial host response to a pathogen, generally precedes adaptive immunity, which generates immune memory for an antigen. The BIS simulates basic cell types, mediators and antibodies, and consists of three virtual spaces representing parenchymal tissue, secondary lymphoid tissue and the lymphatic/humoral circulation. The BIS includes a Graphical User Interface (GUI) to facilitate its use as an educational and research tool. Results The BIS was used to qualitatively examine the innate and adaptive interactions of the immune response to a viral infection. Calibration was accomplished via a parameter sweep of initial agent population size, and comparison of simulation patterns to those reported in the basic science literature. The BIS demonstrated that the degree of the initial innate response was a crucial determinant for an appropriate adaptive response. Deficiency or excess in innate immunity resulted in excessive proliferation of adaptive immune cells. Deficiency in any of the immune system components increased the probability of failure to clear the simulated viral infection. Conclusion The behavior of the BIS matches both normal and pathological behavior patterns in a generic viral infection scenario. Thus, the BIS effectively translates mechanistic cellular and molecular knowledge regarding the innate and adaptive immune response and reproduces the immune system's complex behavioral patterns. The BIS can be used both as an educational tool to demonstrate the emergence of these patterns and as a research tool to systematically identify potential targets for more effective treatment strategies for disease processes including hypersensitivity reactions (allergies, asthma), autoimmunity and cancer.
We believe that the BIS can be a useful addition to the growing suite of in

  14. Multiobjective Decision Making Policies and Coordination Mechanisms in Hierarchical Organizations: Results of an Agent-Based Simulation

    PubMed Central

    2014-01-01

    This paper analyses how different coordination modes and different multiobjective decision making approaches interfere with each other in hierarchical organizations. The investigation is based on an agent-based simulation. We apply a modified NK-model in which we map multiobjective decision making as an adaptive walk on multiple performance landscapes, whereby each landscape represents one objective. We find that the impact of the coordination mode on the performance and the speed of performance improvement is critically affected by the selected multiobjective decision making approach. In certain setups, the performances achieved with the more complex multiobjective decision making approaches turn out to be less sensitive to the coordination mode than the performances achieved with the less complex approaches. Furthermore, we present results on the impact of the nature of interactions among decisions on the achieved performance in multiobjective setups. Our results give guidance on how to control the performance contribution of objectives to overall performance, and answer the question of how effectively certain multiobjective decision making approaches perform under certain circumstances (coordination mode and interdependencies among decisions). PMID:25152926
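    The "adaptive walk on multiple performance landscapes" can be illustrated with a small stdlib-only sketch. This is not the authors' modified NK-model; the aggregation rule (simple mean across objectives) and all parameter values are illustrative assumptions.

```python
import random

def make_landscape(N, K, rng):
    """Classic NK structure: decision i's fitness contribution depends on
    its own state and the states of K randomly chosen other decisions.
    Contributions are drawn lazily and cached so each key is stable."""
    tables = [{} for _ in range(N)]
    deps = [tuple(rng.sample([j for j in range(N) if j != i], K))
            for i in range(N)]
    def fitness(bits):
        total = 0.0
        for i in range(N):
            key = (bits[i],) + tuple(bits[j] for j in deps[i])
            if key not in tables[i]:
                tables[i][key] = rng.random()
            total += tables[i][key]
        return total / N
    return fitness

def adaptive_walk(N=10, K=2, objectives=2, steps=200, seed=1):
    """Multiobjective adaptive walk: flip one decision, keep the flip if
    the mean performance over all objective landscapes does not drop."""
    rng = random.Random(seed)
    fs = [make_landscape(N, K, rng) for _ in range(objectives)]
    bits = [rng.randint(0, 1) for _ in range(N)]
    score = sum(f(bits) for f in fs) / objectives
    for _ in range(steps):
        i = rng.randrange(N)
        bits[i] ^= 1
        new = sum(f(bits) for f in fs) / objectives
        if new >= score:
            score = new
        else:
            bits[i] ^= 1  # reject the flip and restore the old state
    return score
```

    Replacing the mean with a lexicographic or weighted rule changes which flips are accepted, which is exactly the kind of interaction between decision making approach and coordination the paper studies.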

  15. Multiobjective decision making policies and coordination mechanisms in hierarchical organizations: results of an agent-based simulation.

    PubMed

    Leitner, Stephan; Wall, Friederike

    2014-01-01

    This paper analyses how different coordination modes and different multiobjective decision making approaches interfere with each other in hierarchical organizations. The investigation is based on an agent-based simulation. We apply a modified NK-model in which we map multiobjective decision making as an adaptive walk on multiple performance landscapes, whereby each landscape represents one objective. We find that the impact of the coordination mode on the performance and the speed of performance improvement is critically affected by the selected multiobjective decision making approach. In certain setups, the performances achieved with the more complex multiobjective decision making approaches turn out to be less sensitive to the coordination mode than the performances achieved with the less complex approaches. Furthermore, we present results on the impact of the nature of interactions among decisions on the achieved performance in multiobjective setups. Our results give guidance on how to control the performance contribution of objectives to overall performance, and answer the question of how effectively certain multiobjective decision making approaches perform under certain circumstances (coordination mode and interdependencies among decisions). PMID:25152926

  16. Modeling the 2014 Ebola Virus Epidemic - Agent-Based Simulations, Temporal Analysis and Future Predictions for Liberia and Sierra Leone.

    PubMed

    Siettos, Constantinos; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2015-01-01

    We developed an agent-based model to investigate the epidemic dynamics of Ebola virus disease (EVD) in Liberia and Sierra Leone from May 27 to December 21, 2014. The dynamics of the agent-based simulator evolve on small-world transmission networks of sizes equal to the population of each country, with adjustable densities to account for the effects of public health intervention policies and individual behavioral responses to the evolving epidemic. Based on time series of the official case counts from the World Health Organization (WHO), we provide estimates for key epidemiological variables by employing the so-called Equation-Free approach. The underlying transmission networks were characterized by rather random structures in the two countries with densities decreasing by ~19% from the early (May 27-early August) to the last period (mid October-December 21). Our estimates for the values of key epidemiological variables, such as the mean time to death, recovery and the case fatality rate, are very close to the ones reported by the WHO Ebola response team during the early period of the epidemic (until September 14) that were calculated based on clinical data. Specifically, regarding the effective reproductive number Re, our analysis suggests that until mid October, Re was above 2.3 in both countries; from mid October to December 21, Re dropped well below unity in Liberia, indicating a saturation of the epidemic, while in Sierra Leone it was around 1.9, indicating an ongoing epidemic. Accordingly, a ten-week projection from December 21 estimated that the epidemic would fade out in Liberia in early March; in contrast, our results flashed a note of caution for Sierra Leone, since the cumulative number of cases could reach as high as 18,000, and the number of deaths might exceed 5,000, by early March 2015.
However, by processing the reported data of the very last period (December 21, 2014-January 18, 2015), we obtained more optimistic estimates indicative of a remission of
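    A toy version of an epidemic unfolding on a small-world contact network can be written with the standard library alone. This sketch is far simpler than the paper's multiscale simulator; the Watts-Strogatz-style rewiring, the SIR (rather than a richer compartment) structure, and every parameter value are illustrative assumptions.

```python
import random

def small_world(n, k, beta, rng):
    """Ring lattice with k neighbors on each side; every edge is rewired
    to a random endpoint with probability beta (Watts-Strogatz style)."""
    edges = set()
    for i in range(n):
        for j in range(1, k + 1):
            a, b = i, (i + j) % n
            if rng.random() < beta:
                b = rng.randrange(n)
            if a != b:
                edges.add((min(a, b), max(a, b)))
    nbrs = [[] for _ in range(n)]
    for a, b in edges:
        nbrs[a].append(b)
        nbrs[b].append(a)
    return nbrs

def run_sir(n=2000, k=3, beta=0.1, p_inf=0.05, days_inf=8, seed=3):
    """Agent-based SIR on the network with per-contact daily transmission
    probability p_inf; returns the cumulative number ever infected."""
    rng = random.Random(seed)
    nbrs = small_world(n, k, beta, rng)
    state = ['S'] * n           # Susceptible / Infectious / Removed
    clock = [0] * n
    for i in rng.sample(range(n), 5):
        state[i] = 'I'          # seed five index cases
    total = 5
    while 'I' in state:
        new = []
        for i in range(n):
            if state[i] == 'I':
                clock[i] += 1
                for j in nbrs[i]:
                    if state[j] == 'S' and rng.random() < p_inf:
                        new.append(j)
                if clock[i] >= days_inf:
                    state[i] = 'R'
        for j in new:
            if state[j] == 'S':
                state[j] = 'I'
                total += 1
    return total
```

    Lowering the network density (smaller `k`) or the per-contact probability plays the role of the intervention policies whose effects the paper encodes as adjustable densities.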

  17. Understanding coupled natural and human systems on fire prone landscapes: integrating wildfire simulation into an agent based planning system.

    NASA Astrophysics Data System (ADS)

    Barros, Ana; Ager, Alan; Preisler, Haiganoush; Day, Michelle; Spies, Tom; Bolte, John

    2015-04-01

    Agent-based models (ABM) allow users to examine the long-term effects of agent decisions in complex systems where multiple agents and processes interact. This framework has potential application to study the dynamics of coupled natural and human systems where multiple stimuli determine trajectories over both space and time. We used Envision, a landscape-based ABM, to analyze long-term wildfire dynamics in a heterogeneous, multi-owner landscape in Oregon, USA. Landscape dynamics are affected by land management policies, actors' decisions, and autonomous processes such as vegetation succession, wildfire, or at a broader scale, climate change. Key questions include: 1) how are landscape dynamics influenced by policies and institutions, and 2) how do land management policies and actors' decisions interact to produce intended and unintended consequences with respect to wildfire on fire-prone landscapes? Applying Envision to address these questions required the development of a wildfire module that could accurately simulate wildfires on the heterogeneous landscapes within the study area, in terms of replicating historical fire size distribution, spatial distribution and fire intensity. In this paper we describe the development and testing of a mechanistic fire simulation system within Envision and the application of the model on a 3.2 million fire-prone landscape in central Oregon, USA. The core fire spread equations use the Minimum Travel Time algorithm developed by M. Finney. The model operates on a daily time step and uses a fire prediction system based on the relationship between the energy release component (ERC) and historical fires. Specifically, daily wildfire probabilities and sizes are generated from statistical analyses of historical fires in relation to daily ERC values. The MTT was coupled with the vegetation dynamics module in Envision to allow communication between the respective subsystems and effectively model fire effects and vegetation dynamics after a wildfire.
Canopy and

  18. Modelling Temporal Schedule of Urban Trains Using Agent-Based Simulation and NSGA2-BASED Multiobjective Optimization Approaches

    NASA Astrophysics Data System (ADS)

    Sahelgozin, M.; Alimohammadi, A.

    2015-12-01

    Increasing distances between locations of residence and services lead to a large number of daily commutes in urban areas. Developing subway systems has drawn the attention of transportation managers as a response to this huge travel demand. In the development of subway infrastructures, producing a temporal schedule for trains is an important task, because an appropriately designed timetable decreases total passenger travel time, total operation cost and the energy consumption of trains. Since these variables are not positively correlated, subway scheduling is considered a multi-criteria optimization problem, and proposing a proper solution for subway scheduling has always been a controversial issue. On the other hand, research on a phenomenon requires a summarized representation of the real world, known as a model. In this study, we attempt to model the temporal schedule of urban trains for application in Multi-Criteria Subway Schedule Optimization (MCSSO) problems. At first, a conceptual framework is represented for MCSSO. Then, an agent-based simulation environment is implemented to perform Sensitivity Analysis (SA), which is used to extract the interrelations between the framework components. These interrelations are then taken into account in order to construct the proposed model. To evaluate the performance of the model in MCSSO problems, Tehran subway line no. 1 is considered as the case study. Results show that the model was able to generate an acceptable distribution of Pareto-optimal solutions that are applicable in real situations when solving an MCSSO problem is the goal. The accuracy of the model in representing the operation of subway systems was also significant.
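    The Pareto-optimality notion at the heart of NSGA-II-based scheduling can be shown with the non-dominated filter that forms the algorithm's first rank. The timetable tuples below are invented example data, not results from the paper.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives are minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Return the non-dominated set: the core filter used to build the
    first rank in NSGA-II's non-dominated sorting."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# Hypothetical (travel time, operating cost, energy) per candidate timetable
timetables = [(30, 5, 8), (25, 7, 9), (30, 5, 7), (40, 4, 6), (25, 8, 10)]
front = pareto_front(timetables)
```

    Here `(30, 5, 8)` is dominated by `(30, 5, 7)` (equal on two objectives, strictly better on energy), so it drops out of the front; the surviving timetables each trade one objective against another.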

  19. Analysis of CDC social control measures using an agent-based simulation of an influenza epidemic in a city

    PubMed Central

    2011-01-01

    Background The transmission of infectious disease amongst the human population is a complex process which requires advanced, often individual-based, models to capture the space-time details observed in reality. Methods An Individual Space-Time Activity-based Model (ISTAM) was applied to simulate the effectiveness of non-pharmaceutical control measures including: (1) refraining from social activities, (2) school closure and (3) household quarantine, for a hypothetical influenza outbreak in an urban area. Results Amongst the set of control measures tested, refraining from social activities with various compliance levels was relatively ineffective. Household quarantine was very effective, especially for the peak number of cases and total number of cases, with large differences between compliance levels. Household quarantine resulted in a decrease in the peak number of cases from more than 300 to around 158 for a 100% compliance level, a decrease of about 48.7%. The delay in the outbreak peak was about 3 to 17 days. The total number of cases decreased to a range of 3635-5403, that is, 63.7%-94.7% of the baseline value. When coupling control measures, household quarantine together with school closure was the most effective strategy. The resulting space-time distribution of infection in different classes of activity bundles (AB) suggests that the epidemic outbreak is strengthened amongst children and then spreads to adults. Through sensitivity analysis, this study demonstrated that earlier implementation of control measures leads to greater efficacy. Also, for infectious diseases with larger basic reproduction number, the effectiveness of non-pharmaceutical measures was shown to be limited. Conclusions Simulated results showed that household quarantine was the most effective control measure, while school closure and household quarantine implemented together achieved the greatest benefit. Agent-based models should be applied in the future to evaluate the efficacy of control

  20. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    NASA Astrophysics Data System (ADS)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations, we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics, we have developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network, as well as the per-contact transmission probabilities by age group, we exploited the so-called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data on key epidemiological variables, such as the mean time to death, recovery and the case fatality rate.

  1. Agent Based Modeling Applications for Geosciences

    NASA Astrophysics Data System (ADS)

    Stein, J. S.

    2004-12-01

    Agent-based modeling techniques have successfully been applied to systems in which complex behaviors or outcomes arise from varied interactions between individuals in the system. Each individual interacts with its environment, as well as with other individuals, by following a set of relatively simple rules. Traditionally this "bottom-up" modeling approach has been applied to problems in the fields of economics and sociology, but more recently has been introduced to various disciplines in the geosciences. This technique can help explain the origin of complex processes from a relatively simple set of rules, incorporate large and detailed datasets when they exist, and simulate the effects of extreme events on system-wide behavior. Some of the challenges associated with this modeling method include significant computational requirements to keep track of thousands to millions of agents, a lack of methods and strategies for model validation, and the absence of a formal methodology for evaluating model uncertainty. Challenges specific to the geosciences include how to define agents that control water, contaminant fluxes, climate forcing and other physical processes, and how to link these "geo-agents" into larger agent-based simulations that include social systems such as demographics, economics and regulations. Effective management of limited natural resources (such as water, hydrocarbons, or land) requires an understanding of what factors influence the demand for these resources on a regional and temporal scale. Agent-based models can be used to simulate this demand across a variety of sectors under a range of conditions and determine effective and robust management policies and monitoring strategies. The recent focus on the role of biological processes in the geosciences is another example of an area that could benefit from agent-based applications.
A typical approach to modeling the effect of biological processes in geologic media has been to represent these processes in

  2. Computationally efficient multibody simulations

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Jayant; Kumar, Manoj

    1994-01-01

    Computationally efficient approaches to the solution of the dynamics of multibody systems are presented in this work. The computational efficiency is derived from both the algorithmic and implementational standpoint. Order(n) approaches provide a new formulation of the equations of motion eliminating the assembly and numerical inversion of a system mass matrix as required by conventional algorithms. Computational efficiency is also gained in the implementation phase by the symbolic processing and parallel implementation of these equations. Comparison of this algorithm with existing multibody simulation programs illustrates the increased computational efficiency.

  3. Modeling the transmission of community-associated methicillin-resistant Staphylococcus aureus: a dynamic agent-based simulation

    PubMed Central

    2014-01-01

    Background Methicillin-resistant Staphylococcus aureus (MRSA) has been a deadly pathogen in healthcare settings since the 1960s, but MRSA epidemiology has changed since 1990, with new genetically distinct strain types circulating among previously healthy people outside healthcare settings. Community-associated (CA) MRSA strains primarily cause skin and soft tissue infections, but may also cause life-threatening invasive infections. First seen in Australia and the U.S., it is a growing problem around the world. The U.S. has had the most widespread CA-MRSA epidemic, with strain type USA300 causing the great majority of infections. Individuals with either asymptomatic colonization or infection may transmit CA-MRSA to others, largely by skin-to-skin contact. Control measures have focused on hospital transmission. Limited public health education has focused on care for skin infections. Methods We developed a fine-grained agent-based model for Chicago to identify where to target interventions to reduce CA-MRSA transmission. An agent-based model allows us to represent heterogeneity in population behavior, locations and contact patterns that are highly relevant for CA-MRSA transmission and control. Drawing on nationally representative survey data, the model represents variation in sociodemographics, locations, behaviors, and physical contact patterns. Transmission probabilities are based on a comprehensive literature review. Results Over multiple 10-year runs with one-hour ticks, our model generates temporal and geographic trends in CA-MRSA incidence similar to Chicago from 2001 to 2010. On average, a majority of transmission events occurred in households, and colonized rather than infected agents were the source of the great majority (over 95%) of transmission events. The key findings are that infected people are not the primary source of spread. Rather, the far greater number of colonized individuals must be targeted to reduce transmission. Conclusions Our findings suggest

  4. Toward a Multi-Scale Computational Model of Arterial Adaptation in Hypertension: Verification of a Multi-Cell Agent Based Model

    PubMed Central

    Thorne, Bryan C.; Hayenga, Heather N.; Humphrey, Jay D.; Peirce, Shayn M.

    2011-01-01

    Agent-based models (ABMs) represent a novel approach to study and simulate complex mechano-chemo-biological responses at the cellular level. Such models have been used to simulate a variety of emergent responses in the vasculature, including angiogenesis and vasculogenesis. Although not used previously to study large vessel adaptations, we submit that ABMs will prove equally useful in such studies when combined with well-established continuum models to form multi-scale models of tissue-level phenomena. In order to couple agent-based and continuum models, however, there is a need to ensure that each model faithfully represents the best data available at the relevant scale and that there is consistency between models under baseline conditions. Toward this end, we describe the development and verification of an ABM of endothelial and smooth muscle cell responses to mechanical stimuli in a large artery. A refined rule-set is proposed based on a broad literature search, a new scoring system for assigning confidence in the rules, and a parameter sensitivity study. To illustrate the utility of these new methods for rule selection, as well as the consistency achieved with continuum-level models, we simulate the behavior of a mouse aorta during homeostasis and in response to both transient and sustained increases in pressure. The simulated responses depend on the altered cellular production of seven key mitogenic, synthetic, and proteolytic biomolecules, which in turn control the turnover of intramural cells and extracellular matrix. These events are responsible for gross changes in vessel wall morphology. This new ABM is shown to be appropriately stable under homeostatic conditions, insensitive to transient elevations in blood pressure, and responsive to increased intramural wall stress in hypertension. PMID:21720536

  5. Computer Modeling and Simulation

    SciTech Connect

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance, and they have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation, generally applicable in practice and based on differences in epistemic strategies and scopes

  6. Accelerator simulation using computers

    SciTech Connect

    Lee, M.; Zambre, Y.; Corbett, W.

    1992-01-01

    Every accelerator or storage ring system consists of a charged particle beam propagating through a beam line. Although a number of computer programs exist that simulate the propagation of a beam in a given beam line, only a few provide the capabilities for designing, commissioning and operating the beam line. This paper shows how a "multi-track" simulation and analysis code can be used for these applications.

  8. Computer-simulated phacoemulsification

    NASA Astrophysics Data System (ADS)

    Laurell, Carl-Gustaf; Nordh, Leif; Skarman, Eva; Andersson, Mats; Nordqvist, Per

    2001-06-01

    Phacoemulsification makes the cataract operation easier for the patient but involves a demanding technique for the surgeon. It is therefore important to increase the quality of surgical training in order to shorten the learning period for the beginner. This should diminish the risks to the patient. We are developing a computer-based simulator for training in phacoemulsification. The simulator is built on a platform that can be used as a basis for several different training simulators. A prototype has been made and has been partly tested by experienced surgeons.

  9. Ideal free distribution or dynamic game? An agent-based simulation study of trawling strategies with varying information

    NASA Astrophysics Data System (ADS)

    Beecham, J. A.; Engelhard, G. H.

    2007-10-01

    An ecological economic model of trawling is presented to demonstrate the effect of trawling location choice strategy on net input (rate of economic gain of fish caught per time spent, less costs). Fishing location choice is considered to be a dynamic process whereby trawlers choose from among a repertoire of plastic strategies that they modify if their gains fall below a fixed proportion of the mean gains of the fleet as a whole. The distribution of fishing across different areas of a fishery follows an approximate ideal free distribution (IFD), with varying noise due to uncertainty. The least productive areas are not utilised because initial net input never reaches the mean yield of better areas subject to competitive exploitation. In cases where there is a weak temporal autocorrelation between fish stocks in a specific location, a plastic strategy of local translocation between trawls, mixed with longer-range translocation, increases realised input. The trawler can change its translocation strategy in the light of information about recent trawling success compared to its long-term average but, in contrast to predictions of the Marginal Value Theorem (MVT) model, does not know for certain what it will find by moving, so may need to sample new patches. The combination of the two types of translocation mirrored beam-trawling strategies used by the Dutch fleet, and the resultant distribution of trawling effort is confirmed by analysis of the historical effort distribution of British otter trawling fleets in the North Sea. Fisheries exploitation represents an area where dynamic agent-based adaptive models may be a better representation of the economic dynamics of a fleet than classically inspired optimisation models.
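    The ideal free distribution the abstract refers to can be made to emerge from a very small agent rule: boats move to another patch whenever it currently pays better per boat. This sketch is an illustration of the IFD concept, not the paper's model; patch qualities, boat count and the move rule are invented.

```python
import random

def ideal_free_distribution(patch_quality=(9.0, 6.0, 3.0), n_boats=60,
                            rounds=400, seed=7):
    """Each round one random boat compares its per-capita catch with a
    random other patch and moves if that patch would pay better, counting
    itself among the occupants after the move."""
    rng = random.Random(seed)
    n_patches = len(patch_quality)
    loc = [rng.randrange(n_patches) for _ in range(n_boats)]
    for _ in range(rounds):
        counts = [loc.count(p) for p in range(n_patches)]
        b = rng.randrange(n_boats)
        here, there = loc[b], rng.randrange(n_patches)
        pay_here = patch_quality[here] / max(counts[here], 1)
        pay_there = patch_quality[there] / (counts[there] + 1)
        if pay_there > pay_here:
            loc[b] = there
    return [loc.count(p) for p in range(n_patches)]
```

    At equilibrium, occupancy is roughly proportional to patch quality (here about 30/20/10 boats), so per-capita gains equalize across patches; that is the ideal free distribution.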

  10. Adding ecosystem function to agent-based land use models

    PubMed Central

    Yadav, V.; Del Grosso, S.J.; Parton, W.J.; Malanson, G.P.

    2015-01-01

    The objective of this paper is to examine issues in the inclusion of simulations of ecosystem functions in agent-based models of land use decision-making. The reasons for incorporating these simulations include local interests in land fertility and global interests in carbon sequestration. Biogeochemical models are needed in order to calculate such fluxes. The Century model is described with particular attention to the land use choices that it can encompass. When Century is applied to a land use problem the combinatorial choices lead to a potentially unmanageable number of simulation runs. Century is also parameter-intensive. Three ways of including Century output in agent-based models, ranging from separately calculated look-up tables to agents running Century within the simulation, are presented. The latter may be most efficient, but it moves the computing costs to where they are most problematic. Concern for computing costs should not be a roadblock. PMID:26191077
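    The look-up-table coupling option described above can be sketched concisely: run the biogeochemical model offline for each discrete class of inputs, then let agents read fluxes from the table instead of invoking the model each step. The table values below are hypothetical placeholders, not Century output.

```python
# Precomputed, Century-style annual carbon flux per land-use/climate
# class (t C/ha/yr). Values here are purely illustrative.
carbon_flux_table = {
    ("crop",    "dry"): -0.4,
    ("crop",    "wet"): -0.2,
    ("pasture", "dry"):  0.1,
    ("pasture", "wet"):  0.3,
    ("forest",  "dry"):  0.6,
    ("forest",  "wet"):  1.1,
}

def parcel_flux(land_use, climate):
    """Agent-side lookup: O(1) per parcel, no model run at simulation time."""
    return carbon_flux_table[(land_use, climate)]

def landscape_flux(parcels):
    """Aggregate flux over all agent-held parcels (land_use, climate pairs)."""
    return sum(parcel_flux(lu, cl) for lu, cl in parcels)
```

    The trade-off the paper notes is visible here: the table must be computed in advance for every combination of choices, and its size grows combinatorially with the number of discrete classes.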

  11. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long-term durability and reliability. There are several types of fatigue that must be considered in the design, including low-cycle, high-cycle, and combined fatigue for different cyclic loading conditions - for example, mechanical, thermal and erosion loading. The traditional approach to evaluating fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design; however, it is time consuming, costly, and in general needs to be repeated for designs in different operating conditions. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. The main features of this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring and progressive structural fracture, encompassed with probabilistic simulation. The generic features of this approach are to probabilistically telescope local material-point damage all the way up to the structural component, and to probabilistically decompose structural loads and boundary conditions all the way down to the material point. Additional features include a multifactor interaction model that probabilistically describes the evolution of material properties and any changes due to various cyclic loads and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, the advantages, versatility and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the

  12. Learning to Measure Biodiversity: Two Agent-Based Models that Simulate Sampling Methods & Provide Data for Calculating Diversity Indices

    ERIC Educational Resources Information Center

    Jones, Thomas; Laughlin, Thomas

    2009-01-01

    Nothing could be more effective than a wilderness experience to demonstrate the importance of conserving biodiversity. When that is not possible, though, there are computer models with several features that are helpful in understanding how biodiversity is measured. These models are easily used when natural resources, transportation, and time…
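    The diversity indices such sampling models feed into are simple to compute once species counts are in hand. The formulas below (Shannon and Simpson indices) are standard; the sample counts are invented for illustration.

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i), where p_i is the
    proportion of individuals belonging to species i."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

def simpson_index(counts):
    """Simpson's diversity 1 - sum(p_i^2): the probability that two
    randomly drawn individuals belong to different species."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

# A perfectly even sample of four species maximizes H' at ln(4)
even_sample = [25, 25, 25, 25]
```

    Students can compare indices from samples of different evenness: a single-species sample gives H' = 0, while the even four-species sample above gives H' = ln(4) ≈ 1.386.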

  13. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1977-01-01

    In a computer simulation study of earthquakes, a seismically active strike-slip fault is represented by coupled mechanical blocks which are driven by a moving plate and slide on a friction surface. Elastic forces and time-independent friction are used to generate main shock events, while viscoelastic forces and time-dependent friction add aftershock features. The study reveals that the size, length, and time and place of event occurrence are strongly influenced by the magnitude and degree of homogeneity in the elastic, viscous, and friction parameters of the fault region. For example, periodically recurring similar events are observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps are a common feature of simulations employing large variations in the fault parameters. The study also reveals correlations between strain energy release and fault length and average displacement, and between main shock and aftershock displacements.
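    The coupled-block fault picture can be illustrated with a cellular spring-block sketch: blocks load uniformly under plate motion, slip when stress exceeds a friction threshold, and pass part of the stress drop to their neighbors, producing cascading events of varying size. This is a generic Burridge-Knopoff-style toy, not the study's model; all parameters are illustrative.

```python
import random

def spring_block_events(n_blocks=50, drive=0.01, threshold=1.0,
                        coupling=0.4, steps=4000, seed=5):
    """1-D chain of blocks: each step adds tectonic loading; any block at
    or above the friction threshold slips to zero stress and transfers
    half of `coupling` times its stress drop to each neighbor, possibly
    triggering a cascade. Returns the size of each event (blocks slipped)."""
    rng = random.Random(seed)
    stress = [rng.uniform(0, threshold) for _ in range(n_blocks)]
    events = []
    for _ in range(steps):
        stress = [s + drive for s in stress]      # uniform plate loading
        size = 0
        unstable = [i for i, s in enumerate(stress) if s >= threshold]
        while unstable:
            i = unstable.pop()
            if stress[i] < threshold:
                continue                          # already relaxed this event
            drop = stress[i]
            stress[i] = 0.0
            size += 1
            for j in (i - 1, i + 1):              # stress transfer to neighbors
                if 0 <= j < n_blocks:
                    stress[j] += coupling * drop / 2
                    if stress[j] >= threshold:
                        unstable.append(j)
        if size:
            events.append(size)
    return events
```

    Making the thresholds heterogeneous along the chain is the analogue of the fault-parameter variations that produce seismic gaps in the study.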

  14. Integrating the simulation of domestic water demand behaviour to an urban water model using agent based modelling

    NASA Astrophysics Data System (ADS)

    Koutiva, Ifigeneia; Makropoulos, Christos

    2015-04-01

    The urban water system's sustainable evolution requires tools that can analyse and simulate the complete cycle, including both the physical and the cultural environment. One of the main challenges, in this regard, is the design and development of tools that are able to simulate society's water demand behaviour and the way policy measures affect it. The effects of these policy measures are a function of personal opinions that subsequently lead to the formation of people's attitudes, and these attitudes eventually form behaviours. This work presents the design of an ABM tool for addressing the social dimension of the urban water system. The created tool, called the Urban Water Agents' Behaviour (UWAB) model, was implemented using the NetLogo agent programming language. The main aim of the UWAB model is to capture the effects of policies and environmental pressures on the water conservation behaviour of urban households. The model consists of agents representing urban households that are linked to each other, creating a social network that influences the water conservation behaviour of its members. Household agents are influenced as well by policies and environmental pressures, such as drought. The UWAB model simulates behaviour resulting in the evolution of water conservation within an urban population. The final outcome of the model is the evolution of the distribution of different conservation levels (no, low, high) within the selected urban population. In addition, UWAB is implemented in combination with an existing urban water management simulation tool, the Urban Water Optioneering Tool (UWOT), in order to create a modelling platform aiming to facilitate an adaptive approach to water resources management. For the purposes of this proposed modelling platform, UWOT is used in a twofold manner: (1) to simulate domestic water demand evolution and (2) to simulate the response of the water system to the domestic water demand evolution. The main advantage of the UWAB - UWOT model
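
    The social-network influence mechanism described above (households adopting conservation levels under peer and policy pressure) can be illustrated with a toy diffusion sketch. This is not the UWAB implementation, which is written in NetLogo; the ring network, the "copy a neighbour" rule, and all parameter values are assumptions for illustration:

```python
import random

def simulate_conservation(n=100, k=4, steps=500, p_policy=0.05, seed=3):
    """Toy sketch of conservation-behaviour diffusion (not UWAB itself).

    Households on a ring network hold a conservation level in
    {0: none, 1: low, 2: high}. Each step one household copies a random
    network neighbour (social influence), and with probability p_policy
    a policy 'campaign' nudges a random household one level up.
    """
    rng = random.Random(seed)
    level = [rng.choice([0, 0, 0, 1, 2]) for _ in range(n)]
    offsets = [d for d in range(-(k // 2), k // 2 + 1) if d != 0]
    for _ in range(steps):
        i = rng.randrange(n)
        level[i] = level[(i + rng.choice(offsets)) % n]   # imitate neighbour
        if rng.random() < p_policy:                       # policy pressure
            h = rng.randrange(n)
            level[h] = min(2, level[h] + 1)
    # distribution of conservation levels across the population
    return {v: level.count(v) for v in (0, 1, 2)}
```

    The returned distribution over the three levels plays the role of the model's final outcome described in the abstract; coupling it to a demand simulator (as UWAB does with UWOT) would amount to mapping each level to a per-household demand.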

  15. Computer simulation of earthquakes

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1976-01-01

    Two computer simulation models of earthquakes were studied for the dependence of the pattern of events on the model assumptions and input parameters. Both models represent the seismically active region by mechanical blocks which are connected to one another and to a driving plate; the blocks slide on a friction surface. The first model employed elastic forces and time-independent friction to simulate main shock events. The size, length, and time and place of event occurrence were influenced strongly by the magnitude and degree of homogeneity in the elastic and friction parameters of the fault region. Periodically recurring similar events were frequently observed in simulations with near-homogeneous parameters along the fault, whereas seismic gaps were a common feature of simulations employing large variations in the fault parameters. The second model incorporated viscoelastic forces and time-dependent friction to account for aftershock sequences. The periods between aftershock events increased with time, and the aftershock region was confined to that which moved in the main event.

  16. Who's your neighbor? neighbor identification for agent-based modeling.

    SciTech Connect

    Macal, C. M.; Howe, T. R.; Decision and Information Sciences; Univ. of Chicago

    2006-01-01

    Agent-based modeling and simulation, based on the cellular automata paradigm, is an approach to modeling complex systems composed of interacting autonomous agents. Open questions in agent-based simulation focus on scale-up issues encountered in simulating large numbers of agents: specifically, how many agents can be included in a workable agent-based simulation? One of the basic tenets of agent-based modeling and simulation is that agents only interact and exchange locally available information with other agents located in their immediate proximity or neighborhood of the space in which the agents are situated. Generally, an agent's set of neighbors changes rapidly as a simulation proceeds through time and as the agents move through space. Depending on the topology defined for agent interactions, proximity may be defined by spatial distance for continuous space, adjacency for grid cells (as in cellular automata), or by connectivity in social networks. Identifying an agent's neighbors is a particularly time-consuming computational task and can dominate the computational effort in a simulation. Two challenges in agent simulation are (1) efficiently representing an agent's neighborhood and the neighbors in it and (2) efficiently identifying an agent's neighbors at any time in the simulation. These problems are addressed differently for different agent interaction topologies. While efficient approaches have been identified for agent neighborhood representation and neighbor identification for agents on a lattice with general neighborhood configurations, other techniques must be used when agents are able to move freely in space. Techniques for the analysis and representation of spatial data are applicable to the agent neighbor identification problem. This paper extends agent neighborhood simulation techniques from the lattice topology to continuous space, specifically R².
    Algorithms based on hierarchical (quad trees) or non-hierarchical data structures (grid cells) are
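
    The non-hierarchical grid-cell (cell-list) technique mentioned above can be sketched as follows; the function name and parameters are illustrative, not the paper's code. Hashing each point into a square cell of side equal to the interaction radius guarantees that all neighbours lie in the 3x3 block of cells around a point, so most of the N² pairwise distance checks are skipped:

```python
import math
from collections import defaultdict

def grid_neighbors(points, radius):
    """Cell-list neighbour search in R^2 (illustrative sketch).

    Each point is hashed into a square cell of side `radius`; a point's
    neighbours within `radius` can then only lie in the 3x3 block of
    cells centred on its own cell.
    """
    def cell(p):
        return (math.floor(p[0] / radius), math.floor(p[1] / radius))

    grid = defaultdict(list)
    for i, p in enumerate(points):
        grid[cell(p)].append(i)

    r2 = radius * radius
    neighbors = {}
    for i, p in enumerate(points):
        cx, cy = cell(p)
        found = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for j in grid.get((cx + dx, cy + dy), ()):
                    if j == i:
                        continue
                    q = points[j]
                    if (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= r2:
                        found.append(j)
        neighbors[i] = sorted(found)
    return neighbors
```

    Rebuilding the grid each time step costs O(N), so the whole neighbour query stays near-linear in the number of agents when the density per cell is bounded; a quad tree trades this simplicity for better behavior under highly non-uniform agent distributions.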

  17. Use of an agent-based simulation model to evaluate a mobile-based system for supporting emergency evacuation decision making.

    PubMed

    Tian, Yu; Zhou, Tian-Shu; Yao, Qin; Zhang, Mao; Li, Jing-Song

    2014-12-01

    Recently, mass casualty incidents (MCIs) have been occurring frequently and have gained international attention. There is an urgent need for scientifically proven and effective emergency responses to MCIs, particularly as the severity of incidents is continuously increasing. The emergency response to MCIs is a multi-dimensional and multi-participant dynamic process that changes in real-time. The evacuation decisions that assign casualties to different hospitals in a region are very important and impact both the results of emergency treatment and the efficiency of medical resource utilization. Previously, decisions related to casualty evacuation were made by an incident commander with emergency experience and in accordance with macro emergency guidelines. There are few decision-supporting tools available to reduce the difficulty and psychological pressure associated with the evacuation decisions an incident commander must make. In this study, we have designed a mobile-based system to collect medical and temporal data produced during an emergency response to an MCI. Using this information, our system's decision-making model can provide personal evacuation suggestions that improve the overall outcome of an emergency response. The effectiveness of our system in reducing overall mortality has been validated by an agent-based simulation model established to simulate an emergency response to an MCI. PMID:25354665

  18. Understanding Islamist political violence through computational social simulation

    SciTech Connect

    Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G; Eberhardt, Ariane; Stradling, Seth G

    2008-01-01

    Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.

  19. An agent based model of genotype editing

    SciTech Connect

    Rocha, L. M.; Huang, C. F.

    2004-01-01

    This paper presents our investigation of an agent-based model of Genotype Editing. This model is based on several characteristics that are gleaned from the RNA editing system as observed in several organisms. The incorporation of editing mechanisms in an evolutionary agent-based model provides a means for evolving agents with heterogeneous post-transcriptional processes. The study of this agent-based genotype-editing model has shed some light on the evolutionary implications of RNA editing, as well as established an advantageous evolutionary computation algorithm for machine learning. We expect that our proposed model may both facilitate determining the evolutionary role of RNA editing in biology and advance the current state of research in agent-based optimization.

  20. Intelligence Assessment with Computer Simulations

    ERIC Educational Resources Information Center

    Kroner, S.; Plass, J.L.; Leutner, D.

    2005-01-01

    It has been suggested that computer simulations may be used for intelligence assessment. This study investigates what relationships exist between intelligence and computer-simulated tasks that mimic real-world problem-solving behavior, and discusses design requirements that simulations have to meet in order to be suitable for intelligence…

  1. Modeling the 2014 Ebola Virus Epidemic – Agent-Based Simulations, Temporal Analysis and Future Predictions for Liberia and Sierra Leone

    PubMed Central

    Siettos, Constantinos; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2015-01-01

    We developed an agent-based model to investigate the epidemic dynamics of Ebola virus disease (EVD) in Liberia and Sierra Leone from May 27 to December 21, 2014. The dynamics of the agent-based simulator evolve on small-world transmission networks of sizes equal to the population of each country, with adjustable densities to account for the effects of public health intervention policies and individual behavioral responses to the evolving epidemic. Based on time series of the official case counts from the World Health Organization (WHO), we provide estimates for key epidemiological variables by employing the so-called Equation-Free approach. The underlying transmission networks were characterized by rather random structures in the two countries with densities decreasing by ~19% from the early (May 27-early August) to the last period (mid October-December 21). Our estimates for the values of key epidemiological variables, such as the mean time to death, recovery and the case fatality rate, are very close to the ones reported by the WHO Ebola response team during the early period of the epidemic (until September 14) that were calculated based on clinical data. Specifically, regarding the effective reproductive number Re, our analysis suggests that until mid October, Re was above 2.3 in both countries; from mid October to December 21, Re dropped well below unity in Liberia, indicating a saturation of the epidemic, while in Sierra Leone it was around 1.9, indicating an ongoing epidemic. Accordingly, a ten-week projection from December 21 estimated that the epidemic will fade out in Liberia in early March; in contrast, our results flashed a note of caution for Sierra Leone since the cumulative number of cases could reach as high as 18,000, and the number of deaths might exceed 5,000, by early March 2015. 
However, by processing the reported data of the very last period (December 21, 2014-January 18, 2015), we obtained more optimistic estimates indicative of a remission of
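
    The small-world transmission-network idea underlying the simulator above can be sketched with a standard-library-only Watts-Strogatz-style lattice and a discrete-time SIR process. This is a toy stand-in for the paper's agent-based simulator, not a reproduction of it; the network sizes, transmission and recovery rates below are hypothetical:

```python
import random

def small_world(n, k, p, rng):
    """Watts-Strogatz-style small world: ring lattice with k nearest
    neighbours per node, each 'forward' edge rewired with probability p."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    for i in range(n):
        for j in range(1, k // 2 + 1):
            old = (i + j) % n
            if rng.random() < p and old in adj[i]:
                new = rng.randrange(n)
                if new != i and new not in adj[i]:
                    adj[i].discard(old); adj[old].discard(i)
                    adj[i].add(new); adj[new].add(i)
    return adj

def run_sir(adj, beta, gamma, seeds, steps, rng):
    """Discrete-time SIR on the network: each step an infected node
    transmits to each susceptible neighbour w.p. beta, then recovers
    w.p. gamma. Returns the cumulative ever-infected count per step."""
    S, I, R = 0, 1, 2
    state = {i: S for i in adj}
    for s in seeds:
        state[s] = I
    history = []
    for _ in range(steps):
        nxt = dict(state)
        for i, st in state.items():
            if st != I:
                continue
            for j in adj[i]:
                if state[j] == S and rng.random() < beta:
                    nxt[j] = I
            if rng.random() < gamma:
                nxt[i] = R
        state = nxt
        history.append(sum(1 for v in state.values() if v != S))
    return history
```

    Lowering the rewiring probability or thinning edges plays the role of the density reductions the paper attributes to interventions and behavioral change; with fewer long-range links the epidemic front slows markedly.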

  2. Agent-based models in translational systems biology

    PubMed Central

    An, Gary; Mi, Qi; Dutta-Moscato, Joyeeta; Vodovotz, Yoram

    2013-01-01

    Effective translational methodologies for knowledge representation are needed in order to make strides against the constellation of diseases that affect the world today. These diseases are defined by their mechanistic complexity, redundancy, and nonlinearity. Translational systems biology aims to harness the power of computational simulation to streamline drug/device design, simulate clinical trials, and eventually to predict the effects of drugs on individuals. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggests that this modeling framework is well suited for translational systems biology. This review describes agent-based modeling and gives examples of its translational applications in the context of acute inflammation and wound healing. PMID:20835989

  3. Using Agent Base Models to Optimize Large Scale Network for Large System Inventories

    NASA Technical Reports Server (NTRS)

    Shameldin, Ramez Ahmed; Bowling, Shannon R.

    2010-01-01

    The aim of this paper is to use Agent Base Models (ABM) to optimize large-scale network handling capabilities for large system inventories and to implement strategies for the purpose of reducing capital expenses. The models used in this paper rely on computational algorithms and procedural implementations developed in Matlab to simulate agent-based models in a principal programming language and mathematical theory, using clusters that provide high-performance parallel computation to run the program. In both cases, a model is defined as a compilation of a set of structures and processes assumed to underlie the behavior of a network system.

  4. Agent-Based Computational Modeling of Cell Culture: Understanding Dosimetry In Vitro as Part of In Vitro to In Vivo Extrapolation

    EPA Science Inventory

    Quantitative characterization of cellular dose in vitro is needed for alignment of doses in vitro and in vivo. We used the agent-based software, CompuCell3D (CC3D), to provide a stochastic description of cell growth in culture. The model was configured so that isolated cells assu...

  5. Investigating biocomplexity through the agent-based paradigm

    PubMed Central

    Kaul, Himanshu

    2015-01-01

    Capturing the dynamism that pervades biological systems requires a computational approach that can accommodate both the continuous features of the system environment as well as the flexible and heterogeneous nature of component interactions. This presents a serious challenge for the more traditional mathematical approaches that assume component homogeneity to relate system observables using mathematical equations. While the homogeneity condition does not lead to loss of accuracy while simulating various continua, it fails to offer detailed solutions when applied to systems with dynamically interacting heterogeneous components. As the functionality and architecture of most biological systems is a product of multi-faceted individual interactions at the sub-system level, continuum models rarely offer much beyond qualitative similarity. Agent-based modelling is a class of algorithmic computational approaches that rely on interactions between Turing-complete finite-state machines—or agents—to simulate, from the bottom-up, macroscopic properties of a system. In recognizing the heterogeneity condition, they offer suitable ontologies to the system components being modelled, thereby succeeding where their continuum counterparts tend to struggle. Furthermore, being inherently hierarchical, they are quite amenable to coupling with other computational paradigms. The integration of any agent-based framework with continuum models is arguably the most elegant and precise way of representing biological systems. Although in its nascence, agent-based modelling has been utilized to model biological complexity across a broad range of biological scales (from cells to societies). In this article, we explore the reasons that make agent-based modelling the most precise approach to model biological systems that tend to be non-linear and complex. PMID:24227161

  6. Massively parallel quantum computer simulator

    NASA Astrophysics Data System (ADS)

    De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.

    2007-01-01

    We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray X1E, an SGI Altix 3700, and clusters of PCs running Windows XP. We study the performance of the software by simulating quantum computers containing up to 36 qubits, using up to 4096 processors and up to 1 TB of memory. Our results demonstrate that the simulator exhibits nearly ideal scaling as a function of the number of processors and suggest that the simulation software described in this paper may also serve as a benchmark for testing high-end parallel computers.
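
    The core kernel of any statevector quantum-computer simulator, applying a single-qubit gate by combining pairs of amplitudes whose basis-state indices differ only in the target bit, can be sketched sequentially as below. This is a minimal illustration, not the authors' parallel software; the memory cost of 2^n amplitudes is exactly why simulations of this kind at 36 qubits need terabyte-scale distributed memory:

```python
import math

def apply_gate(state, gate, target, n_qubits):
    """Apply a 2x2 single-qubit gate to a 2**n_qubits statevector.

    Amplitudes are paired up by index: i (target bit 0) with i | bit
    (target bit 1), and each pair is mixed by the gate matrix.
    """
    out = list(state)
    bit = 1 << target
    for i in range(1 << n_qubits):
        if not i & bit:                       # i has target bit 0
            a, b = state[i], state[i | bit]
            out[i] = gate[0][0] * a + gate[0][1] * b
            out[i | bit] = gate[1][0] * a + gate[1][1] * b
    return out

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]
```

    A parallel simulator distributes the amplitude array across nodes; gates on low-order qubits stay node-local, while gates on high-order qubits force the pairwise amplitude exchange across the network that dominates communication cost.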

  7. Software simulator for multiple computer simulation system

    NASA Technical Reports Server (NTRS)

    Ogrady, E. P.

    1983-01-01

    A description is given of the structure and use of a computer program that simulates the operation of a parallel processor simulation system. The program is part of an investigation to determine algorithms that are suitable for simulating continuous systems on a parallel processor configuration. The simulator is designed to accurately simulate the problem-solving phase of a simulation study. Care has been taken to ensure the integrity and correctness of data exchanges and to correctly sequence periods of computation and periods of data exchange. It is pointed out that the functions performed during a problem-setup phase or a reset phase are not simulated. In particular, there is no attempt to simulate the downloading process that loads object code into the local, transfer, and mapping memories of processing elements or the memories of the run control processor and the system control processor. The main program of the simulator carries out some problem-setup functions of the system control processor in that it requests the user to enter values for simulation system parameters and problem parameters. The method by which these values are transferred to the other processors, however, is not simulated.

  8. Computer simulation of space charge

    NASA Astrophysics Data System (ADS)

    Yu, K. W.; Chung, W. K.; Mak, S. S.

    1991-05-01

    Using the particle-mesh (PM) method, a one-dimensional simulation of the well-known Langmuir-Child's law is performed on an INTEL 80386-based personal computer system. The program is coded in turbo basic (trademark of Borland International, Inc.). The numerical results obtained were in excellent agreement with theoretical predictions and the computational time required is quite modest. This simulation exercise demonstrates that some simple computer simulation using particles may be implemented successfully on PC's that are available today, and hopefully this will provide the necessary incentives for newcomers to the field who wish to acquire a flavor of the elementary aspects of the practice.

  9. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572

  10. Computer Simulation of Mutagenesis.

    ERIC Educational Resources Information Center

    North, J. C.; Dent, M. T.

    1978-01-01

    A FORTRAN program is described which simulates point-substitution mutations in the DNA strands of typical organisms. Its objective is to help students to understand the significance and structure of the genetic code, and the mechanisms and effect of mutagenesis. (Author/BB)

  11. Computer Simulation and Library Management.

    ERIC Educational Resources Information Center

    Main, Linda

    1992-01-01

    Reviews the literature on computer simulation modeling for library management and examines whether simulation is underutilized because the models are too complex and mathematical. Other problems with implementation are considered, including incomplete problem definition, lack of a conceptual framework, system constraints, lack of interaction…

  12. Plasma physics via computer simulation

    SciTech Connect

    Birdsall, C.K.; Langdon, A.B.

    1985-01-01

    This book describes the computerized simulation of plasma kinetics. Topics considered include why attempting to do plasma physics via computer simulation using particles makes good physical sense; overall view of a one-dimensional electrostatic program; a one-dimensional electrostatic program; introduction to the numerical methods used; a 1d electromagnetic program; projects for EM1; effects of the spatial grid; effects of the finite time step; energy-conserving simulation models; multipole models; kinetic theory for fluctuations and noise; collisions; statistical mechanics of a sheet plasma; electrostatic programs in two and three dimensions; electromagnetic programs in 2D and 3D; design of computer experiments; and the choice of parameters.

  13. Composite Erosion by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    Composite degradation is evaluated by computational simulation when the erosion degradation occurs on a ply-by-ply basis and the degrading medium (device) is normal to the ply. The computational simulation is performed by a multi-factor interaction model and by an available multi-scale, multi-physics computer code. The erosion process degrades both the fiber and the matrix simultaneously in the same slice (ply). Both the fiber volume ratio and the matrix volume ratio approach zero, while the void volume ratio increases as the ply degrades. The multi-factor interaction model simulates the erosion degradation, provided that the exponents and factor ratios are selected judiciously. Results obtained by the computational composite mechanics show that most composite characterization properties degrade monotonically and approach "zero" as the ply degrades completely.

  14. Agent-based modeling of urban land-use change

    NASA Astrophysics Data System (ADS)

    Li, Xinyan; Li, Deren

    2005-10-01

    ABM (Agent-Based Modeling) is a newly developed computer simulation method whose characteristics include being active, dynamic, and operational. Urban land-use change has been a focal problem all over the world, especially for developing countries. We use ABM to model urban land-use changes. By studying the mechanism of urban land-use evolution, we set out our modeling approach, and an urban land-use change model is built, based primarily on the RePast software and a GIS spatial database.

  15. Interactive agent based modeling of public health decision-making.

    PubMed

    Parks, Amanda L; Walker, Brett; Pettey, Warren; Benuzillo, Jose; Gesteland, Per; Grant, Juliana; Koopman, James; Drews, Frank; Samore, Matthew

    2009-01-01

    Agent-based models have yielded important insights regarding the transmission dynamics of communicable diseases. To better understand how these models can be used to study decision making of public health officials, we developed a computer program that linked an agent-based model of pertussis with an agent-based model of public health management. The program, which we call the Public Health Interactive Model & simulation (PHIMs) encompassed the reporting of cases to public health, case investigation, and public health response. The user directly interacted with the model in the role of the public health decision-maker. In this paper we describe the design of our model, and present the results of a pilot study to assess its usability and potential for future development. Affinity for specific tools was demonstrated. Participants ranked the program high in usability and considered it useful for training. Our ultimate goal is to achieve better public health decisions and outcomes through use of public health decision support tools. PMID:20351907

  16. Computer Simulation of Aircraft Aerodynamics

    NASA Technical Reports Server (NTRS)

    Inouye, Mamoru

    1989-01-01

    The role of Ames Research Center in conducting basic aerodynamics research through computer simulations is described. The computer facilities, including supercomputers and peripheral equipment that represent the state of the art, are described. The methodology of computational fluid dynamics is explained briefly. Fundamental studies of turbulence and transition are being pursued to understand these phenomena and to develop models that can be used in the solution of the Reynolds-averaged Navier-Stokes equations. Four applications of computer simulations for aerodynamics problems are described: subsonic flow around a fuselage at high angle of attack, subsonic flow through a turbine stator-rotor stage, transonic flow around a flexible swept wing, and transonic flow around a wing-body configuration that includes an inlet and a tail.

  17. An Agent-Based Cockpit Task Management System

    NASA Technical Reports Server (NTRS)

    Funk, Ken

    1997-01-01

    An agent-based program to facilitate Cockpit Task Management (CTM) in commercial transport aircraft is developed and evaluated. The agent-based program called the AgendaManager (AMgr) is described and evaluated in a part-task simulator study using airline pilots.

  18. Agent-Based Modeling of Cancer Stem Cell Driven Solid Tumor Growth.

    PubMed

    Poleszczuk, Jan; Macklin, Paul; Enderling, Heiko

    2016-01-01

    Computational modeling of tumor growth has become an invaluable tool to simulate complex cell-cell interactions and emerging population-level dynamics. Agent-based models are commonly used to describe the behavior and interaction of individual cells in different environments. Behavioral rules can be informed and calibrated by in vitro assays, and emerging population-level dynamics may be validated with both in vitro and in vivo experiments. Here, we describe the design and implementation of a lattice-based agent-based model of cancer stem cell driven tumor growth. PMID:27044046
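
    The lattice-based stem-cell hierarchy described above can be sketched in a few dozen lines. This is a toy sketch, not the authors' model: the grid size, division probabilities, and the rule that progenitors simply persist after exhausting their divisions are all illustrative assumptions:

```python
import random

def grow_tumor(steps=30, size=50, p_symmetric=0.1, max_prog_divs=5,
               p_divide=0.5, seed=1):
    """Minimal lattice ABM of cancer-stem-cell-driven growth (a sketch).

    A stem cell divides without limit; with probability p_symmetric the
    daughter is another stem cell, otherwise it is a progenitor that can
    divide only max_prog_divs more times. Cells occupy lattice sites and
    divide only into a free von Neumann neighbour site.
    """
    rng = random.Random(seed)
    STEM = -1
    # site -> STEM, or remaining divisions for a progenitor
    cells = {(size // 2, size // 2): STEM}
    for _ in range(steps):
        for site, kind in list(cells.items()):
            if rng.random() > p_divide:
                continue                       # no division attempt this step
            x, y = site
            free = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if (x + dx, y + dy) not in cells
                    and 0 <= x + dx < size and 0 <= y + dy < size]
            if not free:
                continue                       # contact-inhibited: no room
            target = rng.choice(free)
            if kind == STEM:
                if rng.random() < p_symmetric:
                    cells[target] = STEM       # symmetric division
                else:
                    cells[target] = max_prog_divs
            elif kind > 0:
                cells[site] = kind - 1
                cells[target] = kind - 1
    return cells
```

    Because only cells with a free neighbour can divide, growth is driven from the rim, and the long-run population is governed by how often stem cells reach the rim - the kind of emergent population-level dynamic the abstract refers to.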

  19. Taxis through Computer Simulation Programs.

    ERIC Educational Resources Information Center

    Park, David

    1983-01-01

    Describes a sequence of five computer programs (listings for Apple II available from author) on tactic responses (oriented movement of a cell, cell group, or whole organism in response to stimuli). The simulation programs are useful in helping students examine mechanisms at work in real organisms. (JN)

  20. Computer Simulation of Diffraction Patterns.

    ERIC Educational Resources Information Center

    Dodd, N. A.

    1983-01-01

    Describes an Apple computer program (listing available from author) which simulates Fraunhofer and Fresnel diffraction using vector addition techniques (vector chaining) and allows the user to experiment with different shaped multiple apertures. Graphics output includes vector resultants, phase difference, diffraction patterns, and the Cornu spiral…
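
    The vector-chaining idea behind such programs can be sketched numerically: sum unit phasors from many source elements across a single slit and take the squared length of the resultant. The function name and parameter choices below are illustrative assumptions, not the original Apple program:

```python
import cmath
import math

def slit_intensity(theta, slit_width, wavelength, n_points=500):
    """Fraunhofer single-slit pattern by phasor (vector) chaining.

    Sums unit phasors from n_points equally spaced source elements
    across the slit, then returns the squared magnitude of the
    resultant, normalised so the intensity at theta = 0 is 1.
    """
    k = 2 * math.pi / wavelength
    total = 0j
    for m in range(n_points):
        # midpoint of the m-th source element, centred on the slit
        x = (m + 0.5) / n_points * slit_width - slit_width / 2
        total += cmath.exp(1j * k * x * math.sin(theta))
    return abs(total / n_points) ** 2
```

    For a 100 µm slit at 500 nm the chained-phasor result matches the analytic (sin a / a)² pattern, with a = π w sin(θ) / λ, to well under a part in a thousand; the same chaining with a quadratic phase term gives Fresnel diffraction and traces out the Cornu spiral.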

  1. Agent-Based Modeling in Systems Pharmacology.

    PubMed

    Cosgrove, J; Butler, J; Alden, K; Read, M; Kumar, V; Cucurull-Sanchez, L; Timmis, J; Coles, M

    2015-11-01

    Modeling and simulation (M&S) techniques provide a platform for knowledge integration and hypothesis testing to gain insights into biological systems that would not be possible a priori. Agent-based modeling (ABM) is an M&S technique that focuses on describing individual components rather than homogenous populations. This tutorial introduces ABM to systems pharmacologists, using relevant case studies to highlight how ABM-specific strengths have yielded success in the area of preclinical mechanistic modeling. PMID:26783498

  2. SPARK: A Framework for Multi-Scale Agent-Based Biomedical Modeling

    PubMed Central

    Solovyev, Alexey; Mikheev, Maxim; Zhou, Leming; Dutta-Moscato, Joyeeta; Ziraldo, Cordelia; An, Gary; Vodovotz, Yoram; Mi, Qi

    2013-01-01

    Multi-scale modeling of complex biological systems remains a central challenge in the systems biology community. A method of dynamic knowledge representation known as agent-based modeling enables the study of higher level behavior emerging from discrete events performed by individual components. With the advancement of computer technology, agent-based modeling has emerged as an innovative technique to model the complexities of systems biology. In this work, the authors describe SPARK (Simple Platform for Agent-based Representation of Knowledge), a framework for agent-based modeling specifically designed for systems-level biomedical model development. SPARK is a stand-alone application written in Java. It provides a user-friendly interface, and a simple programming language for developing Agent-Based Models (ABMs). SPARK has the following features specialized for modeling biomedical systems: 1) continuous space that can simulate real physical space; 2) flexible agent size and shape that can represent the relative proportions of various cell types; 3) multiple spaces that can concurrently simulate and visualize multiple scales in biomedical models; 4) a convenient graphical user interface. Existing ABMs of diabetic foot ulcers and acute inflammation were implemented in SPARK. Models of identical complexity were run in both NetLogo and SPARK; the SPARK-based models ran two to three times faster. PMID:24163721

  3. Discovery of novel anti-HIV-1 agents based on a broadly neutralizing antibody against the envelope gp120 V3 loop: a computational study.

    PubMed

    Andrianov, Alexander M; Kashyn, Ivan A; Tuzikov, Alexander V

    2014-12-01

    A computer-aided search for novel anti-HIV-1 agents able to mimic the pharmacophore properties of broadly neutralizing antibody (bNAb) 3074 was carried out based on the analysis of X-ray complexes of this antibody Fab with the MN, UR29, and VI191 peptides from the V3 loop of the HIV envelope protein gp120. Using these empirical data, peptidomimetic candidates of bNAb 3074 were identified by a public, web-oriented virtual screening platform (pepMMsMIMIC) and models of these candidates bound to the above V3 peptides were generated by molecular docking. The docking calculations identified four molecules exhibiting a high affinity to all of the V3 peptides. These molecules were selected as the most probable peptidomimetics of bNAb 3074. Finally, the stability of the complexes of these molecules with the MN, UR29, and VI191 V3 peptides was estimated by molecular dynamics and free energy simulations. Specific binding to the V3 loop was accomplished primarily by π-π interactions between the aromatic rings of the peptidomimetics and the conserved Phe-20 and/or Tyr-21 of the V3 immunogenic crown. In a mechanism similar to that of bNAb 3074, these compounds were found to block the tip of the V3 loop forming its invariant structural motif that contains residues critical for cell tropism. Based on these findings, the compounds selected are considered as promising basic structures for the rational design of novel, potent, and broad-spectrum anti-HIV-1 therapeutics. PMID:24251545

  4. Agent-based computational model of the prevalence of gonococcal infections after the implementation of HIV pre-exposure prophylaxis guidelines.

    PubMed

    Escobar, Erik; Durgham, Ryan; Dammann, Olaf; Stopka, Thomas J

    2015-01-01

    Recently, the first comprehensive guidelines were published for pre-exposure prophylaxis (PrEP) for the prevention of HIV infection in populations with substantial risk of infection. Guidelines include a daily regimen of emtricitabine/tenofovir disoproxil fumarate (TDF/FTC) as well as condom usage during sexual activity. The relationship between the TDF/FTC intake regimen and condom usage is not yet fully understood. If men who have sex with men (MSM) engage in high-risk sexual activities without using condoms when prescribed TDF/FTC, they might be at an increased risk for other sexually transmitted diseases (STDs). Our study focuses on the possible occurrence of behavioral changes among MSM in the United States over time with regard to condom usage. In particular, we were interested in creating a model of how increased uptake of TDF/FTC might cause a decline in condom usage, causing significant increases in non-HIV STD incidence, using gonococcal infection incidence as a biological endpoint. We used the agent-based modeling software NetLogo, building upon an existing model of HIV infection. We found no significant evidence for increased gonorrhea prevalence due to increased PrEP usage at any level of sample-wide usage, with a range of 0-90% PrEP usage. However, we did find significant evidence for decreased prevalence of HIV, with a maximal effect being reached when 5% to 10% of the MSM population used PrEP. Our findings appear to indicate that attitudes of aversion, within the medical community, toward the promotion of PrEP due to the potential risk of increased STD transmission are unfounded. PMID:26834937

  5. Computer simulation of martensitic transformations

    SciTech Connect

    Xu, Ping

    1993-11-01

    The characteristics of martensitic transformations in solids are largely determined by the elastic strain that develops as martensite particles grow and interact. To study the development of microstructure, a finite-element computer simulation model was constructed to mimic the transformation process. The transformation is athermal and simulated at each incremental step by transforming the cell which maximizes the decrease in the free energy. To determine the free energy change, the elastic energy developed during martensite growth is calculated from the theory of linear elasticity for elastically homogeneous media, and updated as the transformation proceeds.

  6. Temperature estimators in computer simulation

    NASA Astrophysics Data System (ADS)

    Jara, César; González-Cataldo, Felipe; Davis, Sergio; Gutiérrez, Gonzalo

    2016-05-01

    Temperature is a key physical quantity that is used to describe equilibrium between two bodies in thermal contact. In computer simulations, the temperature is usually estimated by means of the equipartition theorem, as an average over the kinetic energy. However, recent studies have shown that the temperature can also be estimated using only the particle positions, an approach known as the configurational temperature. Through classical molecular dynamics simulations of a 108-argon-atom system, we compare the performance of four different temperature estimators: the usual kinetic temperature and three configurational temperatures. Our results show that the different estimators converge to the same value, but their fluctuations differ.
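The two estimator families mentioned in this abstract can be sketched compactly. The snippet below is an illustrative reconstruction (not the authors' code), using the equipartition formula T = 2⟨K⟩/(3NkB) and one common configurational form T_conf = ⟨|F|²⟩/(kB⟨∇²U⟩):

```python
import numpy as np

kB = 1.380649e-23  # Boltzmann constant, J/K

def kinetic_temperature(masses, velocities):
    """Equipartition estimator: T = 2<K> / (3 N kB).

    masses: shape (N,), velocities: shape (N, 3), SI units."""
    kinetic = 0.5 * np.sum(masses[:, None] * velocities**2)
    n = len(masses)
    return 2.0 * kinetic / (3.0 * n * kB)

def configurational_temperature(forces, laplacian_U):
    """One common configurational estimator,
    T_conf = <|F|^2> / (kB * <laplacian U>),
    which uses only position-dependent quantities (forces)."""
    return np.sum(forces**2) / (kB * laplacian_U)
```

For a harmonic potential, where forces and the Laplacian are exact, both estimators recover the thermostat temperature to within their expected statistical fluctuations.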

  7. Computer Simulations of Space Plasmas

    NASA Astrophysics Data System (ADS)

    Goertz, C. K.

    Even a superficial scanning of the latest issues of the Journal of Geophysical Research reveals that numerical simulation of space plasma processes is an active and growing field. The complexity and sophistication of numerically produced “data” rivals that of the real stuff. Sometimes numerical results need interpretation in terms of a simple “theory,” very much as the results of real experiments and observations do. Numerical simulation has indeed become a third independent tool of space physics, somewhere between observations and analytic theory. There is thus a strong need for textbooks and monographs that report the latest techniques and results in an easily accessible form. This book is an attempt to satisfy this need. The editors want it not only to be “proceedings of selected lectures (given) at the first ISSS (International School of Space Simulations in Kyoto, Japan, November 1-2, 1982) but rather…a form of textbook of computer simulations of space plasmas.” This is, of course, a difficult task when many authors are involved. Unavoidable redundancies and differences in notation may confuse the beginner. Some important questions, like numerical stability, are not discussed in sufficient detail. The recent book by C.K. Birdsall and A.B. Langdon (Plasma Physics via Computer Simulations, McGraw-Hill, New York, 1985) is more complete and detailed and seems more suitable as a textbook for simulations. Nevertheless, this book is useful to the beginner and the specialist because it contains not only descriptions of various numerical techniques but also many applications of simulations to space physics phenomena.

  8. Biomes computed from simulated climatologies

    SciTech Connect

    Claussen, M.; Esch, M.

    1994-01-01

    The biome model of Prentice et al. is used to predict global patterns of potential natural plant formations, or biomes, from climatologies simulated by ECHAM, a model used for climate simulations at the Max-Planck-Institut fuer Meteorologie. This study was undertaken in order to show the advantage of this biome model in diagnosing the performance of a climate model and assessing effects of past and future climate changes predicted by a climate model. Good overall agreement is found between global patterns of biomes computed from observed and simulated data of present climate. But there are also major discrepancies indicated by a difference in biomes in Australia, in the Kalahari Desert, and in the Middle West of North America. These discrepancies can be traced back to deficiencies in simulated rainfall as well as summer or winter temperatures. Global patterns of biomes computed from an ice age simulation reveal that North America, Europe, and Siberia should have been covered largely by tundra and taiga, whereas only small differences are seen for the tropical rain forests. A potential northeast shift of biomes is expected from a simulation with enhanced CO{sub 2} concentration according to the IPCC Scenario A. Little change is seen in the tropical rain forest and the Sahara. Since the biome model used is not capable of predicting changes in vegetation patterns due to a rapid climate change, the latter simulation has to be taken as a prediction of changes in conditions favourable for the existence of certain biomes, not as a prediction of a future distribution of biomes. 15 refs., 8 figs., 2 tabs.

  9. Computer simulation of nonequilibrium processes

    SciTech Connect

    Wallace, D.C.

    1985-07-01

    The underlying concepts of nonequilibrium statistical mechanics, and of irreversible thermodynamics, will be described. The question at hand is then: how are these concepts to be realized in computer simulations of many-particle systems? The answer will be given for dissipative deformation processes in solids, on three hierarchical levels: heterogeneous plastic flow, dislocation dynamics, and molecular dynamics. Application to the shock process will be discussed.

  10. Agent-Based Literacy Theory

    ERIC Educational Resources Information Center

    McEneaney, John E.

    2006-01-01

    The purpose of this theoretical essay is to explore the limits of traditional conceptualizations of reader and text and to propose a more general theory based on the concept of a literacy agent. The proposed theoretical perspective subsumes concepts from traditional theory and aims to account for literacy online. The agent-based literacy theory…

  11. Inversion based on computational simulations

    SciTech Connect

    Hanson, K.M.; Cunningham, G.S.; Saquib, S.S.

    1998-09-01

    A standard approach to solving inversion problems that involve many parameters uses gradient-based optimization to find the parameters that best match the data. The authors discuss enabling techniques that facilitate application of this approach to large-scale computational simulations, which are the only way to investigate many complex physical phenomena. Such simulations may not seem to lend themselves to calculation of the gradient with respect to numerous parameters. However, adjoint differentiation allows one to efficiently compute the gradient of an objective function with respect to all the variables of a simulation. When combined with advanced gradient-based optimization algorithms, adjoint differentiation permits one to solve very large problems of optimization or parameter estimation. These techniques will be illustrated through the simulation of the time-dependent diffusion of infrared light through tissue, which has been used to perform optical tomography. The techniques discussed have a wide range of applicability to modeling including the optimization of models to achieve a desired design goal.
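The adjoint-differentiation idea described here can be illustrated with a toy model (not the authors' optical-tomography code): an objective is differentiated through an explicit 1D periodic diffusion simulation. Because the periodic update matrix is symmetric in this toy, the adjoint sweep simply reapplies the forward stencil, and the gradient with respect to every initial-condition parameter costs one extra sweep regardless of how many parameters there are:

```python
import numpy as np

def step(u, alpha):
    """One explicit diffusion step on a periodic 1D grid (symmetric update A)."""
    return u + alpha * (np.roll(u, 1) - 2.0 * u + np.roll(u, -1))

def objective_and_gradient(u0, data, alpha, nsteps):
    """J = 0.5 * ||u_T - data||^2 after nsteps of diffusion.

    The gradient is dJ/du0 = (A^T)^nsteps (u_T - data); since the
    periodic stencil is symmetric (A^T = A), the adjoint sweep
    reuses step()."""
    u = u0
    for _ in range(nsteps):
        u = step(u, alpha)
    residual = u - data
    J = 0.5 * np.sum(residual**2)
    lam = residual            # adjoint "terminal condition" dJ/du_T
    for _ in range(nsteps):
        lam = step(lam, alpha)  # adjoint sweep
    return J, lam
```

A finite-difference probe of any single component agrees with the adjoint gradient, which is the point: one adjoint sweep yields all gradient components at once.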

  12. An agent-based microsimulation of critical infrastructure systems

    SciTech Connect

    BARTON,DIANNE C.; STAMBER,KEVIN L.

    2000-03-29

    US infrastructures provide essential services that support economic prosperity and quality of life. Today, the latest threat to these infrastructures is the increasing complexity and interconnectedness of the system. On balance, added connectivity will improve economic efficiency; however, increased coupling could also result in situations where a disturbance in an isolated infrastructure unexpectedly cascades across diverse infrastructures. An understanding of the behavior of complex systems can be critical to understanding and predicting infrastructure responses to unexpected perturbation. Sandia National Laboratories has developed an agent-based model of critical US infrastructures using time-dependent Monte Carlo methods and a genetic algorithm learning classifier system to control decision making. The model is currently under development and contains agents that represent several areas within the interconnected infrastructures, including electric power and fuel supply. Previous work shows that agent-based simulation models have the potential to improve the accuracy of complex system forecasting and to provide new insights into the factors that are the primary drivers of emergent behaviors in interdependent systems. Simulation results can be examined both computationally and analytically, offering new ways of theorizing about the impact of perturbations to an infrastructure network.

  13. Agent-based forward analysis

    SciTech Connect

    Kerekes, Ryan A.; Jiao, Yu; Shankar, Mallikarjun; Potok, Thomas E.; Lusk, Rick M.

    2008-01-01

    We propose software agent-based "forward analysis" for efficient information retrieval in a network of sensing devices. In our approach, processing is pushed to the data at the edge of the network via intelligent software agents rather than pulling data to a central facility for processing. The agents are deployed with a specific query and perform varying levels of analysis of the data, communicating with each other and sending only relevant information back across the network. We demonstrate our concept in the context of face recognition using a wireless test bed comprised of PDA cell phones and laptops. We show that agent-based forward analysis can provide a significant increase in retrieval speed while decreasing bandwidth usage and information overload at the central facility.

  14. Agent based simulations in disease modeling Comment on "Towards a unified approach in the modeling of fibrosis: A review with research perspectives" by Martine Ben Amar and Carlo Bianca

    NASA Astrophysics Data System (ADS)

    Pappalardo, Francesco; Pennisi, Marzio

    2016-07-01

    Fibrosis represents a process where an excessive tissue formation in an organ follows the failure of a physiological reparative or reactive process. Mathematical and computational techniques may be used to improve the understanding of the mechanisms that lead to the disease and to test potential new treatments that may directly or indirectly have positive effects against fibrosis [1]. In this scenario, Ben Amar and Bianca [2] give us a broad picture of the existing mathematical and computational tools that have been used to model fibrotic processes at the molecular, cellular, and tissue levels. Among such techniques, agent based models (ABM) can give a valuable contribution in the understanding and better management of fibrotic diseases.

  15. Exploring cooperation and competition using agent-based modeling

    PubMed Central

    Elliott, Euel; Kiel, L. Douglas

    2002-01-01

    Agent-based modeling enhances our capacity to model competitive and cooperative behaviors at both the individual and group levels of analysis. Models presented in these proceedings produce consistent results regarding the relative fragility of cooperative regimes among agents operating under diverse rules. These studies also show how competition and cooperation may generate change at both the group and societal level. Agent-based simulation of competitive and cooperative behaviors may reveal the greatest payoff to social science research of all agent-based modeling efforts because of the need to better understand the dynamics of these behaviors in an increasingly interconnected world. PMID:12011396
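The fragility of cooperative regimes described in these proceedings can be demonstrated with a minimal Nowak-May-style spatial prisoner's dilemma. The sketch below (an illustration, not any specific model from the proceedings) places imitating agents on a ring and shows how a single defector can unravel a cooperative regime:

```python
import numpy as np

R, S, T, P = 3, 0, 5, 1  # standard prisoner's-dilemma payoffs

def payoff(a, b):
    """Payoff to strategy a (1 = cooperate, 0 = defect) against b."""
    if a and b:
        return R
    if a and not b:
        return S
    if not a and b:
        return T
    return P

def play_round(strategies):
    """Agents on a ring play both neighbors, then imitate the
    highest-scoring agent among {left neighbor, self, right neighbor}."""
    n = len(strategies)
    scores = [payoff(strategies[i], strategies[(i - 1) % n]) +
              payoff(strategies[i], strategies[(i + 1) % n])
              for i in range(n)]
    new = strategies.copy()
    for i in range(n):
        nbrs = [(i - 1) % n, i, (i + 1) % n]
        best = max(nbrs, key=lambda j: scores[j])
        new[i] = strategies[best]
    return new
```

With these payoffs, full cooperation persists only until a defector appears; a lone defector out-earns its neighbors and defection then spreads through imitation until cooperation collapses.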

  16. Cell-based computational modeling of vascular morphogenesis using Tissue Simulation Toolkit.

    PubMed

    Daub, Josephine T; Merks, Roeland M H

    2015-01-01

    Computational modeling has become a widely used tool for unraveling the mechanisms of higher level cooperative cell behavior during vascular morphogenesis. However, experimenting with published simulation models or adding new assumptions to those models can be daunting for novice and even for experienced computational scientists. Here, we present a step-by-step, practical tutorial for building cell-based simulations of vascular morphogenesis using the Tissue Simulation Toolkit (TST). The TST is a freely available, open-source C++ library for developing simulations with the two-dimensional cellular Potts model, a stochastic, agent-based framework to simulate collective cell behavior. We will show the basic use of the TST to simulate and experiment with published simulations of vascular network formation. Then, we will present step-by-step instructions and explanations for building a recent simulation model of tumor angiogenesis. Demonstrated mechanisms include cell-cell adhesion, chemotaxis, cell elongation, haptotaxis, and haptokinesis. PMID:25468600

  17. Computer simulations of particle packing

    SciTech Connect

    Cesarano, J. III; McEuen, M.J.; Swiler, T.

    1996-09-01

    Computer code has been developed to rapidly simulate the random packing of disks and spheres in two and three dimensions. Any size distribution may be packed. The code simulates varying degrees of interparticle conditions ranging from sticky to free flowing. The code will also calculate the overall packing density, density distributions, and void size distributions (in two dimensions). An important aspect of the code is that it is written in C++ and incorporates a user-friendly graphical interface for standard Macintosh and Power PC platforms. Investigations as to how well the code simulates realistic random packing have begun. The code has been developed in consideration of the problem of filling a container (or die) with spray-dried granules of ceramic powder (represented by spheres). Although not presented here, the futuristic goal of this work is to give users the ability to predict homogeneity of filled dies prior to dry pressing. Additionally, this software has educational utility for studying relationships between particle size distributions and macrostructures.
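A minimal random sequential packing of equal disks in two dimensions, in the spirit of (but far simpler than) the code described here, can be sketched as follows; the function name and parameters are illustrative:

```python
import numpy as np

def pack_disks(radius, box=1.0, attempts=20000, seed=0):
    """Random sequential packing of equal disks in a square box:
    propose random centers, accept each one that overlaps no
    previously placed disk, and report the final packing density."""
    rng = np.random.default_rng(seed)
    centers = []
    for _ in range(attempts):
        p = rng.uniform(radius, box - radius, size=2)  # keep disk inside box
        if all(np.hypot(*(p - q)) >= 2 * radius for q in centers):
            centers.append(p)
    density = len(centers) * np.pi * radius**2 / box**2
    return np.array(centers), density
```

The accepted density stays below the random-sequential-adsorption saturation limit for equal disks (about 0.547), well short of ordered close packing, which is one reason packing simulations of this kind are informative about die filling.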

  18. Computer simulation of microstructural dynamics

    SciTech Connect

    Grest, G.S.; Anderson, M.P.; Srolovitz, D.J.

    1985-01-01

    Since many of the physical properties of materials are determined by their microstructure, it is important to be able to predict and control microstructural development. A number of approaches have been taken to study this problem, but they assume that the grains can be described as spherical or hexagonal and that growth occurs in an average environment. We have developed a new technique to bridge the gap between the atomistic interactions and the macroscopic scale by discretizing the continuum system such that the microstructure retains its topological connectedness, yet is amenable to computer simulations. Using this technique, we have studied grain growth in polycrystalline aggregates. The temporal evolution and grain morphology of our model are in excellent agreement with experimental results for metals and ceramics.

  19. Priority Queues for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor)

    1998-01-01

    The present invention is embodied in new priority queue data structures for event list management of computer simulations, and includes a new priority queue data structure and an improved event horizon applied to priority queue data structures. The new priority queue data structure is a Qheap and is made out of linked lists for robust, fast, reliable, and stable event list management; it uses a temporary unsorted list to store all items until one of the items is needed. Then the list is sorted, the highest-priority item is removed, and the rest of the list is inserted in the Qheap. Also, an event horizon is applied to binary tree and splay tree priority queue data structures to form the improved event horizon for event management.
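The deferred-sorting idea described in the abstract can be sketched in a few lines. This is a toy paraphrase, not the patented Qheap structure (which is built from linked lists): newly scheduled events accumulate unsorted, and the sort is paid only when an event is actually needed:

```python
import heapq

class EventQueueSketch:
    """Toy event list with deferred sorting: pushes go into an
    unsorted holding list; the first pop after any pushes sorts the
    holding list once and merges it into the sorted event list."""

    def __init__(self):
        self._sorted = []   # ascending event times, next event first
        self._pending = []  # unsorted newly scheduled events

    def push(self, t):
        self._pending.append(t)

    def pop(self):
        """Return the smallest (highest-priority) pending event time."""
        if self._pending:
            self._pending.sort()
            self._sorted = list(heapq.merge(self._sorted, self._pending))
            self._pending = []
        return self._sorted.pop(0)
```

Batching the sort this way amortizes comparison costs when many events are scheduled between pops, which is the usage pattern of discrete-event simulation.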

  20. Computer simulations of liquid crystals

    NASA Astrophysics Data System (ADS)

    Smondyrev, Alexander M.

    Liquid crystal physics is an exciting interdisciplinary field of research with important practical applications. The complexity of these systems and the presence of strong translational and orientational fluctuations require a computational approach, especially in the studies of nonequilibrium phenomena. In this dissertation we present the results of computer simulation studies of liquid crystals using the molecular dynamics technique. We employed the Gay-Berne phenomenological model of liquid crystals to describe the interaction between the molecules. Both equilibrium and non-equilibrium phenomena were studied. In the first case we studied the flow properties of the liquid crystal system in equilibrium as well as the dynamics of the director. We measured the viscosities of the Gay-Berne model in the nematic and isotropic phases. The temperature-dependence of the rotational and shear viscosities, including the nonmonotonic behavior of one shear viscosity, are in good agreement with experimental data. The bulk viscosities are significantly larger than the shear viscosities, again in agreement with experiment. The director motion was found to be ballistic at short times and diffusive at longer times. The second class of problems we focused on is the properties of the system which was rapidly quenched to very low temperatures from the nematic phase. We find a glass transition to a metastable phase with nematic order and frozen translational and orientational degrees of freedom. For fast quench rates the local structure is nematic-like, while for slower quench rates smectic order is present as well. Finally, we considered a system in the isotropic phase which is then cooled to temperatures below the isotropic-nematic transition temperature. We expect topological defects to play a central role in the subsequent equilibration of the system. To identify and study these defects we require a simulation of a system with several thousand particles. We present the results of large

  1. AGENT-BASED MODELS IN EMPIRICAL SOCIAL RESEARCH*

    PubMed Central

    Bruch, Elizabeth; Atwell, Jon

    2014-01-01

    Agent-based modeling has become increasingly popular in recent years, but there is still no codified set of recommendations or practices for how to use these models within a program of empirical research. This article provides ideas and practical guidelines drawn from sociology, biology, computer science, epidemiology, and statistics. We first discuss the motivations for using agent-based models in both basic science and policy-oriented social research. Next, we provide an overview of methods and strategies for incorporating data on behavior and populations into agent-based models, and review techniques for validating and testing the sensitivity of agent-based models. We close with suggested directions for future research. PMID:25983351

  2. GIS and agent based spatial-temporal simulation modeling for assessing tourism social carrying capacity: a study on Mount Emei scenic area, China

    NASA Astrophysics Data System (ADS)

    Zhang, Renjun

    2007-06-01

    Each scenic area can sustain a specific level of acceptance of tourist development and use, beyond which further development can result in socio-cultural deterioration or a decline in the quality of the experience gained by visitors. This specific level is called carrying capacity. Social carrying capacity can be defined as the maximum level of use (in terms of numbers and activities) that can be absorbed by an area without an unacceptable decline in the quality of experience of visitors and without an unacceptable adverse impact on the society of the area. It is difficult to assess the carrying capacity, because the carrying capacity is determined not only by the number of visitors, but also by the time, the type of recreation, the characteristics of each individual, and the physical environment. The objective of this study is to build a spatial-temporal simulation model to simulate the spatial-temporal distribution of tourists. This model is a tourist spatial behaviors simulator (TSBS). Based on TSBS, the changes in each visitor's travel patterns such as location, cost, and other state data are recorded in a state table. By analyzing this table, the intensity of tourist use in any area can be calculated, and changes in the quality of the tourism experience can be quantified and analyzed. Based on this micro-simulation method, the social carrying capacity can be assessed more accurately, monitored proactively, and managed adaptively. In this paper, the carrying capacity of Mount Emei scenic area is analyzed as follows: the author selected crowding intensity as the monitoring indicator, on the view that longer waiting times mean greater crowding. TSBS was used to simulate the spatial-temporal distribution of tourists, and the average waiting time over all visitors was calculated. The author then assessed the social carrying capacity of Mount Emei scenic area and identified the key factors that impact social carrying capacity.
The results show that the TSBS

  3. Fluctuation complexity of agent-based financial time series model by stochastic Potts system

    NASA Astrophysics Data System (ADS)

    Hong, Weijia; Wang, Jun

    2015-03-01

    Financial market is a complex evolved dynamic system with high volatilities and noises, and the modeling and analyzing of financial time series are regarded as the rather challenging tasks in financial research. In this work, by applying the Potts dynamic system, a random agent-based financial time series model is developed in an attempt to uncover the empirical laws in finance, where the Potts model is introduced to imitate the trading interactions among the investing agents. Based on the computer simulation in conjunction with the statistical analysis and the nonlinear analysis, we present numerical research to investigate the fluctuation behaviors of the proposed time series model. Furthermore, in order to get a robust conclusion, we consider the daily returns of Shanghai Composite Index and Shenzhen Component Index, and the comparison analysis of return behaviors between the simulation data and the actual data is exhibited.
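A loose sketch of this approach, in which Potts-like agents imitate their neighbors and their aggregate stance generates a return series, might look like the following. All parameter names and the three-state buy/hold/sell mapping are illustrative assumptions, not the authors' specification:

```python
import numpy as np

def simulate_returns(n_agents=100, n_steps=500, J=1.0, temp=2.0, seed=0):
    """Toy Potts-style trading model: agents on a ring hold one of
    three states (-1 sell, 0 hold, +1 buy). Each sweep, random agents
    propose a random state and accept it with Metropolis probability
    based on disagreement with their two neighbors; the return at each
    step is taken as the mean state of the crowd."""
    rng = np.random.default_rng(seed)
    states = rng.choice([-1, 0, 1], size=n_agents)
    returns = np.empty(n_steps)
    for t in range(n_steps):
        for _ in range(n_agents):  # one Monte Carlo sweep
            i = rng.integers(n_agents)
            new = rng.choice([-1, 0, 1])
            nbrs = (states[(i - 1) % n_agents], states[(i + 1) % n_agents])
            # Potts-style energy counts disagreements with neighbors
            dE = J * sum(int(new != s) - int(states[i] != s) for s in nbrs)
            if dE <= 0 or rng.random() < np.exp(-dE / temp):
                states[i] = new
        returns[t] = states.mean()
    return returns
```

At low temperature the agents herd and the series shows persistent swings; at high temperature it decorrelates toward noise, which is the kind of fluctuation behavior such models are used to study against real index returns.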

  4. The Shuttle Mission Simulator computer generated imagery

    NASA Technical Reports Server (NTRS)

    Henderson, T. H.

    1984-01-01

    Equipment available in the primary training facility for the Space Transportation System (STS) flight crews includes the Fixed Base Simulator, the Motion Base Simulator, the Spacelab Simulator, and the Guidance and Navigation Simulator. The Shuttle Mission Simulator (SMS) consists of the Fixed Base Simulator and the Motion Base Simulator. The SMS utilizes four visual Computer Generated Image (CGI) systems. The Motion Base Simulator has a forward crew station with six-degrees of freedom motion simulation. Operation of the Spacelab Simulator is planned for the spring of 1983. The Guidance and Navigation Simulator went into operation in 1982. Aspects of orbital visual simulation are discussed, taking into account the earth scene, payload simulation, the generation and display of 1079 stars, the simulation of sun glare, and Reaction Control System jet firing plumes. Attention is also given to landing site visual simulation, and night launch and landing simulation.

  5. Agent Based Modeling as an Educational Tool

    NASA Astrophysics Data System (ADS)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. (Figures: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph over a terrain value map.)

  6. Development of simulation computer complex specification

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks which included, (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.

  7. Space Ultrareliable Modular Computer (SUMC) instruction simulator

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1972-01-01

    The design principles, description, functional operation, and recommended expansion and enhancements are presented for the Space Ultrareliable Modular Computer interpretive simulator. Included as appendices are the user's manual, program module descriptions, target instruction descriptions, simulator source program listing, and a sample program printout. In discussing the design and operation of the simulator, the key problems involving host computer independence and target computer architectural scope are brought into focus.

  8. Simulating Drosophila Genetics with the Computer.

    ERIC Educational Resources Information Center

    Small, James W., Jr.; Edwards, Kathryn L.

    1979-01-01

    Presents some techniques developed to help improve student understanding of Mendelian principles through the use of a computer simulation model by the genetic system of the fruit fly. Includes discussion and evaluation of this computer assisted program. (MA)

  9. Protocols for Handling Messages Between Simulation Computers

    NASA Technical Reports Server (NTRS)

    Balcerowski, John P.; Dunnam, Milton

    2006-01-01

    Practical Simulator Network (PSimNet) is a set of data-communication protocols designed especially for use in handling messages between computers that are engaging cooperatively in real-time or nearly-real-time training simulations. In a typical application, computers that provide individualized training at widely dispersed locations would communicate, by use of PSimNet, with a central host computer that would provide a common computational- simulation environment and common data. Originally intended for use in supporting interfaces between training computers and computers that simulate the responses of spacecraft scientific payloads, PSimNet could be especially well suited for a variety of other applications -- for example, group automobile-driver training in a classroom. Another potential application might lie in networking of automobile-diagnostic computers at repair facilities to a central computer that would compile the expertise of numerous technicians and engineers and act as an expert consulting technician.

  10. Chip level simulation of fault tolerant computers

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1982-01-01

    Chip-level modeling techniques in the evaluation of fault tolerant systems were researched. A fault tolerant computer was modeled. An efficient approach to functional fault simulation was developed. Simulation software was also developed.

  11. Monte Carlo Computer Simulation of a Rainbow.

    ERIC Educational Resources Information Center

    Olson, Donald; And Others

    1990-01-01

    Discusses making a computer-simulated rainbow using principles of physics, such as reflection and refraction. Provides BASIC program for the simulation. Appends a program illustrating the effects of dispersion of the colors. (YP)
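
The physics the article describes (refraction into a droplet, one internal reflection, refraction out) can be sketched as a Monte Carlo calculation. This illustrative Python version (not the article's BASIC program) samples impact parameters and locates the primary bow at the minimum-deviation caustic, assuming a water refractive index of 1.333:

```python
import math, random

N_WATER = 1.333  # refractive index of water (single color; dispersion ignored)

def deviation_deg(impact):
    """Total deviation of a primary-bow ray: two refractions + one internal reflection."""
    i = math.asin(impact)                      # angle of incidence; impact in [0, 1)
    r = math.asin(math.sin(i) / N_WATER)       # Snell's law
    return math.degrees(math.pi + 2 * i - 4 * r)

rng = random.Random(0)
devs = [deviation_deg(rng.random()) for _ in range(50_000)]
min_dev = min(devs)
rainbow_angle = 180.0 - min_dev                # angle from the antisolar point
print(f"primary rainbow at about {rainbow_angle:.1f} degrees")
```

Rays pile up near the minimum deviation (about 138 degrees), so the bow appears roughly 42 degrees from the antisolar point; rerunning with a slightly different index for each color reproduces the dispersion effect the appended program illustrates.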

  12. Adding ecosystem function to agent-based land use models

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of this paper is to examine issues in the inclusion of simulations of ecosystem functions in agent-based models of land use decision-making. The reasons for incorporating these simulations include local interests in land fertility and global interests in carbon sequestration. Biogeoche...

  13. Development of an Agent-Based Model (ABM) to Simulate the Immune System and Integration of a Regression Method to Estimate the Key ABM Parameters by Fitting the Experimental Data

    PubMed Central

    Tong, Xuming; Chen, Jinghang; Miao, Hongyu; Li, Tingting; Zhang, Le

    2015-01-01

    Agent-based models (ABM) and differential equations (DE) are two commonly used methods for immune system simulation. However, it is difficult for an ABM to estimate key model parameters by incorporating experimental data, whereas a DE model is incapable of describing the complicated immune system in detail. To overcome these problems, we developed an integrated ABM regression model (IABMR). It combines the advantages of ABM and DE by employing an ABM to mimic the multi-scale immune system with various phenotypes and types of cells, and by using the input and output of the ABM to build a Loess regression for key parameter estimation. Next, we employed a greedy algorithm to estimate the key parameters of the ABM with respect to the same experimental data set and used the ABM to describe a 3D immune system similar to previous studies that employed a DE model. These results indicate that IABMR not only has the potential to simulate the immune system at various scales, phenotypes and cell types, but can also accurately infer key parameters, as a DE model can. This study thus develops a mechanism that can simulate the complicated immune system in detail, as an ABM does, while validating the model's reliability and efficiency by fitting the experimental data, as a DE model does. PMID:26535589
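
As a rough illustration of the IABMR idea (not the authors' code), the sketch below pairs a toy stochastic model standing in for the ABM with an ordinary least-squares line standing in for the Loess regression, then inverts the surrogate to recover a parameter from an "observed" output. All names and numbers here are invented for the example:

```python
import random, statistics

def toy_abm(rate, rng, steps=50):
    """Stand-in for an ABM: stochastic population growth driven by 'rate'."""
    pop = 10.0
    for _ in range(steps):
        pop += rate * pop * 0.02 + rng.gauss(0.0, 0.1)
    return pop

rng = random.Random(1)

# 1) Probe the model over a parameter range (the ABM input/output pairs).
xs = [0.5 + 0.05 * k for k in range(21)]          # candidate rates 0.5 .. 1.5
ys = [toy_abm(x, rng) for x in xs]

# 2) Fit a simple surrogate (linear least squares in place of Loess).
mx, my = statistics.fmean(xs), statistics.fmean(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# 3) Invert the surrogate to estimate the rate behind an "observed" output.
observed = toy_abm(1.0, rng)                      # pretend this came from experiment
estimate = (observed - intercept) / slope
print(f"estimated rate = {estimate:.2f}")
```

The estimate lands near the true rate of 1.0; the published method replaces each piece (ABM, Loess, greedy search) with a far richer version, but the probe-fit-invert loop is the same.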

  14. Development of an Agent-Based Model (ABM) to Simulate the Immune System and Integration of a Regression Method to Estimate the Key ABM Parameters by Fitting the Experimental Data.

    PubMed

    Tong, Xuming; Chen, Jinghang; Miao, Hongyu; Li, Tingting; Zhang, Le

    2015-01-01

    Agent-based models (ABM) and differential equations (DE) are two commonly used methods for immune system simulation. However, it is difficult for an ABM to estimate key model parameters by incorporating experimental data, whereas a DE model is incapable of describing the complicated immune system in detail. To overcome these problems, we developed an integrated ABM regression model (IABMR). It combines the advantages of ABM and DE by employing an ABM to mimic the multi-scale immune system with various phenotypes and types of cells, and by using the input and output of the ABM to build a Loess regression for key parameter estimation. Next, we employed a greedy algorithm to estimate the key parameters of the ABM with respect to the same experimental data set and used the ABM to describe a 3D immune system similar to previous studies that employed a DE model. These results indicate that IABMR not only has the potential to simulate the immune system at various scales, phenotypes and cell types, but can also accurately infer key parameters, as a DE model can. This study thus develops a mechanism that can simulate the complicated immune system in detail, as an ABM does, while validating the model's reliability and efficiency by fitting the experimental data, as a DE model does. PMID:26535589

  15. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  16. Constructivist Design of Graphic Computer Simulations.

    ERIC Educational Resources Information Center

    Black, John B.; And Others

    Two graphic computer simulations have been prepared for teaching high school and middle school students about how business organizations and financial systems work: "Parkside," which simulates managing a hotel; and "Guestwear," which simulates managing a clothing manufacturer. Both simulations are based on six principles of constructivist design…

  17. Computer Simulation in Chemical Kinetics

    ERIC Educational Resources Information Center

    Anderson, Jay Martin

    1976-01-01

    Discusses the use of the System Dynamics technique in simulating a chemical reaction for kinetic analysis. Also discusses the use of simulation modelling in biology, ecology, and the social sciences, where experimentation may be impractical or impossible. (MLH)

  18. Deterministic Agent-Based Path Optimization by Mimicking the Spreading of Ripples.

    PubMed

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Di Paolo, Ezequiel A; Liu, Hao

    2016-01-01

    Inspirations from nature have contributed fundamentally to the development of evolutionary computation. Learning from the natural ripple-spreading phenomenon, this article proposes a novel ripple-spreading algorithm (RSA) for the path optimization problem (POP). In nature, a ripple spreads at a constant speed in all directions, and the node closest to the source is the first to be reached. This very simple principle forms the foundation of the proposed RSA. In contrast to most deterministic top-down centralized path optimization methods, such as Dijkstra's algorithm, the RSA is a bottom-up decentralized agent-based simulation model. Moreover, it is distinguished from other agent-based algorithms, such as genetic algorithms and ant colony optimization, by being a deterministic method that can always guarantee the global optimal solution with very good scalability. Here, the RSA is specifically applied to four different POPs. The comparative simulation results illustrate the advantages of the RSA in terms of effectiveness and efficiency. Thanks to the agent-based and deterministic features, the RSA opens new opportunities to attack some problems, such as calculating the exact complete Pareto front in multiobjective optimization and determining the kth shortest project time in project management, which are very difficult, if not impossible, for existing methods to resolve. The ripple-spreading optimization principle and the new distinguishing features and capacities of the RSA enrich the theoretical foundations of evolutionary computation. PMID:26066805
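
The ripple-spreading principle can be sketched directly: every reached node starts its own ripple, all ripples grow at the same constant speed, and a neighbor's arrival time is fixed by the first ripple to cover the connecting edge. The toy graph and micro-stepped loop below are illustrative assumptions, not the authors' implementation:

```python
# Toy weighted graph (edge lengths); node names are illustrative only.
GRAPH = {
    "A": {"B": 4.0, "C": 2.0},
    "B": {"A": 4.0, "C": 1.0, "D": 5.0},
    "C": {"A": 2.0, "B": 1.0, "D": 8.0},
    "D": {"B": 5.0, "C": 8.0},
}

def ripple_spread(graph, source, speed=1.0, dt=0.001):
    """Micro-step simulation: each reached node emits a ripple growing at
    constant speed; a neighbor is reached when the ripple covers the edge."""
    arrival = {source: 0.0}          # first-arrival (ripple start) times
    t = 0.0
    while len(arrival) < len(graph):
        t += dt
        for node, t0 in list(arrival.items()):
            radius = speed * (t - t0)
            for nbr, length in graph[node].items():
                if nbr not in arrival and radius >= length:
                    arrival[nbr] = t  # the first ripple to arrive wins
    return arrival

times = ripple_spread(GRAPH, "A")
print(times)  # arrival time * speed = shortest-path distance from A
```

Because the first arrival at each node always travels a shortest path, the arrival times reproduce shortest-path distances (here A to D via C and B, distance 8) without any centralized bookkeeping, which is the deterministic agent-based behavior the abstract contrasts with Dijkstra's algorithm.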

  19. Computer simulation of CPM dye lasers

    SciTech Connect

    Wang Qingyue; Zhao Xingjun

    1990-01-01

    Quantitative analysis of the laser pulses of various intracavity elements in a CPM dye laser is carried out in this study. The pulse formation is simulated with a computer, resulting in an asymmetric numerical solution for the pulse shape. The mechanisms of pulse formation are also discussed based on the results of computer simulation.

  20. The epitheliome: agent-based modelling of the social behaviour of cells.

    PubMed

    Walker, D C; Southgate, J; Hill, G; Holcombe, M; Hose, D R; Wood, S M; Mac Neil, S; Smallwood, R H

    2004-01-01

    We have developed a new computational modelling paradigm for predicting the emergent behaviour resulting from the interaction of cells in epithelial tissue. As proof-of-concept, an agent-based model, in which there is a one-to-one correspondence between biological cells and software agents, has been coupled to a simple physical model. Behaviour of the computational model is compared with the growth characteristics of epithelial cells in monolayer culture, using growth media with low and physiological calcium concentrations. Results show a qualitative fit between the growth characteristics produced by the simulation and the in vitro cell models. PMID:15351133
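
A minimal sketch of this kind of one-cell-per-agent model (invented here, not the authors' epitheliome code): cells on a grid divide only into free neighboring sites, so growth slows as the monolayer approaches confluence, qualitatively echoing contact-inhibited culture growth:

```python
import random

def grow_monolayer(steps, size=20, p_divide=0.3, rng=None):
    """Toy epithelial agent-based model: each cell may divide into a free
    neighboring grid site; occupied neighborhoods block division."""
    rng = rng or random.Random(0)
    cells = {(size // 2, size // 2)}             # single seed cell mid-grid
    history = [len(cells)]
    for _ in range(steps):
        for (x, y) in list(cells):
            if rng.random() < p_divide:
                free = [(x + dx, y + dy)
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                        if 0 <= x + dx < size and 0 <= y + dy < size
                        and (x + dx, y + dy) not in cells]
                if free:
                    cells.add(rng.choice(free))  # contact inhibition: need space
        history.append(len(cells))
    return history

hist = grow_monolayer(60)
print(hist[0], hist[-1])
```

Varying `p_divide` per agent is where a real model would hook in biology such as the calcium-dependent behavior the abstract mentions; this sketch only shows the one-agent-per-cell bookkeeping.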

  1. VLSI circuit simulation using a vector computer

    NASA Technical Reports Server (NTRS)

    Mcgrogan, S. K.

    1984-01-01

    Simulation of circuits having more than 2000 active devices requires the largest, fastest computers available. A vector computer, such as the CYBER 205, can yield great speed and cost advantages if efforts are made to adapt the simulation program to the strengths of the computer. ASPEC and SPICE (1), two widely used circuit simulation programs, are discussed. ASPECV and VAMOS (5) are, respectively, vector adaptations of these two simulators. They demonstrate the substantial performance enhancements possible for this class of algorithm on the CYBER 205.

  2. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Kerner, H.; Weatherbee, J. E.; Taylor, D. S.; Hodges, B.

    1973-01-01

    A deterministic simulator is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Its use as a tool to study and determine the minimum computer system configuration necessary to satisfy the on-board computational requirements of a typical mission is presented. The paper describes how the computer system configuration is determined in order to satisfy the data processing demand of the various shuttle booster subsystems. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources.

  3. Analyzing Robotic Kinematics Via Computed Simulations

    NASA Technical Reports Server (NTRS)

    Carnahan, Timothy M.

    1992-01-01

    Computing system assists in evaluation of kinematics of conceptual robot. Displays positions and motions of robotic manipulator within work cell. Also displays interactions between robotic manipulator and other objects. Results of simulation displayed on graphical computer workstation. System includes both off-the-shelf software originally developed for automotive industry and specially developed software. Simulation system also used to design human-equivalent hand, to model optical train in infrared system, and to develop graphical interface for teleoperator simulation system.

  4. Computationally Lightweight Air-Traffic-Control Simulation

    NASA Technical Reports Server (NTRS)

    Knight, Russell

    2005-01-01

    An algorithm for computationally lightweight simulation of automated air traffic control (ATC) at a busy airport has been derived. The algorithm is expected to serve as the basis for development of software that would be incorporated into flight-simulator software, the ATC component of which is not yet capable of handling realistic airport loads. Software based on this algorithm could also be incorporated into other computer programs that simulate a variety of scenarios for purposes of training or amusement.

  5. Computer Clinical Simulations in Health Sciences.

    ERIC Educational Resources Information Center

    Jones, Gary L; Keith, Kenneth D.

    1983-01-01

    Discusses the key characteristics of clinical simulation, some developmental foundations, two current research studies, and some implications for the future of health science education. Investigations of the effects of computer-based simulation indicate that acquisition of decision-making skills is greater than with noncomputerized simulations.…

  6. Architectural considerations for agent-based national scale policy models : LDRD final report.

    SciTech Connect

    Backus, George A.; Strip, David R.

    2007-09-01

    The need to anticipate the consequences of policy decisions becomes ever more important as the magnitude of the potential consequences grows. The multiplicity of connections between the components of society and the economy makes intuitive assessments extremely unreliable. Agent-based modeling has the potential to be a powerful tool in modeling policy impacts. The direct mapping between agents and elements of society and the economy simplifies the mapping of real-world functions into the world of computational assessment. Our modeling initiative is motivated by the desire to facilitate informed public debate on alternative policies for how we, as a nation, provide healthcare to our population. We explore the implications of this motivation for the design and implementation of a model. We discuss the choice of an agent-based modeling approach and contrast it to micro-simulation and systems dynamics approaches.

  7. Computer simulation of nonequilibrium processes

    SciTech Connect

    Hoover, W.G.; Moran, B.; Holian, B.L.; Posch, H.A.; Bestiale, S.

    1987-01-01

    Recent atomistic simulations of irreversible macroscopic hydrodynamic flows are illustrated. An extension of Nosé's reversible atomistic mechanics makes it possible to simulate such non-equilibrium systems with completely reversible equations of motion. The new techniques show that macroscopic irreversibility is a natural, inevitable consequence of time-reversible, Lyapunov-unstable microscopic equations of motion.

  8. Computational Modeling and Simulation of Genital Tubercle Development

    EPA Science Inventory

    Hypospadias is a developmental defect of urethral tube closure that has a complex etiology. Here, we describe a multicellular agent-based model of genital tubercle development that simulates urethrogenesis from the urethral plate stage to urethral tube closure in differentiating ...

  9. Filtration theory using computer simulations

    SciTech Connect

    Bergman, W.; Corey, I.

    1997-08-01

    We have used commercially available fluid dynamics codes based on Navier-Stokes theory and the Langevin particle equation of motion to compute the particle capture efficiency and pressure drop through selected two- and three-dimensional fiber arrays. The approach we used was to first compute the air velocity vector field throughout a defined region containing the fiber matrix. The particle capture in the fiber matrix is then computed by superimposing the Langevin particle equation of motion over the flow velocity field. Using the Langevin equation combines the particle Brownian motion, inertia and interception mechanisms in a single equation. In contrast, most previous investigations treat the different capture mechanisms separately. We have computed the particle capture efficiency and the pressure drop through one 2-D and two 3-D fiber matrix elements. 5 refs., 11 figs.
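
The core step (superimposing a Langevin equation of motion on a precomputed flow field) can be sketched in one dimension. This illustrative integrator assumes a uniform flow and Stokes drag with relaxation time tau, with an optional Brownian term; it is not the authors' code:

```python
import math, random

def track_particle(steps=2000, dt=1e-4, tau=5e-4, u_flow=1.0,
                   kT_over_m=0.0, rng=None):
    """Explicit-Euler integration of a 1-D Langevin equation:
       dv = (u_flow - v)/tau * dt + sqrt(2*kT/(m*tau)) * dW
    With kT_over_m = 0 the particle simply relaxes to the flow speed."""
    rng = rng or random.Random(0)
    x, v = 0.0, 0.0
    noise = math.sqrt(2.0 * kT_over_m / tau * dt)   # Brownian kick amplitude
    for _ in range(steps):
        v += (u_flow - v) / tau * dt + noise * rng.gauss(0.0, 1.0)
        x += v * dt
    return x, v

x, v = track_particle()
print(f"final speed = {v:.3f} (flow speed = 1.0)")
```

In a filtration code the constant `u_flow` would be replaced by the local value of the computed velocity field at `x`, and a trajectory would count as captured when it intersects a fiber; the single equation carries drag (inertia), the flow (interception), and the noise term (Brownian diffusion), as the abstract describes.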

  10. Evaluation of Visual Computer Simulator for Computer Architecture Education

    ERIC Educational Resources Information Center

    Imai, Yoshiro; Imai, Masatoshi; Moritoh, Yoshio

    2013-01-01

    This paper presents a trial evaluation, conducted in 2009-2011, of a visual computer simulator developed to serve simultaneously as an instruction facility and a learning tool. It illustrates an example of Computer Architecture education for university students and the use of an e-Learning tool for Assembly Programming in order to…

  11. Computer simulation of upset welding

    SciTech Connect

    Spingarn, J R; Mason, W E; Swearengen, J C

    1982-04-01

    Useful process modeling of upset welding requires contributions from metallurgy, welding engineering, thermal analysis and experimental mechanics. In this report, the significant milestones for such an effort are outlined and probable difficult areas are pointed out. Progress to date is summarized and directions for future research are offered. With regard to the computational aspects of this problem, a 2-D heat conduction computer code has been modified to incorporate electrical heating, and computations have been run for an axisymmetric problem with simple viscous material laws and d.c. electrical boundary conditions. In the experimental endeavor, the boundary conditions have been measured during the welding process, although interpretation of voltage drop measurements is not straightforward. The ranges of strain, strain rate and temperature encountered during upset welding have been measured or calculated, and the need for a unifying constitutive law is described. Finally, the possible complications of microstructure and interfaces are clarified.

  12. Computational Spectrum of Agent Model Simulation

    SciTech Connect

    Perumalla, Kalyan S

    2010-01-01

    The study of human social behavioral systems is finding renewed interest in military, homeland security and other applications. Simulation is the most generally applied approach to studying complex scenarios in such systems. Here, we outline some of the important considerations that underlie the computational aspects of simulation-based study of human social systems. The fundamental imprecision underlying questions and answers in social science makes it necessary to carefully distinguish among different simulation problem classes and to identify the most pertinent set of computational dimensions associated with those classes. We identify a few such classes and present their computational implications. The focus is then shifted to the most challenging combinations in the computational spectrum, namely, large-scale entity counts at moderate to high levels of fidelity. Recent developments in furthering the state-of-the-art in these challenging cases are outlined. A case study of large-scale agent simulation is provided in simulating large numbers (millions) of social entities at real-time speeds on inexpensive hardware. Recent computational results are identified that highlight the potential of modern high-end computing platforms to push the envelope with respect to speed, scale and fidelity of social system simulations. Finally, the problem of shielding the modeler or domain expert from the complex computational aspects is discussed and a few potential solution approaches are identified.

  13. Applications of Agent Based Approaches in Business (A Three Essay Dissertation)

    ERIC Educational Resources Information Center

    Prawesh, Shankar

    2013-01-01

    The goal of this dissertation is to investigate the enabling role that agent based simulation plays in business and policy. The aforementioned issue has been addressed in this dissertation through three distinct, but related essays. The first essay is a literature review of different research applications of agent based simulation in various…

  14. Economic Analysis. Computer Simulation Models.

    ERIC Educational Resources Information Center

    Sterling Inst., Washington, DC. Educational Technology Center.

    A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…

  15. Astronomy Simulation with Computer Graphics.

    ERIC Educational Resources Information Center

    Thomas, William E.

    1982-01-01

    "Planetary Motion Simulations" is a system of programs designed for students to observe motions of a superior planet (one whose orbit lies outside the orbit of the earth). Programs run on the Apple II microcomputer and employ high-resolution graphics to present the motions of Saturn. (Author/JN)
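
The retrograde loops such a program displays follow from simple geometry. A minimal sketch (assuming circular, coplanar orbits, not the Apple II program itself) shows Saturn's geocentric longitude decreasing near opposition:

```python
import math

ORBITS = {"earth": (1.000, 1.000), "saturn": (9.58, 29.46)}  # (radius AU, period yr)

def position(body, t):
    """Heliocentric position at time t (years), circular orbit in the ecliptic."""
    r, period = ORBITS[body]
    theta = 2.0 * math.pi * t / period
    return r * math.cos(theta), r * math.sin(theta)

def geocentric_longitude(t):
    """Apparent ecliptic longitude of Saturn as seen from Earth (degrees)."""
    ex, ey = position("earth", t)
    sx, sy = position("saturn", t)
    return math.degrees(math.atan2(sy - ey, sx - ex))

# Both planets start aligned at t = 0, i.e. Saturn is at opposition.
dlon = geocentric_longitude(0.01) - geocentric_longitude(-0.01)
print(f"longitude change near opposition: {dlon:.2f} deg (negative = retrograde)")
```

Sampling the longitude over a year or two and plotting it reproduces the familiar retrograde loop: the faster-moving Earth overtakes Saturn around opposition, so Saturn's apparent longitude temporarily runs backward.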

  16. Computer simulation of engine systems

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1980-01-01

    The use of computerized simulations of the steady-state and transient performance of jet engines throughout the flight regime is discussed. In addition, installation effects on thrust and specific fuel consumption are accounted for, as well as engine weight, dimensions, and cost. The availability throughout the government and industry of analytical methods for calculating these quantities is pointed out.

  17. Fiber Composite Sandwich Thermostructural Behavior: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Aiello, R. A.; Murthy, P. L. N.

    1986-01-01

    Several computational levels of progressive sophistication/simplification are described to computationally simulate composite sandwich hygral, thermal, and structural behavior. The computational levels of sophistication include: (1) three-dimensional detailed finite element modeling of the honeycomb, the adhesive and the composite faces; (2) three-dimensional finite element modeling of the honeycomb assumed to be an equivalent continuous, homogeneous medium, the adhesive and the composite faces; (3) laminate theory simulation where the honeycomb (metal or composite) is assumed to consist of plies with equivalent properties; and (4) derivations of approximate, simplified equations for thermal and mechanical properties by simulating the honeycomb as an equivalent homogeneous medium. The approximate equations are combined with composite hygrothermomechanical and laminate theories to provide a simple and effective computational procedure for simulating the thermomechanical/thermostructural behavior of fiber composite sandwich structures.

  18. Augmented Reality Simulations on Handheld Computers

    ERIC Educational Resources Information Center

    Squire, Kurt; Klopfer, Eric

    2007-01-01

    Advancements in handheld computing, particularly its portability, social interactivity, context sensitivity, connectivity, and individuality, open new opportunities for immersive learning environments. This article articulates the pedagogical potential of augmented reality simulations in environmental engineering education by immersing students in…

  19. Computer Simulation of F=ma.

    ERIC Educational Resources Information Center

    Hayden, Howard C.

    1984-01-01

    Discusses a computer simulation which: (1) describes an experiment investigating F=ma; (2) generates data; (3) allows students to see the data; and (4) generates the equation with a least-squares fit. (JN)
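
The least-squares step can be sketched as follows; the data here are synthetic (a 2 kg "unknown" mass plus Gaussian noise), not the simulation's output:

```python
import random, statistics

rng = random.Random(7)
MASS = 2.0                                    # kg: the "unknown" the fit recovers

accels = [0.5 * k for k in range(1, 21)]      # applied accelerations, m/s^2
forces = [MASS * a + rng.gauss(0.0, 0.2) for a in accels]  # noisy F = m*a

# Least-squares slope of F versus a estimates the mass.
ma, mf = statistics.fmean(accels), statistics.fmean(forces)
slope = (sum((a - ma) * (f - mf) for a, f in zip(accels, forces))
         / sum((a - ma) ** 2 for a in accels))
print(f"fitted mass = {slope:.3f} kg")
```

Students can vary the noise level and sample count to see how the fitted slope converges to the true mass, which is the pedagogical point of generating data before fitting.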

  20. Computer simulation and optimization of radioelectronic devices

    NASA Astrophysics Data System (ADS)

    Benenson, Z. M.; Elistratov, M. R.; Ilin, L. K.; Kravchenko, S. V.; Sukhov, D. M.; Udler, M. A.

    Methods of simulating and optimizing radioelectronic devices in an automated design system are discussed. Also treated are algorithms used in the computer-aided design of these devices. A special description language for these devices is also presented.

  1. Computer-simulated phacoemulsification improvements

    NASA Astrophysics Data System (ADS)

    Soederberg, Per G.; Laurell, Carl-Gustaf; Artzen, D.; Nordh, Leif; Skarman, Eva; Nordqvist, P.; Andersson, Mats

    2002-06-01

    A simulator for phacoemulsification cataract extraction is developed. A three-dimensional visual interface and foot pedals for phacoemulsification power, x-y positioning, zoom and focus were established. An algorithm that allows real-time visual feedback of the surgical field was developed. Cataract surgery is the most common surgical procedure. The operation requires input from both feet and both hands and provides visual feedback through the operation microscope, essentially without tactile feedback. Experience demonstrates that the number of complications for an experienced surgeon learning phacoemulsification decreases exponentially, reaching close to the asymptote after the first 500 procedures, despite initial wet-lab training on animal eyes. Simulator training is anticipated to decrease training time, decrease the complication rate for the beginner, and reduce expensive supervision by a high-volume surgeon.

  2. Teaching Environmental Systems Modelling Using Computer Simulation.

    ERIC Educational Resources Information Center

    Moffatt, Ian

    1986-01-01

    A computer modeling course in environmental systems and dynamics is presented. The course teaches senior undergraduates to analyze a system of interest, construct a system flow chart, and write computer programs to simulate real world environmental processes. An example is presented along with a course evaluation, figures, tables, and references.…

  3. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  4. Computer Simulation Of A Small Turboshaft Engine

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.

    1991-01-01

    Component-type mathematical model of small turboshaft engine developed for use in real-time computer simulations of dynamics of helicopter flight. Yields shaft speeds, torques, fuel-consumption rates, and other operating parameters with sufficient accuracy for use in real-time simulation of maneuvers involving large transients in power and/or severe accelerations.

  5. Simulations of Probabilities for Quantum Computing

    NASA Technical Reports Server (NTRS)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed.

  6. Criterion Standards for Evaluating Computer Simulation Courseware.

    ERIC Educational Resources Information Center

    Wholeben, Brent Edward

    This paper explores the role of computerized simulations as a decision-modeling intervention strategy, and views the strategy's different attribute biases based upon the varying primary missions of instruction versus application. The common goals associated with computer simulations as a training technique are discussed and compared with goals of…

  7. Salesperson Ethics: An Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  8. Computer simulation of gear tooth manufacturing processes

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri; Huston, Ronald L.

    1990-01-01

    The use of computer graphics to simulate gear tooth manufacturing procedures is discussed. An analytical basis for the simulation is established for spur gears. The simulation itself, however, is developed not only for spur gears, but for straight bevel gears as well. The applications of the developed procedure extend from the development of finite element models of heretofore intractable geometrical forms, to exploring the fabrication of nonstandard tooth forms.

  9. Optimum spaceborne computer system design by simulation

    NASA Technical Reports Server (NTRS)

    Williams, T.; Weatherbee, J. E.; Taylor, D. S.

    1972-01-01

    A deterministic digital simulation model is described which models the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. Use of the model as a tool in configuring a minimum computer system for a typical mission is demonstrated. The configuration which is developed as a result of studies with the simulator is optimal with respect to the efficient use of computer system resources, i.e., the configuration derived is a minimal one. Other considerations such as increased reliability through the use of standby spares would be taken into account in the definition of a practical system for a given mission.

  10. Computer simulation of bubble formation.

    SciTech Connect

    Insepov, Z.; Bazhirov, T.; Norman, G.; Stegailov, V.; Mathematics and Computer Science; Institute for High Energy Densities of Joint Institute for High Temperatures of RAS

    2007-01-01

    Properties of liquid metals (Li, Pb, Na) containing nanoscale cavities were studied by atomistic Molecular Dynamics (MD). Two atomistic models of cavity simulation were developed that cover a wide area in the phase diagram with negative pressure. In the first model, the thermodynamics of cavity formation, stability and the dynamics of cavity evolution in bulk liquid metals have been studied. Radial densities, pressures, surface tensions, and work functions of nano-scale cavities of various radii were calculated for liquid Li, Na, and Pb at various temperatures and densities, and at small negative pressures near the liquid-gas spinodal, and the work functions for cavity formation in liquid Li were calculated and compared with the available experimental data. The cavitation rate can further be obtained by using the classical nucleation theory (CNT). The second model is based on the stability study and on the kinetics of cavitation of the stretched liquid metals. An MD method was used to simulate cavitation in metastable Pb and Li melts and determine the stability limits. States at temperatures below critical (T < 0.5Tc) and large negative pressures were considered. The kinetic boundary of liquid phase stability was shown to be different from the spinodal. The kinetics and dynamics of cavitation were studied. The pressure dependences of cavitation frequencies were obtained for several temperatures. The results of MD calculations were compared with estimates based on classical nucleation theory.
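
The classical nucleation theory estimate invoked in the abstract reduces to two closed-form quantities for a cavity in a stretched liquid: the critical radius and the energy barrier. The sketch below uses illustrative numbers, not values fitted to the Li or Pb simulations:

```python
import math

def cnt_cavity(surface_tension, tension):
    """Classical nucleation theory for a cavity in a stretched liquid:
    critical radius r* = 2*sigma/|dP| and barrier W* = 16*pi*sigma^3/(3*dP^2)."""
    r_star = 2.0 * surface_tension / tension
    w_star = 16.0 * math.pi * surface_tension ** 3 / (3.0 * tension ** 2)
    return r_star, w_star

# Illustrative numbers only (not fitted to the Li or Pb data):
sigma = 0.4    # surface tension, N/m
dP = 2.0e8     # magnitude of the negative pressure (tension), Pa
r_star, w_star = cnt_cavity(sigma, dP)
print(f"r* = {r_star * 1e9:.1f} nm, W* = {w_star:.2e} J")
```

The barrier satisfies the identity W* = (4*pi/3)*sigma*r*^2, and the nucleation rate then follows as J = J0 * exp(-W*/kT); comparing such CNT rates against MD cavitation frequencies is exactly the comparison the abstract reports.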

  11. Computer Code for Nanostructure Simulation

    NASA Technical Reports Server (NTRS)

    Filikhin, Igor; Vlahovic, Branislav

    2009-01-01

    Due to their small size, nanostructures can have stress and thermal gradients that are larger than any macroscopic analogue. These gradients can lead to specific regions that are susceptible to failure via processes such as plastic deformation by dislocation emission, chemical debonding, and interfacial alloying. A program has been developed that rigorously simulates and predicts optoelectronic properties of nanostructures of virtually any geometrical complexity and material composition. It can be used in simulations of energy level structure, wave functions, density of states of spatially configured phonon-coupled electrons, excitons in quantum dots, quantum rings, quantum ring complexes, and more. The code can be used to calculate stress distributions and thermal transport properties for a variety of nanostructures and interfaces, transport and scattering at nanoscale interfaces and surfaces under various stress states, and alloy compositional gradients. The code allows users to perform modeling of charge transport processes through quantum-dot (QD) arrays as functions of inter-dot distance, array order versus disorder, QD orientation, shape, size, and chemical composition for applications in photovoltaics and physical properties of QD-based biochemical sensors. The code can be used to study the hot exciton formation/relaxation dynamics in arrays of QDs of different shapes and sizes at different temperatures. It also can be used to understand the relation among the deposition parameters and inherent stresses, strain deformation, heat flow, and failure of nanostructures.

  12. Pseudospark discharges via computer simulation

    SciTech Connect

    Boeuf, J.P.; Pitchford, L.C.

    1991-04-01

    The authors of this paper developed a hybrid fluid-particle (Monte Carlo) model to describe the initiation phase of pseudospark discharges. In this model, time-dependent fluid equations for the electrons and positive ions are solved self-consistently with Poisson's equation for the electric field in a two-dimensional, cylindrically symmetrical geometry. The Monte Carlo simulation is used to determine the ionization source term in the fluid equations. This model has been used to study the evolution of a discharge in helium at 0.5 torr, with an applied voltage of 2 kV and in a typical pseudospark geometry. From the numerical results, the authors have identified a sequence of physical events that lead to the rapid rise in current associated with the onset of the pseudospark discharge mode. For the conditions the authors have simulated, they find that there is a maximum in the electron multiplication at the time which corresponds to the onset of the hollow cathode effect, and although the multiplication later decreases, it is always greater than needed for a steady-state discharge. Thus the sheaths inside the hollow cathode tend to collapse against the walls, and eventually cathode emission mechanisms (such as field-enhanced thermionic emission) which the authors have not included will start to play a role. In spite of the approximation in this model, the picture which has emerged provides insight into the mechanisms controlling the onset of this potentially important discharge mode.
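    The rapid current rise described here hinges on electron multiplication in the gas. The standard Townsend avalanche relation (textbook discharge physics, not the authors' hybrid fluid-particle model) conveys the basic mechanism:

    ```python
    import math

    def avalanche_multiplication(alpha_per_cm, gap_cm):
        """Townsend avalanche: one seed electron grows to M = exp(alpha * d)
        electrons after drifting a distance d through the gas, where alpha is
        the first Townsend ionization coefficient (ionizations per cm)."""
        return math.exp(alpha_per_cm * gap_cm)
    ```

    In a hollow-cathode geometry the effective multiplication is enhanced further by electrons oscillating inside the cathode cavity, which is why the abstract reports a multiplication maximum at the onset of the hollow cathode effect.
    
    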

  13. Rotorcraft Damage Tolerance Evaluated by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon; Abdi, Frank

    2000-01-01

    An integrally stiffened graphite/epoxy composite rotorcraft structure is evaluated via computational simulation. A computer code that scales up constituent micromechanics level material properties to the structure level and accounts for all possible failure modes is used for the simulation of composite degradation under loading. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulation. Design implications with regard to defect and damage tolerance of integrally stiffened composite structures are examined. A procedure is outlined regarding the use of this type of information for setting quality acceptance criteria, design allowables, damage tolerance, and retirement-for-cause criteria.
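    The scale-up from constituent to ply properties starts from micromechanics relations. The simplest of these, the longitudinal rule of mixtures, illustrates the idea (the numeric values in the example are illustrative, not from the paper):

    ```python
    def rule_of_mixtures(e_fiber, e_matrix, v_fiber):
        """Longitudinal ply modulus from constituent properties via the rule
        of mixtures: E1 = Vf*Ef + (1 - Vf)*Em. This is the most elementary
        example of the micromechanics-to-laminate scale-up described above."""
        return v_fiber * e_fiber + (1.0 - v_fiber) * e_matrix

    # Example: graphite fiber (230 GPa), epoxy matrix (3.5 GPa), 60% fiber volume
    e1 = rule_of_mixtures(230.0, 3.5, 0.6)  # GPa
    ```
    
    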

  14. Cluster computing software for GATE simulations

    SciTech Connect

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-06-15

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values.
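    The split-and-merge scheme the authors describe can be illustrated generically. This sketch is not the GATE cluster software itself, only the partitioning idea: divide the simulated acquisition time across jobs, run them independently, then merge the outputs:

    ```python
    def split_time_slices(total_time, n_jobs):
        """Partition the simulated acquisition time into contiguous slices,
        one per cluster job (generic illustration of the split step)."""
        edges = [total_time * i / n_jobs for i in range(n_jobs + 1)]
        return [(edges[i], edges[i + 1]) for i in range(n_jobs)]

    def merge_counts(per_job_counts):
        """Combine the event counts produced independently by each job
        (generic illustration of the merge step)."""
        return sum(per_job_counts)
    ```

    Because the slices are statistically independent Monte Carlo runs, the merge step is the only serial bottleneck, which is why the abstract highlights a fast output merger as the key to PET scalability.
    
    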

  15. Cluster computing software for GATE simulations.

    PubMed

    De Beenhouwer, Jan; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R

    2007-06-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values. PMID:17654895

  16. Polymer Composites Corrosive Degradation: A Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2007-01-01

    A computational simulation of polymer composites corrosive durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply degradation degrades the laminate to the last one or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

  17. Computer simulations of lung surfactant.

    PubMed

    Baoukina, Svetlana; Tieleman, D Peter

    2016-10-01

    Lung surfactant lines the gas-exchange interface in the lungs and reduces the surface tension, which is necessary for breathing. Lung surfactant consists mainly of lipids with a small amount of proteins and forms a monolayer at the air-water interface connected to bilayer reservoirs. Lung surfactant function involves transfer of material between the monolayer and bilayers during the breathing cycle. Lipids and proteins are organized laterally in the monolayer; selected species are possibly preferentially transferred to bilayers. The complex 3D structure of lung surfactant and the exact roles of lipid organization and proteins remain important goals for research. We review recent simulation studies on the properties of lipid monolayers, monolayers with phase coexistence, monolayer-bilayer transformations, lipid-protein interactions, and effects of nanoparticles on lung surfactant. This article is part of a Special Issue entitled: Biosimulations edited by Ilpo Vattulainen and Tomasz Róg. PMID:26922885

  18. Creating science simulations through Computational Thinking Patterns

    NASA Astrophysics Data System (ADS)

    Basawapatna, Ashok Ram

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that allows students with little or no prior experience the ability to create simulations based on phenomena they see in-class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high- level descriptions with little guidance shows promising results. These initial results indicate that the high level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.

  19. Agent-based scheduling system to achieve agility

    NASA Astrophysics Data System (ADS)

    Akbulut, Muhtar B.; Kamarthi, Sagar V.

    2000-12-01

    Today's competitive enterprises need to design, develop, and manufacture their products rapidly and inexpensively. Agile manufacturing has emerged as a new paradigm to meet these challenges. Agility requires, among many other things, scheduling and control software systems that are flexible, robust, and adaptive. In this paper a new agent-based scheduling system (ABSS) is developed to meet the challenges of an agile manufacturing system. In ABSS, unlike in the traditional approaches, information and decision-making capabilities are distributed among the system entities called agents. In contrast with most agent-based scheduling systems, which commonly use a bidding approach, the ABSS employs a global performance monitoring strategy. A production-rate-based global performance metric which effectively assesses the system performance is developed to assist the agents' decision-making process. To test the architecture, an agent-based discrete event simulation software package is developed. The experiments performed using the simulation software yielded encouraging results supporting the applicability of agent-based systems to address the scheduling and control needs of an agile manufacturing system.

  20. Computer simulation of space station computer steered high gain antenna

    NASA Technical Reports Server (NTRS)

    Beach, S. W.

    1973-01-01

    The mathematical modeling and programming of a complete simulation program for a space station computer-steered high gain antenna are described. The program provides for reading input data cards, numerically integrating up to 50 first order differential equations, and monitoring up to 48 variables on printed output and on plots. The program system consists of a high gain antenna, an antenna gimbal control system, an on board computer, and the environment in which all are to operate.

  1. Computational modeling and simulation of genital tubercle development.

    PubMed

    Leung, Maxwell C K; Hutson, M Shane; Seifert, Ashley W; Spencer, Richard M; Knudsen, Thomas B

    2016-09-01

    Hypospadias is a developmental defect of urethral tube closure that has a complex etiology involving genetic and environmental factors, including anti-androgenic and estrogenic disrupting chemicals; however, little is known about the morphoregulatory consequences of androgen/estrogen balance during genital tubercle (GT) development. Computer models that predictively model sexual dimorphism of the GT may provide a useful resource to translate chemical-target bipartite networks and their developmental consequences across the human-relevant chemical universe. Here, we describe a multicellular agent-based model of genital tubercle (GT) development that simulates urethrogenesis from the sexually-indifferent urethral plate stage to urethral tube closure. The prototype model, constructed in CompuCell3D, recapitulates key aspects of GT morphogenesis controlled by SHH, FGF10, and androgen pathways through modulation of stochastic cell behaviors, including differential adhesion, motility, proliferation, and apoptosis. Proper urethral tube closure in the model was shown to depend quantitatively on SHH- and FGF10-induced effects on mesenchymal proliferation and epithelial apoptosis-both ultimately linked to androgen signaling. In the absence of androgen, GT development was feminized and with partial androgen deficiency, the model resolved with incomplete urethral tube closure, thereby providing an in silico platform for probabilistic prediction of hypospadias risk across combinations of minor perturbations to the GT system at various stages of embryonic development. PMID:27180093

  2. Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The Virtual National Airspace Simulation (VNAS) will improve the safety of Air Transportation. In 2001, using simulation and information management software running over a distributed network of super-computers, researchers at NASA Ames, Glenn, and Langley Research Centers developed a working prototype of a virtual airspace. This VNAS prototype modeled daily operations of the Atlanta airport by integrating measured operational data and simulation data on up to 2,000 flights a day. The concepts and architecture developed by NASA for this prototype are integral to the National Airspace Simulation to support the development of strategies improving aviation safety, identifying precursors to component failure.

  3. Symbolic computation in system simulation and design

    NASA Astrophysics Data System (ADS)

    Evans, Brian L.; Gu, Steve X.; Kalavade, Asa; Lee, Edward A.

    1995-06-01

    This paper examines some of the roles that symbolic computation plays in assisting system-level simulation and design. By symbolic computation, we mean programs like Mathematica that perform symbolic algebra and apply transformation rules based on algebraic identities. At a behavioral level, symbolic computation can compute parameters, generate new models, and optimize parameter settings. At the synthesis level, symbolic computation can work in tandem with synthesis tools to rewrite cascade and parallel combinations of components in subsystems to meet design constraints. Symbolic computation represents one type of tool that may be invoked in the complex flow of the system design process. The paper discusses the qualities that a formal infrastructure for managing system design should have. The paper also describes an implementation of this infrastructure called DesignMaker, implemented in the Ptolemy environment, which manages the flow of tool invocations in an efficient manner using a graphical file dependency mechanism.

  4. Computer Series, 108. Computer Simulation of Chemical Equilibrium.

    ERIC Educational Resources Information Center

    Cullen, John F., Jr.

    1989-01-01

    Presented is a computer simulation called "The Great Chemical Bead Game" which can be used to teach the concepts of equilibrium and kinetics to introductory chemistry students more clearly than through an experiment. Discussed are the rules of the game, the application of rate laws and graphical analysis. (CW)

  5. Enabling Computational Technologies for Terascale Scientific Simulations

    SciTech Connect

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.
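    The notion of scalability used here ("how total computational work grows with problem size") is commonly quantified as weak-scaling efficiency. A minimal illustration (a common formulation assumed here, not taken from the report):

    ```python
    def weak_scaling_efficiency(time_1proc, time_nproc):
        """Weak-scaling efficiency: with the problem size grown in proportion
        to the processor count, an ideally scalable algorithm keeps the
        runtime flat, giving an efficiency of 1.0. Values below 1.0 reflect
        communication and algorithmic overheads."""
        return time_1proc / time_nproc

    # Example: runtime grows from 100 s on 1 node to 125 s on 64 nodes
    # (with 64x the unknowns), giving an efficiency of 0.8.
    eff = weak_scaling_efficiency(100.0, 125.0)
    ```

    Multigrid preconditioners are attractive in exactly this regime because their iteration counts stay nearly constant as the mesh is refined, keeping this efficiency high.
    
    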

  6. Computer simulation of breathing systems for divers

    SciTech Connect

    Sexton, P.G.; Nuckols, M.L.

    1983-02-01

    A powerful new tool for the analysis and design of underwater breathing gas systems is being developed. A versatile computer simulator is described which makes possible the modular "construction" of any conceivable breathing gas system from computer memory-resident components. The analysis of a typical breathing gas system is demonstrated using this simulation technique, and the effects of system modifications on performance of the breathing system are shown. This modeling technique will ultimately serve as the foundation for a proposed breathing system simulator under development by the Navy. The marriage of this computer modeling technique with an interactive graphics system will provide the designer with an efficient, cost-effective tool for the development of new and improved diving systems.

  7. Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  8. Structural Composites Corrosive Management by Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Minnetyan, Levon

    2006-01-01

    A simulation of corrosive management on polymer composites durability is presented. The corrosive environment is assumed to manage the polymer composite degradation on a ply-by-ply basis. The degradation is correlated with a measured pH factor and is represented by voids, temperature, and moisture which vary parabolically for voids and linearly for temperature and moisture through the laminate thickness. The simulation is performed by a computational composite mechanics computer code which includes micro, macro, combined stress failure, and laminate theories. This accounts for starting the simulation from constitutive material properties and up to the laminate scale which exposes the laminate to the corrosive environment. Results obtained for one laminate indicate that the ply-by-ply managed degradation degrades the laminate to the last one or the last several plies. Results also demonstrate that the simulation is applicable to other polymer composite systems as well.

  9. Learning features in computer simulation skills training.

    PubMed

    Johannesson, Eva; Olsson, Mats; Petersson, Göran; Silén, Charlotte

    2010-09-01

    New simulation tools imply new opportunities to teach skills and train health care professionals. The aim of this study was to investigate the learning gained from computer simulation skills training. The study was designed for optimal educational settings, which benefit student-centred learning. Twenty-four second year undergraduate nursing students practised intravenous catheterization with the computer simulation program CathSim. Questionnaires were answered before and after the skills training, and after the skills examination. When using CathSim, the students appreciated the variation in patient cases, the immediate feedback, and a better understanding of anatomy, but they missed having an arm model to hold. We concluded that CathSim was useful in the students' learning process and skills training when appropriately integrated into the curriculum. Learning features to be aware of when organizing curricula with simulators are motivation, realism, variation, meaningfulness and feedback. PMID:20015690

  10. Agent-based services for B2B electronic commerce

    NASA Astrophysics Data System (ADS)

    Fong, Elizabeth; Ivezic, Nenad; Rhodes, Tom; Peng, Yun

    2000-12-01

    The potential of agent-based systems has not been realized yet, in part, because of the lack of understanding of how the agent technology supports industrial needs and emerging standards. The area of business-to-business electronic commerce (b2b e-commerce) is one of the most rapidly developing sectors of industry with huge impact on manufacturing practices. In this paper, we investigate the current state of agent technology and the feasibility of applying agent-based computing to b2b e-commerce in the circuit board manufacturing sector. We identify critical tasks and opportunities in the b2b e-commerce area where agent-based services can best be deployed. We describe an implemented agent-based prototype system to facilitate the bidding process for printed circuit board manufacturing and assembly. These activities are taking place within the Internet Commerce for Manufacturing (ICM) project, the NIST-sponsored project working with industry to create an environment where small manufacturers of mechanical and electronic components may participate competitively in virtual enterprises that manufacture printed circuit assemblies.

  11. Task simulation in computer-based training

    SciTech Connect

    Gardner, P.R.

    1988-02-01

    Westinghouse Hanford Company (WHC) makes extensive use of job-task simulations in company-developed computer-based training (CBT) courseware. This courseware is different from most others because it does not simulate process control machinery or other computer programs; instead, the WHC Exercises model day-to-day tasks such as physical work preparations, progress, and incident handling. These Exercises provide a higher level of motivation and enable the testing of more complex patterns of behavior than those typically measured by multiple-choice and short questions. Examples from the WHC Radiation Safety and Crane Safety courses will be used as illustrations. 3 refs.

  12. Student Choices when Learning with Computer Simulations

    NASA Astrophysics Data System (ADS)

    Podolefsky, Noah S.; Adams, Wendy K.; Wieman, Carl E.

    2009-11-01

    We examine student choices while using PhET computer simulations (sims) to learn physics content. In interviews, students were given questions from the Force Concept Inventory (FCI) and were allowed to choose from 12 computer simulations in order to answer these questions. We investigate students' choices when answering FCI questions with sims. We find that while students initially choose sims that match problem situations at a surface level, deeper connections may be noticed by students later on. These results inform us on how people may choose education resources when learning on their own.

  13. Computer simulation: A modern day crystal ball?

    NASA Technical Reports Server (NTRS)

    Sham, Michael; Siprelle, Andrew

    1994-01-01

    It has long been the desire of managers to be able to look into the future and predict the outcome of decisions. With the advent of computer simulation and the tremendous capability provided by personal computers, that desire can now be realized. This paper presents an overview of computer simulation and modeling, and discusses the capabilities of Extend. Extend is an iconic-driven Macintosh-based software tool that brings the power of simulation to the average computer user. An example of an Extend based model is presented in the form of the Space Transportation System (STS) Processing Model. The STS Processing Model produces eight shuttle launches per year, yet it takes only about ten minutes to run. In addition, statistical data such as facility utilization, wait times, and processing bottlenecks are produced. The addition or deletion of resources, such as orbiters or facilities, can be easily modeled and their impact analyzed. Through the use of computer simulation, it is possible to look into the future to see the impact of today's decisions.

  14. Virtual ambulatory care. Computer simulation applications.

    PubMed

    Zilm, Frank; Culp, Kristyna; Dorney, Beverley

    2003-01-01

    Computer simulation modeling has evolved during the past twenty years into an effective tool for analyzing and planning ambulatory care facilities. This article explains the use of this tool in three case-study, ambulatory care settings--a GI lab, holding beds for a cardiac catheterization laboratory, and in emergency services. These examples also illustrate the use of three software packages currently available: MedModel, Simul8, and WITNESS. PMID:12545512

  15. Perspective: Computer simulations of long time dynamics.

    PubMed

    Elber, Ron

    2016-02-14

    Atomically detailed computer simulations of complex molecular events attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances. PMID:26874473

  16. Perspective: Computer simulations of long time dynamics

    PubMed Central

    Elber, Ron

    2016-01-01

    Atomically detailed computer simulations of complex molecular events attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances. PMID:26874473

  17. Uncertainty and error in computational simulations

    SciTech Connect

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

    The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining, and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.

  18. Simulating physical phenomena with a quantum computer

    NASA Astrophysics Data System (ADS)

    Ortiz, Gerardo

    2003-03-01

    In a keynote speech at MIT in 1981 Richard Feynman raised some provocative questions in connection to the exact simulation of physical systems using a special device named a "quantum computer" (QC). At the time it was known that deterministic simulations of quantum phenomena on classical computers required a number of resources that scaled exponentially with the number of degrees of freedom, and also that the probabilistic simulation of certain quantum problems was limited by the so-called sign or phase problem, a problem believed to be of exponential complexity. Such a QC was intended to mimic physical processes exactly the same way as Nature does. Certainly, remarks coming from such an influential figure generated widespread interest in these ideas, and today after 21 years there are still some open questions. What kind of physical phenomena can be simulated with a QC? How? And what are its limitations? Addressing and attempting to answer these questions is what this talk is about. Definitively, the goal of physics simulation using controllable quantum systems ("physics imitation") is to exploit quantum laws to advantage, and thus accomplish efficient imitation. Fundamental is the connection between a quantum computational model and a physical system by transformations of operator algebras. This concept is a necessary one because in Quantum Mechanics each physical system is naturally associated with a language of operators and thus can be considered as a possible model of quantum computation. The remarkable result is that an arbitrary physical system is naturally simulatable by another physical system (or QC) whenever a "dictionary" between the two operator algebras exists. I will explain these concepts and address some of Feynman's concerns regarding the simulation of fermionic systems. Finally, I will illustrate the main ideas by imitating simple physical phenomena borrowed from condensed matter physics using quantum algorithms, and present experimental

  19. Quantitative computer simulations of extraterrestrial processing operations

    NASA Technical Reports Server (NTRS)

    Vincent, T. L.; Nikravesh, P. E.

    1989-01-01

    The automation of a small, solid propellant mixer was studied. Temperature control is under investigation. A numerical simulation of the system is under development and will be tested using different control options. Control system hardware is currently being put into place. The construction of mathematical models and simulation techniques for understanding various engineering processes is also studied. Computer graphics packages were utilized for better visualization of the simulation results. The mechanical mixing of propellants is examined. Simulation of the mixing process is being done to study how one can control for chaotic behavior to meet specified mixing requirements. An experimental mixing chamber is also being built. It will allow visual tracking of particles under mixing. The experimental unit will be used to test ideas from chaos theory, as well as to verify simulation results. This project has applications to extraterrestrial propellant quality and reliability.

  20. Agent-based Modeling with MATSim for Hazards Evacuation Planning

    NASA Astrophysics Data System (ADS)

    Jones, J. M.; Ng, P.; Henry, K.; Peters, J.; Wood, N. J.

    2015-12-01

    Hazard evacuation planning requires robust modeling tools and techniques, such as least cost distance or agent-based modeling, to gain an understanding of a community's potential to reach safety before event (e.g. tsunami) arrival. Least cost distance modeling provides a static view of the evacuation landscape with an estimate of travel times to safety from each location in the hazard space. With this information, practitioners can assess a community's overall ability for timely evacuation. More information may be needed if evacuee congestion creates bottlenecks in the flow patterns. Dynamic movement patterns are best explored with agent-based models that simulate movement of and interaction between individual agents as evacuees through the hazard space, reacting to potential congestion areas along the evacuation route. The multi-agent transport simulation model MATSim is an agent-based modeling framework that can be applied to hazard evacuation planning. Developed jointly by universities in Switzerland and Germany, MATSim is open-source software written in Java and freely available for modification or enhancement. We successfully used MATSim to illustrate tsunami evacuation challenges in two island communities in California, USA, that are impacted by limited escape routes. However, working with MATSim's data preparation, simulation, and visualization modules in an integrated development environment requires a significant investment of time to develop the software expertise to link the modules and run a simulation. To facilitate our evacuation research, we packaged the MATSim modules into a single application tailored to the needs of the hazards community. By exposing the modeling parameters of interest to researchers in an intuitive user interface and hiding the software complexities, we bring agent-based modeling closer to practitioners and provide access to the powerful visual and analytic information that this modeling can provide.
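    The static "least cost distance" view described above amounts to a shortest-travel-time computation from every location to any safe node. A sketch using multi-source Dijkstra on a hypothetical island road network (node names and travel times are made up; MATSim's dynamic agent simulation then layers congestion on top of this):

    ```python
    import heapq

    def time_to_safety(graph, safe_nodes):
        """Multi-source Dijkstra: minimal travel time (s) from every node to
        any safe node. graph: {node: [(neighbor, travel_time_s), ...]}."""
        dist = {n: float("inf") for n in graph}
        pq = []
        for s in safe_nodes:
            dist[s] = 0.0
            heapq.heappush(pq, (0.0, s))
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist[u]:
                continue
            for v, w in graph[u]:
                if d + w < dist[v]:
                    dist[v] = d + w
                    heapq.heappush(pq, (d + w, v))
        return dist

    # Hypothetical road network: nodes A-C, "H" is high ground (safe)
    roads = {
        "A": [("B", 120), ("C", 300)],
        "B": [("A", 120), ("H", 180)],
        "C": [("A", 300), ("H", 600)],
        "H": [("B", 180), ("C", 600)],
    }
    etas = time_to_safety(roads, ["H"])
    print(etas)  # A reaches safety via B in 300 s
    ```

    Comparing these static estimates against the hazard arrival time flags locations where evacuation may not be feasible, which is where agent-based congestion modeling becomes necessary.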

  1. Computer simulation of screw dislocation in aluminum

    NASA Technical Reports Server (NTRS)

    Esterling, D. M.

    1976-01-01

    The atomic structure in a 110 screw dislocation core for aluminum is obtained by computer simulation. The lattice statics technique is employed since it entails no artificially imposed elastic boundary around the defect. The interatomic potential has no adjustable parameters and was derived from pseudopotential theory. The resulting atomic displacements were allowed to relax in all three dimensions.
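    Lattice-statics relaxations of this kind typically start from the isotropic-elasticity displacement field of a screw dislocation, u_z = b*theta/(2*pi), and let the atoms relax from there. A sketch of that analytic field (the Burgers vector magnitude below is illustrative, not taken from the record):

    ```python
    import math

    def screw_displacement(x, y, b):
        """Isotropic-elasticity displacement u_z = b*theta/(2*pi) around a
        screw dislocation along z, Burgers vector magnitude b (units of x, y)."""
        theta = math.atan2(y, x)  # in (-pi, pi]
        return b * theta / (2.0 * math.pi)

    b = 2.86  # illustrative Burgers vector magnitude, angstroms
    # Crossing the branch cut (a full circuit around the line) accumulates
    # one Burgers vector -- the closure failure that defines the dislocation:
    above = screw_displacement(-1.0, 1e-9, b)
    below = screw_displacement(-1.0, -1e-9, b)
    print(above - below)  # approximately b
    ```

    In the lattice-statics technique, this field seeds the initial positions and the interatomic potential then drives the fully three-dimensional core relaxation.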

  2. Eliminating Computational Instability In Multibody Simulations

    NASA Technical Reports Server (NTRS)

    Watts, Gaines L.

    1994-01-01

    TWOBODY implements improved version of Lagrange multiplier method. Program utilizes programming technique eliminating computational instability in multibody simulations in which Lagrange multipliers used. In technique, one uses constraint equations, instead of integration, to determine coordinates that are not independent. To illustrate technique, it includes simple mathematical model of solid rocket booster and parachute connected by frictionless swivel. Written in FORTRAN 77.
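    The technique described, solving the constraint equations for dependent coordinates instead of integrating them, can be sketched with a planar pendulum on a rigid rod: integrate only the independent coordinate theta, and recover (x, y) algebraically so the constraint x^2 + y^2 = L^2 can never drift. (This toy system stands in for the booster-parachute model, which is not reproduced here.)

    ```python
    import math

    # Rigid-rod pendulum: theta is the independent coordinate.
    L, g, dt = 1.0, 9.81, 1e-3
    theta, omega = 0.5, 0.0
    for _ in range(5000):                  # semi-implicit Euler integration
        omega += -(g / L) * math.sin(theta) * dt
        theta += omega * dt

    # Dependent coordinates from the constraint, not from integration:
    x, y = L * math.sin(theta), -L * math.cos(theta)
    print(abs(x * x + y * y - L * L))  # zero to machine precision: no drift
    ```

    Integrating x and y directly with a Lagrange multiplier would let the rod length wander numerically; enforcing the constraint algebraically removes that instability by construction.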

  3. Macromod: Computer Simulation For Introductory Economics

    ERIC Educational Resources Information Center

    Ross, Thomas

    1977-01-01

    The Macroeconomic model (Macromod) is a computer assisted instruction simulation model designed for introductory economics courses. An evaluation of its utilization at a community college indicates that it yielded a 10 percent to 13 percent greater economic comprehension than lecture classes and that it met with high student approval. (DC)

  4. Designing Online Scaffolds for Interactive Computer Simulation

    ERIC Educational Resources Information Center

    Chen, Ching-Huei; Wu, I-Chia; Jen, Fen-Lan

    2013-01-01

    The purpose of this study was to examine the effectiveness of online scaffolds in computer simulation to facilitate students' science learning. We first introduced online scaffolds to assist and model students' science learning and to demonstrate how a system embedded with online scaffolds can be designed and implemented to help high…

  5. Assessing Moderator Variables: Two Computer Simulation Studies.

    ERIC Educational Resources Information Center

    Mason, Craig A.; And Others

    1996-01-01

    A strategy is proposed for conceptualizing moderating relationships based on their type (strictly correlational and classically correlational) and form, whether continuous, noncontinuous, logistic, or quantum. Results of computer simulations comparing three statistical approaches for assessing moderator variables are presented, and advantages of…
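    The standard statistical approach such simulation studies compare against is product-term (moderated) regression: generate data with a known interaction between a predictor and a moderator, then recover it by least squares. A hedged illustration (coefficients, noise level, and sample size are made up):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2000
    x = rng.normal(size=n)                 # predictor
    z = rng.normal(size=n)                 # candidate moderator
    y = 1.0 + 0.5 * x + 0.3 * z + 0.8 * x * z + rng.normal(scale=0.1, size=n)

    # Moderated regression: include the product term x*z in the design matrix
    X = np.column_stack([np.ones(n), x, z, x * z])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(np.round(beta, 2))  # close to [1.0, 0.5, 0.3, 0.8]
    ```

    A nonzero coefficient on the product term is the signature of a moderating relationship; simulation studies like the one abstracted vary the form of that relationship to see when different detection strategies succeed.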

  6. Decision Making in Computer-Simulated Experiments.

    ERIC Educational Resources Information Center

    Suits, J. P.; Lagowski, J. J.

    A set of interactive, computer-simulated experiments was designed to respond to the large range of individual differences in aptitude and reasoning ability generally exhibited by students enrolled in first-semester general chemistry. These experiments give students direct experience in the type of decision making needed in an experimental setting.…

  7. Progress in Computational Simulation of Earthquakes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert

    2006-01-01

    GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation to the tectonic scale. The development of GeoFEST(P) has involved coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations on workstations using a few tens of thousands of stress and displacement finite elements can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors (see figure).

  8. Computation applied to particle accelerator simulations

    SciTech Connect

    Herrmannsfeldt, W.B.; Yan, Y.T.

    1991-07-01

    The rapid growth in the power of large-scale computers has had a revolutionary effect on the study of charged-particle accelerators that is similar to the impact of smaller computers on everyday life. Before an accelerator is built, it is now the absolute rule to simulate every component and subsystem by computer to establish modes of operation and tolerances. We will bypass the important and fruitful areas of control and operation and consider only application to design and diagnostic interpretation. Applications of computers can be divided into separate categories including: component design, system design, stability studies, cost optimization, and operating condition simulation. For the purposes of this report, we will choose a few examples taken from the above categories to illustrate the methods and we will discuss the significance of the work to the project, and also briefly discuss the accelerator project itself. The examples that will be discussed are: (1) the tracking analysis done for the main ring of the Superconducting Super Collider, which contributed to the analysis which ultimately resulted in changing the dipole coil diameter to 5 cm from the earlier design for a 4-cm coil-diameter dipole magnet; (2) the design of accelerator structures for electron-positron linear colliders and circular colliding-beam systems (B-factories); (3) simulation of the wake fields from multibunch electron beams for linear colliders; and (4) particle-in-cell simulation of space-charge-dominated beams for an experimental linear induction accelerator for Heavy Ion Fusion. 8 refs., 9 figs.
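    Tracking analysis of a storage ring, the first example above, in its simplest linear form follows a particle turn by turn through transfer matrices. A sketch of a thin-lens FODO cell with hypothetical focal length and drift length, checking the standard stability criterion |Tr M| < 2:

    ```python
    import numpy as np

    def thin_quad(f):
        """Thin-lens quadrupole of focal length f (metres) in (x, x') space."""
        return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

    def drift(L):
        return np.array([[1.0, L], [0.0, 1.0]])

    # Hypothetical FODO cell: focusing quad, drift, defocusing quad, drift
    f, L = 5.0, 4.0
    M = drift(L) @ thin_quad(-f) @ drift(L) @ thin_quad(f)

    # Linear stability requires |Tr M| < 2 (a real betatron phase advance)
    print(abs(np.trace(M)))  # 2 - L**2/f**2 = 1.36 for these parameters

    # Track a particle (x, x') through 500 cells; the amplitude stays bounded
    v = np.array([1e-3, 0.0])
    for _ in range(500):
        v = M @ v
    print(abs(v[0]) < 1e-2)  # True
    ```

    Real tracking codes for the SSC added nonlinear magnet errors on top of this linear map; it is precisely those long-term nonlinear tracking studies that informed the dipole coil-diameter decision.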

  9. Factors Promoting Engaged Exploration with Computer Simulations

    ERIC Educational Resources Information Center

    Podolefsky, Noah S.; Perkins, Katherine K.; Adams, Wendy K.

    2010-01-01

    This paper extends prior research on student use of computer simulations (sims) to engage with and explore science topics, in this case wave interference. We describe engaged exploration; a process that involves students actively interacting with educational materials, sense making, and exploring primarily via their own questioning. We analyze…

  10. Spiking network simulation code for petascale computers.

    PubMed

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682
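    The "double collapse" the abstract describes, few synapse types per source on a node and few synapses of each type, suggests storing each compute node's target list grouped by synapse type so that homogeneous parameters are kept once per group. A toy Python sketch of that layout (the actual NEST implementation achieves this with C++ metaprogramming, not dictionaries):

    ```python
    from collections import defaultdict

    class NodeTargetTable:
        """Per-compute-node synapse store: targets grouped first by source
        neuron, then by synapse type. A conceptual sketch only."""
        def __init__(self):
            self.by_source = defaultdict(lambda: defaultdict(list))

        def add_synapse(self, source, syn_type, target, weight):
            self.by_source[source][syn_type].append((target, weight))

        def deliver(self, source):
            """Yield (syn_type, target, weight) for a spike from `source`."""
            for syn_type, targets in self.by_source[source].items():
                for target, weight in targets:
                    yield syn_type, target, weight

    tbl = NodeTargetTable()
    tbl.add_synapse(7, "static", 3, 0.5)
    tbl.add_synapse(7, "stdp", 9, 0.1)
    print(sorted(tbl.deliver(7)))
    ```

    Grouping by type is what lets per-type bookkeeping (delays, plasticity parameters) be stored once per group rather than once per synapse, which is the memory saving that matters at 10,000 synapses per neuron.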

  11. Spiking network simulation code for petascale computers

    PubMed Central

    Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M.; Plesser, Hans E.; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz

    2014-01-01

    Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today. PMID:25346682

  12. Computer simulations of WIGWAM underwater experiment

    SciTech Connect

    Kamegai, Minao; White, J.W.

    1993-11-01

    We performed computer simulations of the WIGWAM underwater experiment with a 2-D hydro-code, CALE. First, we calculated the bubble pulse and the signal strength at the closest gauge in one-dimensional geometry. The calculation shows excellent agreement with the measured data. Next, we made two-dimensional simulations of WIGWAM applying the gravity over-pressure, and calculated the signals at three selected gauge locations where measurements were recorded. The computed peak pressures at those gauge locations come well within the 15% experimental error bars. The signal at the farthest gauge is of the order of 200 bars. This is significant, because at this pressure the CALE output can be linked to a hydro-acoustics computer program, NPE Code (Nonlinear Progressive Wave-equation Code), to analyze the long distance propagation of acoustical signals from the underwater explosions on a global scale.

  13. Computer simulation of underwater nuclear effects

    SciTech Connect

    Kamegai, M.

    1987-01-30

    We investigated underwater nuclear effects by computer simulations. First, we computed a long distance wave propagation in water by the 1-D LASNEX code by modeling the energy source and the underwater environment. The pressure-distance data were calculated for two quite different yields; pressures range from 300 GPa to 15 MPa. They were found to be in good agreement with Snay's theoretical points and the Wigwam measurements. The computed data also agree with the similarity solution at high pressures and the empirical equation at low pressures. After completion of the 1-D study, we investigated a free surface effect commonly referred to as irregular surface rarefaction by applying two hydrocodes (LASNEX and ALE), linked at the appropriate time. Using these codes, we simulated near-surface explosions for three depths of burst (3 m, 21 m and 66.5 m), which represent the strong, intermediate, and weak surface shocks, respectively.

  14. Computational algorithms for simulations in atmospheric optics.

    PubMed

    Konyaev, P A; Lukin, V P

    2016-04-20

    A computer simulation technique for atmospheric and adaptive optics based on parallel programing is discussed. A parallel propagation algorithm is designed and a modified spectral-phase method for computer generation of 2D time-variant random fields is developed. Temporal power spectra of Laguerre-Gaussian beam fluctuations are considered as an example to illustrate the applications discussed. Implementation of the proposed algorithms using Intel MKL and IPP libraries and NVIDIA CUDA technology is shown to be very fast and accurate. The hardware system for the computer simulation is an off-the-shelf desktop with an Intel Core i7-4790K CPU operating at a turbo-speed frequency up to 5 GHz and an NVIDIA GeForce GTX-960 graphics accelerator with 1024 1.5 GHz processors. PMID:27140113
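    The spectral method for generating 2D random fields colors complex white Gaussian noise in Fourier space with the square root of the target power spectrum. A hedged sketch using a Kolmogorov-like phase spectrum, illustrative of the general approach rather than the authors' modified spectral-phase algorithm, and without their GPU/MKL acceleration:

    ```python
    import numpy as np

    def phase_screen(n, r0, dx, seed=0):
        """2D random phase screen: filter white noise with sqrt of a
        Kolmogorov-like PSD ~ 0.023 r0^(-5/3) f^(-11/3). Sketch only;
        normalization is schematic."""
        rng = np.random.default_rng(seed)
        fx = np.fft.fftfreq(n, d=dx)
        f = np.hypot(*np.meshgrid(fx, fx))
        # Zero out the piston (f = 0) term by sending it to infinite f
        psd = 0.023 * r0 ** (-5.0 / 3.0) * np.where(f > 0, f, np.inf) ** (-11.0 / 3.0)
        noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
        return np.fft.ifft2(noise * np.sqrt(psd) / dx).real  # radians

    scr = phase_screen(128, r0=0.1, dx=0.01)
    print(scr.shape)  # (128, 128)
    ```

    Because the heavy operations are FFTs and elementwise array arithmetic, this formulation maps naturally onto the vendor FFT libraries and GPU kernels the paper benchmarks.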

  15. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.; Halicioglu, M. T.

    1983-01-01

    Adequate computer methods, based on interactions between discrete particles, provide information leading to an atomic level understanding of various physical processes. The success of these simulation methods, however, is related to the accuracy of the potential energy function representing the interactions among the particles. The development of a potential energy function for crystalline SiO2 forms that can be employed in lengthy computer modelling procedures was investigated. In many of the simulation methods which deal with discrete particles, semiempirical two body potentials were employed to analyze energy and structure related properties of the system. Many body interactions are required for a proper representation of the total energy for many systems. Many body interactions for simulations based on discrete particles are discussed.

  16. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon

    2004-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  17. Computational Simulation of Composite Structural Fatigue

    NASA Technical Reports Server (NTRS)

    Minnetyan, Levon; Chamis, Christos C. (Technical Monitor)

    2005-01-01

    Progressive damage and fracture of composite structures subjected to monotonically increasing static, tension-tension cyclic, pressurization, and flexural cyclic loading are evaluated via computational simulation. Constituent material properties, stress and strain limits are scaled up to the structure level to evaluate the overall damage and fracture propagation for composites. Damage initiation, growth, accumulation, and propagation to fracture due to monotonically increasing static and cyclic loads are included in the simulations. Results show the number of cycles to failure at different temperatures and the damage progression sequence during different degradation stages. A procedure is outlined for use of computational simulation data in the assessment of damage tolerance, determination of sensitive parameters affecting fracture, and interpretation of results with insight for design decisions.

  18. Agent-based model for the h-index - exact solution

    NASA Astrophysics Data System (ADS)

    Żogała-Siudem, Barbara; Siudem, Grzegorz; Cena, Anna; Gagolewski, Marek

    2016-01-01

    Hirsch's h-index is perhaps the most popular citation-based measure of scientific excellence. In 2013, Ionescu and Chopard proposed an agent-based model describing a process for generating publications and citations in an abstract scientific community [G. Ionescu, B. Chopard, Eur. Phys. J. B 86, 426 (2013)]. Within such a framework, one may simulate a scientist's activity, and - by extension - investigate the whole community of researchers. Even though the Ionescu and Chopard model predicts the h-index quite well, the authors provided a solution based solely on simulations. In this paper, we complete their results with exact, analytic formulas. What is more, by considering a simplified version of the Ionescu-Chopard model, we obtained a compact, easy to compute formula for the h-index. The derived approximate and exact solutions are investigated on simulated and real-world data sets.
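    The quantity being modeled is simple to compute from a citation list: the h-index is the largest h such that at least h papers have at least h citations each. A direct implementation (this is the standard definition, not the paper's analytic formula):

    ```python
    def h_index(citations):
        """Largest h such that h papers each have at least h citations."""
        cites = sorted(citations, reverse=True)
        h = 0
        while h < len(cites) and cites[h] >= h + 1:
            h += 1
        return h

    print(h_index([10, 8, 5, 4, 3]))  # 4
    print(h_index([25, 8, 5, 3, 3]))  # 3
    print(h_index([]))                # 0
    ```

    Agent-based models like Ionescu-Chopard generate the citation counts stochastically; the analytic results in this paper predict the expected value of this statistic without running the simulation.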

  19. Computer simulation of underwater nuclear events

    SciTech Connect

    Kamegai, M.

    1986-09-01

    This report describes the computer simulation of two underwater nuclear explosions, Operation Wigwam and a modern hypothetical explosion of greater yield. The computer simulations were done in spherical geometry with the LASNEX computer code. Comparison of the LASNEX calculation with Snay's analytical results and the Wigwam measurements shows that agreement in the shock pressure versus range in water is better than 5%. The results of the calculations are also consistent with the cube root scaling law for an underwater blast wave. The time constant of the wave front was determined from the wave profiles taken at several points. The LASNEX time-constant calculation and Snay's theoretical results agree to within 20%. A time-constant-versus-range relation empirically fitted by Snay is valid only within a limited range at low pressures, whereas a time-constant formula based on Sedov's similarity solution holds at very high pressures. This leaves the intermediate pressure range with neither an empirical nor a theoretical formula for the time constant. These one-dimensional simulations demonstrate applicability of the computer code to investigations of this nature, and justify the use of this technique for more complex two-dimensional problems, namely, surface effects on underwater nuclear explosions. 16 refs., 8 figs., 2 tabs.
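    The cube-root scaling law invoked above says that for an underwater blast wave, peak pressure depends on yield W and range R only through the scaled range R / W^(1/3). A minimal sketch of that similarity relation (purely illustrative numbers, not Wigwam data):

    ```python
    def scaled_range(range_m, yield_kt):
        """Cube-root (Hopkinson) scaling: blast-wave similarity implies peak
        pressure is a function of R / W**(1/3) alone."""
        return range_m / yield_kt ** (1.0 / 3.0)

    # A shot with 8x the yield produces the same peak pressure at 2x the range:
    print(scaled_range(1000.0, 1.0) == scaled_range(2000.0, 8.0))  # True
    ```

    Checking that the simulated pressure-range curves for different yields collapse onto one curve in scaled range is exactly the consistency test the report describes.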

  20. Computational Challenges in Nuclear Weapons Simulation

    SciTech Connect

    McMillain, C F; Adams, T F; McCoy, M G; Christensen, R B; Pudliner, B S; Zika, M R; Brantley, P S; Vetter, J S; May, J M

    2003-08-29

    After a decade of experience, the Stockpile Stewardship Program continues to ensure the safety, security and reliability of the nation's nuclear weapons. The Advanced Simulation and Computing (ASCI) program was established to provide leading edge, high-end simulation capabilities needed to meet the program's assessment and certification requirements. The great challenge of this program lies in developing the tools and resources necessary for the complex, highly coupled, multi-physics calculations required to simulate nuclear weapons. This paper describes the hardware and software environment we have applied to fulfill our nuclear weapons responsibilities. It also presents the characteristics of our algorithms and codes, especially as they relate to supercomputing resource capabilities and requirements. It then addresses impediments to the development and application of nuclear weapon simulation software and hardware and concludes with a summary of observations and recommendations on an approach for working with industry and government agencies to address these impediments.

  1. Application of computer simulators in population genetics.

    PubMed

    Feng, Gao; Haipeng, Li

    2016-08-01

    The genomes of more and more organisms have been sequenced due to the advances in next-generation sequencing technologies. As a powerful tool, computer simulators play a critical role in studying the genome-wide DNA polymorphism pattern. Simulations can be performed both forwards-in-time and backwards-in-time, which complement each other and are suitable for meeting different needs, such as studying the effect of evolutionary dynamics, the estimation of parameters, and the validation of evolutionary hypotheses as well as new methods. In this review, we briefly introduce the theoretical framework of population genetics and provide a detailed comparison of 32 simulators published over the last ten years. The future development of new simulators is also discussed. PMID:27531609
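    The simplest forwards-in-time simulation is the neutral Wright-Fisher model: each generation, the allele count among 2N copies is resampled binomially from the previous generation's frequency. A sketch (parameters are illustrative):

    ```python
    import random

    def wright_fisher(N, p0, generations, seed=42):
        """Neutral Wright-Fisher drift for a diploid population of size N:
        2N allele copies resampled binomially each generation."""
        rng = random.Random(seed)
        p, traj = p0, [p0]
        for _ in range(generations):
            count = sum(rng.random() < p for _ in range(2 * N))  # binomial draw
            p = count / (2 * N)
            traj.append(p)
            if p in (0.0, 1.0):  # fixation or loss: drift is absorbing
                break
        return traj

    traj = wright_fisher(N=100, p0=0.5, generations=500)
    print(len(traj), traj[-1])  # trajectory ends at fixation, loss, or 500 gens
    ```

    Backwards-in-time (coalescent) simulators invert this picture, tracing sampled lineages to common ancestors, which is why the two families complement each other as the review notes.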

  2. Computer Simulations Improve University Instructional Laboratories

    PubMed Central

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599

  3. Computational simulation of Faraday probe measurements

    NASA Astrophysics Data System (ADS)

    Boerner, Jeremiah J.

    Electric propulsion devices, including ion thrusters and Hall thrusters, are becoming increasingly popular for long-duration space missions. Ground-based experimental testing of such devices is performed in vacuum chambers, which develop an unavoidable background gas due to pumping limitations and facility leakage. Besides directly altering the operating environment, the background gas may indirectly affect the performance of immersed plasma probe diagnostics. This work focuses on computational modeling research conducted to evaluate the performance of a current-collecting Faraday probe. Initial findings from one-dimensional analytical models of plasma sheaths are used as reference cases for subsequent modeling. A two-dimensional, axisymmetric, hybrid electron-fluid and Particle-In-Cell computational code is used for extensive simulation of the plasma flow around a representative Faraday probe geometry. The hybrid fluid-PIC code is used to simulate a range of inflowing plasma conditions, from a simple ion beam consistent with one-dimensional models to a multiple-component plasma representative of a low-power Hall thruster plume. These simulations produce profiles of plasma properties and simulated current measurements at the probe surface. Interpretation of the simulation results leads to recommendations for probe design and experimental techniques. Significant contributions of this work include the development and use of two new non-neutral detailed electron fluid models and the recent incorporation of multigrid capabilities.
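    The one-dimensional sheath models used as reference cases rest on two standard quantities: the electron Debye length, which sets the sheath scale, and the Bohm ion saturation current collected by the probe. A hedged sketch with hypothetical plume-like parameters (density, electron temperature, and probe area below are made up; the 0.61 presheath factor is the textbook value):

    ```python
    import math

    E_CHARGE = 1.602e-19   # C
    EPS0 = 8.854e-12       # F/m
    M_XE = 2.18e-25        # kg, xenon ion (a typical Hall-thruster propellant)

    def debye_length(n_e, Te_eV):
        """Electron Debye length: sqrt(eps0 * kTe / (n_e * e^2)), Te in eV."""
        return math.sqrt(EPS0 * Te_eV / (n_e * E_CHARGE))

    def bohm_current(n_e, Te_eV, area_m2, m_ion=M_XE):
        """Ion saturation current: I = 0.61 * n * e * u_B * A,
        with Bohm speed u_B = sqrt(e*Te/m_i)."""
        u_bohm = math.sqrt(E_CHARGE * Te_eV / m_ion)
        return 0.61 * n_e * E_CHARGE * u_bohm * area_m2

    # Hypothetical conditions: n = 1e16 m^-3, Te = 2 eV, 1 cm^2 probe face
    print(f"{debye_length(1e16, 2.0) * 1e3:.3f} mm")
    print(f"{bohm_current(1e16, 2.0, 1e-4) * 1e6:.1f} uA")
    ```

    Comparing the full 2D hybrid-PIC current predictions against this 1D estimate is the kind of reference check the dissertation describes.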

  4. Brief introductory guide to agent-based modeling and an illustration from urban health research.

    PubMed

    Auchincloss, Amy H; Garcia, Leandro Martin Totaro

    2015-11-01

    There is growing interest among urban health researchers in addressing complex problems using conceptual and computation models from the field of complex systems. Agent-based modeling (ABM) is one computational modeling tool that has received a lot of interest. However, many researchers remain unfamiliar with developing and carrying out an ABM, hindering the understanding and application of it. This paper first presents a brief introductory guide to carrying out a simple agent-based model. Then, the method is illustrated by discussing a previously developed agent-based model, which explored inequalities in diet in the context of urban residential segregation. PMID:26648364

  5. Computer Simulation for Emergency Incident Management

    SciTech Connect

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  6. Memory interface simulator: A computer design aid

    NASA Technical Reports Server (NTRS)

    Taylor, D. S.; Williams, T.; Weatherbee, J. E.

    1972-01-01

    Results are presented of a study conducted with a digital simulation model being used in the design of the Automatically Reconfigurable Modular Multiprocessor System (ARMMS), a candidate computer system for future manned and unmanned space missions. The model simulates the activity involved as instructions are fetched from random-access memory for execution in one of the system's central processing units. A series of model runs measured instruction execution time under various assumptions pertaining to the CPUs and the interface between the CPUs and RAM. Design tradeoffs are presented in the following areas: bus widths, CPU microprogram read-only memory cycle time, multiple instruction fetch, and instruction mix.
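    The bus-width tradeoff studied here can be sketched with a toy latency model, not the ARMMS simulator itself: each fetch takes one memory cycle per bus-width transfer, so widening the bus divides the transfer count (all parameter values below are made up):

    ```python
    import math

    def fetch_time_ns(instr_bits, bus_width_bits, mem_cycle_ns):
        """Instruction fetch latency: one memory cycle per bus-width transfer.
        A toy model of the tradeoff, not the ARMMS simulation itself."""
        transfers = math.ceil(instr_bits / bus_width_bits)
        return transfers * mem_cycle_ns

    # Hypothetical 64-bit instruction, 500 ns memory cycle:
    for width in (16, 32, 64):
        print(width, fetch_time_ns(64, width, 500))  # 2000, 1000, 500 ns
    ```

    A full simulation model layers contention between multiple CPUs and instruction-mix statistics on top of this basic transfer accounting.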

  7. Computer Simulation of the VASIMR Engine

    NASA Technical Reports Server (NTRS)

    Garrison, David

    2005-01-01

    The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed to be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open-source framework. The development of the code will be done in stages, starting with a basic fluid-dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.

  8. Computer Simulation For Design Of TWT's

    NASA Technical Reports Server (NTRS)

    Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard

    1992-01-01

    A three-dimensional finite-element analytical technique facilitates design and fabrication of traveling-wave-tube (TWT) slow-wave structures. Used to perform thermal and mechanical analyses of TWT designed with variety of configurations, geometries, and materials. Using three-dimensional computer analysis, designer able to simulate building and testing of TWT, with consequent substantial saving of time and money. Technique enables detailed look into operation of traveling-wave tubes to help improve performance for future communications systems.

  9. Integrated computer simulation on FIR FEL dynamics

    SciTech Connect

    Furukawa, H.; Kuruma, S.; Imasaki, K.

    1995-12-31

An integrated computer simulation code has been developed to analyze RF-linac FEL dynamics. First, a simulation code for the electron beam acceleration and transport processes in the RF linac (LUNA) was developed to analyze the characteristics of the electron beam in the RF linac and to optimize the linac parameters. Second, a space-time dependent 3D FEL simulation code (Shipout) was developed. Total RF-linac FEL simulations have been performed by feeding the electron beam data from LUNA into Shipout. The number of particles used in a total simulation is approximately 1000, and the CPU time for simulating one round trip is about 1.5 minutes. At ILT/ILE, Osaka, an 8.5 MeV RF linac with a photocathode RF gun is used for FEL oscillation experiments. Using a 2 cm wiggler, FEL oscillation at a wavelength of approximately 46 μm is investigated. Simulations with LUNA using the parameters of an ILT/ILE experiment estimate the pulse shape and the energy spectra of the electron beam at the end of the linac; the pulse shape has a sharp rise and decays slowly as a function of time. Total RF-linac FEL simulations with the same parameters estimate how the start-up of the FEL oscillations depends on this pulse shape. Coherent spontaneous emission effects and a quick start-up of FEL oscillations have been observed in these total simulations.

  10. Computer simulations in the science classroom

    NASA Astrophysics Data System (ADS)

    Richards, John; Barowy, William; Levin, Dov

    1992-03-01

In this paper we describe software for science instruction that is based upon a constructivist epistemology of learning. From a constructivist perspective, the process of learning is viewed as an active construction of knowledge, rather than a passive reception of information. The computer has the potential to provide an environment in which students can explore their understanding and better construct scientific knowledge. The Explorer is an interactive environment that integrates animated computer models with analytic capabilities for learning and teaching science. The system includes graphs, a spreadsheet, scripting, and interactive tools. During formative evaluation of Explorer in the classroom, we have focused on the function and effectiveness of computer models in teaching science. Models have helped students relate theory to experiment when used in conjunction with hands-on activities and when the simulation addressed students' naive understanding of the phenomena. Two classroom examples illustrate our findings. The first is based on the dynamics of colliding objects. The second describes a class modeling the function of simple electric circuits. The simulations bridge between phenomena and theory by providing an abstract representation on which students may make measurements. Simulations based on scientific theory help to provide a set of interrelated experiences that challenge students' informal understanding of the science.

  11. Metal matrix composites microfracture: Computational simulation

    NASA Technical Reports Server (NTRS)

    Mital, Subodh K.; Caruso, John J.; Chamis, Christos C.

    1990-01-01

Fiber/matrix fracture and fiber-matrix interface debonding in a metal matrix composite (MMC) are computationally simulated. These simulations are part of a research activity to develop computational methods for microfracture, microfracture propagation and fracture toughness of metal matrix composites. The three-dimensional finite element model used in the simulation consists of a group of nine unidirectional fibers in a three-by-three unit cell array of SiC/Ti15 metal matrix composite with a fiber volume ratio of 0.35. This computational procedure is used to predict the fracture process and establish the hierarchy of fracture modes based on strain energy release rate. It is also used to predict stress redistribution to the surrounding matrix and fibers due to initial and progressive fracture of fiber/matrix and due to debonding of the fiber-matrix interface. Microfracture results for various loading cases such as longitudinal, transverse, shear and bending are presented and discussed. Step-by-step procedures are outlined to evaluate composite microfracture for a given composite system.

  12. Accelerating Climate Simulations Through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPUs) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) an identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with InfiniBand), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approximately 10% network overhead.

  13. Agent-based models of financial markets

    NASA Astrophysics Data System (ADS)

    Samanidou, E.; Zschischang, E.; Stauffer, D.; Lux, T.

    2007-03-01

This review deals with several microscopic ('agent-based') models of financial markets which have been studied by economists and physicists over the last decade: Kim-Markowitz, Levy-Levy-Solomon, Cont-Bouchaud, Solomon-Weisbuch, Lux-Marchesi, Donangelo-Sneppen and Solomon-Levy-Huang. After an overview of simulation approaches in financial economics, we first give a summary of the Donangelo-Sneppen model of monetary exchange and compare it with related models in the economics literature. Our selective review then outlines the main ingredients of some influential early models of multi-agent dynamics in financial markets (Kim-Markowitz, Levy-Levy-Solomon). As will be seen, these contributions draw their inspiration from the complex appearance of investors' interactions in real-life markets. Their main aim is to reproduce (and, thereby, provide possible explanations for) the spectacular bubbles and crashes seen in certain historical episodes, but they lack (like almost all the work before 1998 or so) a perspective in terms of the universal statistical features of financial time series. In fact, awareness of a set of such regularities (power-law tails of the distribution of returns, temporal scaling of volatility) only gradually appeared over the nineties. With the more precise description of the formerly relatively vague characteristics (e.g. moving from the notion of fat tails to the more concrete one of a power law with index around three), it became clear that financial market dynamics give rise to some kind of universal scaling law. Given the similarities with scaling laws for other systems with many interacting sub-units, exploring financial markets as multi-agent systems appeared to be a natural consequence. This topic has been pursued by quite a number of contributions appearing in both the physics and economics literature since the late nineties. From the wealth of different flavours of multi-agent models that have appeared up to now, we discuss the Cont
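The herding mechanism behind percolation-style models such as Cont-Bouchaud can be sketched in a few lines: traders are linked at random, connected clusters trade as one unit, and the return is proportional to the net demand. The parameter names, the union-find bookkeeping, and all numeric values below are illustrative, not taken from any of the reviewed papers.

```python
import random

def cont_bouchaud_step(n_traders=1000, c=1.0, activity=0.05, rng=None):
    """One step of a simplified Cont-Bouchaud-style herding model.

    Traders are linked at random (roughly an Erdos-Renyi graph with mean
    degree c); each connected cluster independently buys (+1), sells (-1),
    or stays inactive, and the return is proportional to net demand.
    """
    rng = rng or random.Random()
    parent = list(range(n_traders))

    def find(i):
        # Union-find with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Sprinkle random links: expected number of edges = c * n / 2.
    for _ in range(int(c * n_traders / 2)):
        a, b = rng.randrange(n_traders), rng.randrange(n_traders)
        parent[find(a)] = find(b)

    # Tally cluster sizes.
    sizes = {}
    for i in range(n_traders):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1

    # Each cluster trades with probability `activity`, direction +/-1.
    demand = 0
    for s in sizes.values():
        u = rng.random()
        if u < activity:
            demand += s
        elif u < 2 * activity:
            demand -= s
    return demand / n_traders  # proportional to the log-return
```

Iterating this step near the percolation threshold (c ≈ 1) produces a return series whose large moves come from large coordinated clusters, the qualitative source of fat tails in this model family.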

  14. A computer simulation of chromosomal instability

    NASA Astrophysics Data System (ADS)

    Goodwin, E.; Cornforth, M.

    The transformation of a normal cell into a cancerous growth can be described as a process of mutation and selection occurring within the context of clonal expansion. Radiation, in addition to initial DNA damage, induces a persistent and still poorly understood genomic instability process that contributes to the mutational burden. It will be essential to include a quantitative description of this phenomenon in any attempt at science-based risk assessment. Monte Carlo computer simulations are a relatively simple way to model processes that are characterized by an element of randomness. A properly constructed simulation can capture the essence of a phenomenon that, as is often the case in biology, can be extraordinarily complex, and can do so even though the phenomenon itself is incompletely understood. A simple computer simulation of one manifestation of genomic instability known as chromosomal instability will be presented. The model simulates clonal expansion of a single chromosomally unstable cell into a colony. Instability is characterized by a single parameter, the rate of chromosomal rearrangement. With each new chromosome aberration, a unique subclone arises (subclones are defined as having a unique karyotype). The subclone initially has just one cell, but it can expand with cell division if the aberration is not lethal. The computer program automatically keeps track of the number of subclones within the expanding colony, and the number of cells within each subclone. Because chromosome aberrations kill some cells during colony growth, colonies arising from unstable cells tend to be smaller than those arising from stable cells. For any chosen level of instability, the computer program calculates the mean number of cells per colony averaged over many runs. These output should prove useful for investigating how such radiobiological phenomena as slow growth colonies, increased doubling time, and delayed cell death depend on chromosomal instability. Also of
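A minimal Monte Carlo version of the clonal-expansion scheme described above can be written directly from the abstract: one instability parameter, a new subclone per non-lethal rearrangement, and cell death for lethal ones. The fixed lethal fraction and all parameter values are assumptions for illustration, not the authors' model.

```python
import random

def grow_colony(instability=0.1, lethal_fraction=0.5, divisions=10, rng=None):
    """Monte Carlo sketch of clonal expansion with chromosomal instability.

    Starts from a single unstable cell. At each division every daughter
    may acquire a new chromosome rearrangement with probability
    `instability`; a fixed fraction of new aberrations is lethal.
    Returns (colony_size, number_of_subclones).
    """
    rng = rng or random.Random()
    cells = [0]          # each cell is tagged with its subclone id (karyotype)
    next_subclone = 1
    for _ in range(divisions):
        daughters = []
        for clone in cells:
            for _ in range(2):                    # two daughters per division
                if rng.random() < instability:
                    if rng.random() < lethal_fraction:
                        continue                  # lethal aberration: daughter dies
                    daughters.append(next_subclone)  # new viable subclone
                    next_subclone += 1
                else:
                    daughters.append(clone)
        cells = daughters
    return len(cells), len(set(cells))
```

Averaging `grow_colony` over many runs at a chosen instability rate reproduces the kind of output the abstract describes: unstable cells yield smaller colonies carrying many subclones.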

  15. Agent based modeling of blood coagulation system: implementation using a GPU based high speed framework.

    PubMed

    Chen, Wenan; Ward, Kevin; Li, Qi; Kecman, Vojislav; Najarian, Kayvan; Menke, Nathan

    2011-01-01

    The coagulation and fibrinolytic systems are complex, inter-connected biological systems with major physiological roles. The complex, nonlinear multi-point relationships between the molecular and cellular constituents of two systems render a comprehensive and simultaneous study of the system at the microscopic and macroscopic level a significant challenge. We have created an Agent Based Modeling and Simulation (ABMS) approach for simulating these complex interactions. As the scale of agents increase, the time complexity and cost of the resulting simulations presents a significant challenge. As such, in this paper, we also present a high-speed framework for the coagulation simulation utilizing the computing power of graphics processing units (GPU). For comparison, we also implemented the simulations in NetLogo, Repast, and a direct C version. As our experiments demonstrate, the computational speed of the GPU implementation of the million-level scale of agents is over 10 times faster versus the C version, over 100 times faster versus the Repast version and over 300 times faster versus the NetLogo simulation. PMID:22254271

  16. New Computer Simulations of Macular Neural Functioning

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.; Doshay, D.; Linton, S.; Parnas, B.; Montgomery, K.; Chimento, T.

    1994-01-01

    We use high performance graphics workstations and supercomputers to study the functional significance of the three-dimensional (3-D) organization of gravity sensors. These sensors have a prototypic architecture foreshadowing more complex systems. Scaled-down simulations run on a Silicon Graphics workstation and scaled-up, 3-D versions run on a Cray Y-MP supercomputer. A semi-automated method of reconstruction of neural tissue from serial sections studied in a transmission electron microscope has been developed to eliminate tedious conventional photography. The reconstructions use a mesh as a step in generating a neural surface for visualization. Two meshes are required to model calyx surfaces. The meshes are connected and the resulting prisms represent the cytoplasm and the bounding membranes. A finite volume analysis method is employed to simulate voltage changes along the calyx in response to synapse activation on the calyx or on calyceal processes. The finite volume method insures that charge is conserved at the calyx-process junction. These and other models indicate that efferent processes act as voltage followers, and that the morphology of some afferent processes affects their functioning. In a final application, morphological information is symbolically represented in three dimensions in a computer. The possible functioning of the connectivities is tested using mathematical interpretations of physiological parameters taken from the literature. Symbolic, 3-D simulations are in progress to probe the functional significance of the connectivities. This research is expected to advance computer-based studies of macular functioning and of synaptic plasticity.
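The charge-conserving finite-volume idea mentioned above can be illustrated with a one-dimensional toy: adjacent compartments exchange exactly antisymmetric currents, so whatever charge leaves one volume enters its neighbor. This is not the authors' 3-D calyx model; all conductances and the synaptic drive term are hypothetical.

```python
import numpy as np

def relax_voltage(n=50, g_axial=1.0, g_leak=0.05, v_syn=1.0, syn_node=0, steps=5000):
    """Explicit finite-volume relaxation of voltage along a 1-D process.

    Each compartment exchanges current with its neighbors (conductance
    g_axial) and leaks toward rest (g_leak); a synaptic input at one node
    drives it toward v_syn. The inter-compartment flux is applied with
    opposite signs to the two cells it connects, so charge is conserved
    at every junction by construction.
    """
    v = np.zeros(n)
    dt = 0.1
    for _ in range(steps):
        flux = g_axial * np.diff(v)   # current flowing between neighbors
        dv = np.zeros(n)
        dv[:-1] += flux               # cell i gains this from cell i+1 ...
        dv[1:] -= flux                # ... and cell i+1 loses exactly the same
        dv -= g_leak * v              # leak toward rest (0)
        dv[syn_node] += g_leak * 5 * (v_syn - v[syn_node])  # synaptic drive
        v += dt * dv
    return v
```

At steady state the voltage decays monotonically with distance from the activated synapse, the qualitative behavior one expects from a passive cable.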

  17. Computational Methods for Jet Noise Simulation

    NASA Technical Reports Server (NTRS)

    Goodrich, John W. (Technical Monitor); Hagstrom, Thomas

    2003-01-01

    The purpose of our project is to develop, analyze, and test novel numerical technologies central to the long term goal of direct simulations of subsonic jet noise. Our current focus is on two issues: accurate, near-field domain truncations and high-order, single-step discretizations of the governing equations. The Direct Numerical Simulation (DNS) of jet noise poses a number of extreme challenges to computational technique. In particular, the problem involves multiple temporal and spatial scales as well as flow instabilities and is posed on an unbounded spatial domain. Moreover, the basic phenomenon of interest, the radiation of acoustic waves to the far field, involves only a minuscule fraction of the total energy. The best current simulations of jet noise are at low Reynolds number. It is likely that an increase of one to two orders of magnitude will be necessary to reach a regime where the separation between the energy-containing and dissipation scales is sufficient to make the radiated noise essentially independent of the Reynolds number. Such an increase in resolution cannot be obtained in the near future solely through increases in computing power. Therefore, new numerical methodologies of maximal efficiency and accuracy are required.

  18. Trends in Social Science: The Impact of Computational and Simulative Models

    NASA Astrophysics Data System (ADS)

    Conte, Rosaria; Paolucci, Mario; Cecconi, Federico

    This paper discusses current progress in the computational social sciences. Specifically, it examines the following questions: Are the computational social sciences exhibiting positive or negative developments? What are the roles of agent-based models and simulation (ABM), network analysis, and other "computational" methods within this dynamic? (Conte, The necessity of intelligent agents in social simulation, Advances in Complex Systems, 3(01n04), 19-38, 2000; Conte 2010; Macy, Annual Review of Sociology, 143-166, 2002). Are there objective indicators of scientific growth that can be applied to different scientific areas, allowing for comparison among them? In this paper, some answers to these questions are presented and discussed. In particular, comparisons among different disciplines in the social and computational sciences are shown, taking into account their respective growth trends in the number of publication citations over the last few decades (culled from Google Scholar). After a short discussion of the methodology adopted, results of keyword-based queries are presented, unveiling some unexpected local impacts of simulation on the takeoff of traditionally poorly productive disciplines.

  19. Parallel Proximity Detection for Computer Simulations

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1998-01-01

    The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.
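The grid-based scheme in the claim above can be sketched simply: movers hash into coarse grid cells, and a sensor queries only the cells its coverage disk overlaps, returning candidates without computing exact crossings (the spirit of the "fuzzy grid"). Function names and the cell size are illustrative, not from the patent.

```python
from collections import defaultdict

CELL = 10.0  # coarse grid resolution (illustrative)

def build_grid(positions, cell=CELL):
    """Hash each mover into the grid cell containing its (x, y) position."""
    grid = defaultdict(list)
    for mover, (x, y) in positions.items():
        grid[(int(x // cell), int(y // cell))].append(mover)
    return grid

def sensors_near(grid, x, y, radius, cell=CELL):
    """Return movers registered in every cell the coverage disk overlaps.

    This is deliberately conservative: it may return movers just outside
    the disk, which a later exact test would filter, but it never misses
    one inside it.
    """
    hits = []
    lo_i, hi_i = int((x - radius) // cell), int((x + radius) // cell)
    lo_j, hi_j = int((y - radius) // cell), int((y + radius) // cell)
    for i in range(lo_i, hi_i + 1):
        for j in range(lo_j, hi_j + 1):
            hits.extend(grid.get((i, j), []))
    return hits
```

In a parallel discrete-event setting, movers would check in and out of cells as they travel, and each sensor would only be notified about the cells it covers, which is what keeps the proximity test cheap.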

  20. Fast computation algorithms for speckle pattern simulation

    SciTech Connect

    Nascov, Victor; Samoilă, Cornel; Ursuţiu, Doru

    2013-11-13

We present our development of a series of efficient computation algorithms, generally usable to calculate light diffraction and particularly for speckle pattern simulation. We use mainly the scalar diffraction theory in the form of the Rayleigh-Sommerfeld diffraction formula and its Fresnel approximation. Our algorithms are based on a special form of the convolution theorem and the Fast Fourier Transform. They evaluate the diffraction formula much faster than direct computation, and they circumvent the restrictions on the relative sizes of the input and output domains encountered in commonly used procedures. Moreover, the input and output planes can be tilted with respect to each other, and the output domain can be shifted off-axis.
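The convolution-theorem approach can be sketched in its simplest form, Fresnel propagation via a transfer function: multiply the field's FFT by a quadratic-phase factor and invert. This minimal version omits the authors' tilted-plane and off-axis extensions; the sampling parameters and the random-phase aperture used to generate a speckle pattern are illustrative.

```python
import numpy as np

def fresnel_propagate(field, wavelength, dx, z):
    """Fresnel diffraction by FFT convolution (transfer-function form).

    The kernel has unit modulus, so the propagated field conserves the
    total intensity of the input field.
    """
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)            # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Speckle sketch: a square aperture with a random phase screen.
rng = np.random.default_rng(0)
n = 128
aperture = np.zeros((n, n))
aperture[n // 4:3 * n // 4, n // 4:3 * n // 4] = 1.0
field = aperture * np.exp(1j * 2 * np.pi * rng.random((n, n)))
speckle = np.abs(fresnel_propagate(field, 633e-9, 10e-6, 0.1)) ** 2
```

Because the FFT pair does the convolution in O(n² log n) rather than the O(n⁴) of direct summation, this is the speed-up the abstract refers to.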

  1. Investigation of Carbohydrate Recognition via Computer Simulation

    SciTech Connect

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; Shen, Tongye

    2015-04-28

Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. In this review, we focus on utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  2. Parallel Proximity Detection for Computer Simulation

    NASA Technical Reports Server (NTRS)

    Steinman, Jeffrey S. (Inventor); Wieland, Frederick P. (Inventor)

    1997-01-01

The present invention discloses a system for performing proximity detection in computer simulations on parallel processing architectures utilizing a distribution list which includes movers and sensor coverages which check in and out of grids. Each mover maintains a list of sensors that detect the mover's motion as the mover and sensor coverages check in and out of the grids. Fuzzy grids are included by fuzzy resolution parameters to allow movers and sensor coverages to check in and out of grids without computing exact grid crossings. The movers check in and out of grids while moving sensors periodically inform the grids of their coverage. In addition, a lookahead function is also included for providing a generalized capability without making any limiting assumptions about the particular application to which it is applied. The lookahead function is initiated so that risk-free synchronization strategies never roll back grid events. The lookahead function adds fixed delays as events are scheduled for objects on other nodes.

  3. Computer simulation of spacecraft/environment interaction.

    PubMed

    Krupnikov, K K; Makletsov, A A; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-10-01

This report presents some examples of computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for estimating spacecraft potentials in LEO. Effects of various particle flux impacts and spacecraft orientation are discussed. A computer engineering model for calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the Virtual Reality Modeling Language. PMID:11542669

  4. Investigation of Carbohydrate Recognition via Computer Simulation

    DOE PAGESBeta

    Johnson, Quentin R.; Lindsay, Richard J.; Petridis, Loukas; Shen, Tongye

    2015-04-28

Carbohydrate recognition by proteins, such as lectins and other (bio)molecules, can be essential for many biological functions. Interest has arisen due to potential protein and drug design and future bioengineering applications. A quantitative measurement of carbohydrate-protein interaction is thus important for the full characterization of sugar recognition. In this review, we focus on utilizing computer simulations and biophysical models to evaluate the strength and specificity of carbohydrate recognition. With increasing computational resources, better algorithms and refined modeling parameters, using state-of-the-art supercomputers to calculate the strength of the interaction between molecules has become increasingly mainstream. We review the current state of this technique and its successful applications for studying protein-sugar interactions in recent years.

  5. Agent-Based Deterministic Modeling of the Bone Marrow Homeostasis

    PubMed Central

    2016-01-01

Modeling of stem cells not only describes but also predicts how a stem cell's environment can control its fate. The first stem cell populations discovered were hematopoietic stem cells (HSCs). In this paper, we present a deterministic model of bone marrow (which hosts HSCs) that is consistent with several of the qualitative biological observations. This model incorporates stem cell death (apoptosis) after a certain number of cell divisions and also demonstrates that a single HSC can potentially populate the entire bone marrow. It also demonstrates that a sufficient number of differentiated cells (RBCs, WBCs, etc.) is produced. We prove that our model of bone marrow is biologically consistent and that it overcomes the biological feasibility limitations of previously reported models. The major contribution of our model is the flexibility it allows in choosing model parameters, which permits several different simulations to be carried out in silico without affecting the homeostatic properties of the model. We have also performed an agent-based simulation of the proposed bone marrow model and include the parameter details and the results obtained from the simulation. The program of the agent-based simulation of the proposed model is made available on a publicly accessible website. PMID:27340402

  6. Agent-Based Deterministic Modeling of the Bone Marrow Homeostasis.

    PubMed

    Kurhekar, Manish; Deshpande, Umesh

    2016-01-01

Modeling of stem cells not only describes but also predicts how a stem cell's environment can control its fate. The first stem cell populations discovered were hematopoietic stem cells (HSCs). In this paper, we present a deterministic model of bone marrow (which hosts HSCs) that is consistent with several of the qualitative biological observations. This model incorporates stem cell death (apoptosis) after a certain number of cell divisions and also demonstrates that a single HSC can potentially populate the entire bone marrow. It also demonstrates that a sufficient number of differentiated cells (RBCs, WBCs, etc.) is produced. We prove that our model of bone marrow is biologically consistent and that it overcomes the biological feasibility limitations of previously reported models. The major contribution of our model is the flexibility it allows in choosing model parameters, which permits several different simulations to be carried out in silico without affecting the homeostatic properties of the model. We have also performed an agent-based simulation of the proposed bone marrow model and include the parameter details and the results obtained from the simulation. The program of the agent-based simulation of the proposed model is made available on a publicly accessible website. PMID:27340402
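A toy agent rule set in the spirit of the model above (apoptosis after a fixed number of divisions, self-renewal versus differentiation) can be written in a few lines. The probabilities, the division limit, and the rule that differentiated cells simply persist are all illustrative shortcuts, not the paper's parameters.

```python
import random

def marrow_step(cells, max_divisions=10, p_self_renew=0.5, rng=None):
    """One tick of a toy bone-marrow agent model.

    Each agent is a (kind, divisions) pair, with kind "HSC" or "diff".
    An HSC either self-renews symmetrically or produces two
    differentiated daughters; it dies (apoptosis) once it has used up
    `max_divisions` divisions. Differentiated cells persist unchanged
    (a fuller model would age them out into the bloodstream).
    """
    rng = rng or random.Random()
    out = []
    for kind, divs in cells:
        if kind != "HSC":
            out.append((kind, divs))        # differentiated cells persist
        elif divs >= max_divisions:
            continue                        # apoptosis: division budget spent
        elif rng.random() < p_self_renew:
            out.append(("HSC", divs + 1))   # symmetric self-renewal
            out.append(("HSC", divs + 1))
        else:
            out.append(("diff", 0))         # symmetric differentiation
            out.append(("diff", 0))
    return out
```

Starting from a single HSC and iterating `marrow_step` shows the two behaviors the abstract highlights: one HSC can populate the compartment, and differentiated cells accumulate as output.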

  7. Molecular physiology of rhodopsin: Computer simulation

    NASA Astrophysics Data System (ADS)

    Fel'Dman, T. B.; Kholmurodov, Kh. T.; Ostrovsky, M. A.

    2008-03-01

Computer simulation is used for a comparative investigation of the molecular dynamics of rhodopsin containing the chromophore group (11-cis-retinal) and of free opsin. Molecular dynamics is traced within a time interval of 3000 ps; 3 × 10^6 discrete conformational states of rhodopsin and opsin are obtained and analyzed. It is demonstrated that the presence of the chromophore group in the chromophore center of opsin considerably influences the nearest protein environment of 11-cis-retinal, both in the region of the β-ionone ring and in the region of the protonated Schiff base bond. Based on the simulation results, a possible intramolecular mechanism for keeping rhodopsin, as a G-protein-coupled receptor, in the inactive state, i.e., the chromophore functioning as an efficient ligand antagonist, is discussed.

  8. Computer Simulation Studies of Gramicidin Channel

    NASA Astrophysics Data System (ADS)

    Song, Hyundeok; Beck, Thomas

    2009-04-01

Ion channels are large membrane proteins, and their function is to facilitate the passage of ions across biological membranes. Recently, Dr. John Cuppoletti's group at UC showed that the gramicidin channel could function at high temperatures (360-390 K) with significant currents. This finding may have large implications for fuel cell technology. In this project, we will examine the experimental system by computer simulation. We will investigate how the temperature affects the current, and the differences in magnitude of the currents between the two forms of gramicidin, A and D. This research will help to elucidate the underlying molecular mechanism in this promising new technology.

  9. Multiscale agent-based consumer market modeling.

    SciTech Connect

    North, M. J.; Macal, C. M.; St. Aubin, J.; Thimmapuram, P.; Bragen, M.; Hahn, J.; Karr, J.; Brigham, N.; Lacy, M. E.; Hampton, D.; Decision and Information Sciences; Procter & Gamble Co.

    2010-05-01

Consumer markets have been studied in great depth, and many techniques have been used to represent them. These have included regression-based models, logit models, and theoretical market-level models, such as the NBD-Dirichlet approach. Although many important contributions and insights have resulted from studies that relied on these models, there is still a need for a model that can more holistically represent the interdependencies of the decisions made by consumers, retailers, and manufacturers. This need is particularly critical when a model must be used repeatedly over time to support decisions in an industrial setting. Although some existing methods can, in principle, represent such complex interdependencies, their capabilities might be outstripped in industrial applications because of the detail this type of modeling requires. However, a complementary method - agent-based modeling - shows promise for addressing these issues. Agent-based models use business-driven rules for individuals (e.g., individual consumer rules for buying items, individual retailer rules for stocking items, or individual firm rules for advertising items) to determine holistic, system-level outcomes (e.g., whether brand X's market share is increasing). We applied agent-based modeling to develop a multi-scale consumer market model. We then conducted calibration, verification, and validation tests of this model. The model was successfully applied by Procter & Gamble to several challenging business problems, where it directly influenced managerial decision making and produced substantial cost savings.
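The core idea, individual business-driven rules producing a system-level outcome such as market share, can be sketched minimally. The two-brand loyalty-or-imitation rule below is purely illustrative and has nothing to do with the P&G model itself.

```python
import random

def simulate_market(n_consumers=500, steps=50, loyalty=0.8, rng=None):
    """Toy agent-based consumer market with two brands, X and Y.

    Each tick, every consumer either repeats their last purchase (with
    probability `loyalty`) or copies a randomly chosen other consumer.
    The returned market share of brand X is the emergent, system-level
    outcome of the individual rules.
    """
    rng = rng or random.Random()
    choices = [rng.choice("XY") for _ in range(n_consumers)]
    for _ in range(steps):
        for i in range(n_consumers):
            if rng.random() >= loyalty:           # consumer imitates a peer
                choices[i] = choices[rng.randrange(n_consumers)]
    return choices.count("X") / n_consumers
```

Even this toy shows the characteristic ABM workflow: individual rules in, aggregate indicator out, which is the quantity a calibration and validation exercise would then compare against market data.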

  10. Assurance in Agent-Based Systems

    SciTech Connect

    Gilliom, Laura R.; Goldsmith, Steven Y.

    1999-05-10

Our vision of the future of information systems is one that includes engineered collectives of software agents which are situated in an environment over years and which increasingly improve the performance of the overall system of which they are a part. At a minimum, the movement of agent and multi-agent technology into National Security applications, including their use in information assurance, is apparent today. The use of deliberative, autonomous agents in high-consequence/high-security applications will require a commensurate level of protection and confidence in the predictability of system-level behavior. At Sandia National Laboratories, we have defined and are addressing a research agenda that integrates surety (safety, security, and reliability) into agent-based systems at a deep level. Surety is addressed at multiple levels: the integrity of individual agents must be protected by addressing potential failure modes and vulnerabilities to malevolent threats. Providing for the surety of the collective requires attention to communications surety issues and mechanisms for identifying and working with trusted collaborators. At the highest level, using agent-based collectives within a large-scale distributed system requires the development of principled design methods to deliver the desired emergent performance or surety characteristics. This position paper will outline the research directions underway at Sandia, will discuss relevant work being performed elsewhere, and will report progress to date toward assurance in agent-based systems.

  11. Determining Peptide Partitioning Properties via Computer Simulation

    PubMed Central

    Ulmschneider, Jakob P.; Ulmschneider, Martin B.

    2010-01-01

    The transfer of polypeptide segments into lipid bilayers to form transmembrane helices represents the crucial first step in cellular membrane protein folding and assembly. This process is driven by complex and poorly understood atomic interactions of peptides with the lipid bilayer environment. The lack of suitable experimental techniques that can resolve these processes both at atomic resolution and nanosecond timescales has spurred the development of computational techniques. In this review, we summarize the significant progress achieved in the last few years in elucidating the partitioning of peptides into lipid bilayer membranes using atomic detail molecular dynamics simulations. Indeed, partitioning simulations can now provide a wealth of structural and dynamic information. Furthermore, we show that peptide-induced bilayer distortions, insertion pathways, transfer free energies, and kinetic insertion barriers are now accurate enough to complement experiments. Further advances in simulation methods and force field parameter accuracy promise to turn molecular dynamics simulations into a powerful tool for investigating a wide range of membrane active peptide phenomena. PMID:21107546

  12. Multiscale Computer Simulation of Failure in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2008-01-01

    Aerogels have been of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While such gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. We have previously performed computer simulations of aerogel thermal conductivity and tensile and compressive failure, with results that are in qualitative, and sometimes quantitative, agreement with experiment. However, recent experiments in our laboratory suggest that gels having similar densities may exhibit substantially different properties. In this work, we extend our original diffusion limited cluster aggregation (DLCA) model for gel structure to incorporate additional variation in DLCA simulation parameters, with the aim of producing DLCA clusters of similar densities that nevertheless have different fractal dimension and secondary particle coordination. We perform particle statics simulations of gel strain on these clusters, and consider the effects of differing DLCA simulation conditions, and the resultant differences in fractal dimension and coordination, on gel strain properties.

  13. Computer Simulation of Fracture in Aerogels

    NASA Technical Reports Server (NTRS)

    Good, Brian S.

    2006-01-01

    Aerogels are of interest to the aerospace community primarily for their thermal properties, notably their low thermal conductivities. While the gels are typically fragile, recent advances in the application of conformal polymer layers to these gels have made them potentially useful as lightweight structural materials as well. In this work, we investigate the strength and fracture behavior of silica aerogels using a molecular statics-based computer simulation technique. The gels' structure is simulated via a Diffusion Limited Cluster Aggregation (DLCA) algorithm, which produces fractal structures representing experimentally observed aggregates of so-called secondary particles, themselves composed of amorphous silica primary particles an order of magnitude smaller. We have performed multi-length-scale simulations of fracture in silica aerogels, in which the interaction between two secondary particles is assumed to be described by a Morse pair potential parameterized such that the potential range is much smaller than the secondary particle size. These Morse parameters are obtained by atomistic simulation of models of the experimentally-observed amorphous silica "bridges," with the fracture behavior of these bridges modeled via molecular statics using a Morse/Coulomb potential for silica. We consider the energetics of the fracture, and compare qualitative features of low- and high-density gel fracture.
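The DLCA construction used in both of these aerogel studies can be sketched as a toy lattice model (the lattice size, particle count, and sticking rule here are illustrative, not the authors' production algorithm): clusters random-walk as rigid units and stick irreversibly when a move brings them into contact.

```python
import random

def dlca(n_particles=40, size=20, steps=5000, seed=1):
    """Toy 2-D DLCA on a periodic lattice: rigid clusters diffuse and merge on contact."""
    rng = random.Random(seed)
    cells = [(x, y) for x in range(size) for y in range(size)]
    clusters = [{p} for p in rng.sample(cells, n_particles)]

    def neighbours(p):
        x, y = p
        return [((x + 1) % size, y), ((x - 1) % size, y),
                (x, (y + 1) % size), (x, (y - 1) % size)]

    for _ in range(steps):
        if len(clusters) == 1:
            break                              # fully gelled
        i = rng.randrange(len(clusters))
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        moved = {((x + dx) % size, (y + dy) % size) for x, y in clusters[i]}
        others = set().union(*(c for j, c in enumerate(clusters) if j != i))
        if moved & others:
            continue                           # reject moves onto occupied sites
        # irreversible sticking: absorb every cluster now touching the moved one
        touching = {q for p in moved for q in neighbours(p)}
        merged, rest = moved, []
        for j, c in enumerate(clusters):
            if j == i:
                continue
            if c & touching:
                merged |= c
            else:
                rest.append(c)
        clusters = rest + [merged]
    return clusters

clusters = dlca()
```

The fractal dimension and coordination statistics the abstracts discuss would then be measured on the surviving clusters.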

  14. Computational model for protein unfolding simulation

    NASA Astrophysics Data System (ADS)

    Tian, Xu-Hong; Zheng, Ye-Han; Jiao, Xiong; Liu, Cai-Xing; Chang, Shan

    2011-06-01

    The protein folding problem is one of the fundamental and important questions in molecular biology. However, all-atom molecular dynamics studies of protein folding and unfolding are still computationally expensive and severely limited by the time scale of simulation. In this paper, a simple and fast protein unfolding method is proposed based on conformational stability analyses and structure modeling. In this method, two structure-based conditions are considered to identify the unstable regions of proteins during the unfolding processes. The protein unfolding trajectories are mimicked through iterative structure modeling according to conformational stability analyses. Two proteins, chymotrypsin inhibitor 2 (CI2) and the α-spectrin SH3 domain (SH3), were simulated by this method. Their unfolding pathways are consistent with previous molecular dynamics simulations. Furthermore, the transition states of the two proteins were identified in the unfolding processes, and the theoretical Φ values of these transition states showed significant correlations with the experimental data (the correlation coefficients are >0.8). The results indicate that this method is effective in studying protein unfolding. Moreover, we analyzed and discussed the influence of parameters on the unfolding simulation. This simple coarse-grained model may provide a general and fast approach for mechanism studies of protein folding.

  15. Computer simulation of metal-organic materials

    NASA Astrophysics Data System (ADS)

    Stern, Abraham C.

    Computer simulations of metal-organic frameworks are conducted to both investigate the mechanism of hydrogen sorption and to elucidate a detailed, molecular-level understanding of the physical interactions that can lead to successful material design strategies. To this end, important intermolecular interactions are identified and individually parameterized to yield a highly accurate representation of the potential energy landscape. Polarization, one such interaction found to play a significant role in H2 sorption, is included explicitly for the first time in simulations of metal-organic frameworks. Permanent electrostatics are usually accounted for by means of an approximate fit to model compounds. The application of this method to simulations involving metal-organic frameworks introduces several substantial problems that are characterized in this work. To circumvent this, a method is developed and tested in which atomic point partial charges are computed more directly, fit to the fully periodic electrostatic potential. In this manner, long-range electrostatics are explicitly accounted for via Ewald summation. Grand canonical Monte Carlo simulations are conducted employing the force field parameterization developed here. Several of the major findings of this work are: Polarization is found to play a critical role in determining the overall structure of H2 sorbed in metal-organic frameworks, although not always the determining factor in uptake. The parameterization of atomic point charges by means of a fit to the periodic electrostatic potential is a robust, efficient method and consistently results in a reliable description of Coulombic interactions without introducing ambiguity associated with other procedures. After careful development of both hydrogen and framework potential energy functions, quantitatively accurate results have been obtained. Such predictive accuracy will aid greatly in the rational, iterative design cycle between experimental and theoretical
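The charge-fitting procedure described above can be illustrated with a toy, non-periodic analogue: a constrained least-squares fit of point charges to potential samples, with the total charge enforced by a Lagrange multiplier. The thesis fits to the fully periodic potential via Ewald summation; the synthetic data below merely stands in for a real electrostatic potential.

```python
import numpy as np

def fit_charges(atom_xyz, grid_xyz, v_grid, net_charge=0.0):
    """Fit atomic point charges to an electrostatic potential (atomic units)."""
    # Coulomb design matrix: A[g, a] = 1 / |r_g - R_a|
    A = 1.0 / np.linalg.norm(grid_xyz[:, None, :] - atom_xyz[None, :, :], axis=2)
    n = atom_xyz.shape[0]
    # normal equations bordered by the total-charge constraint sum(q) = Q
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = A.T @ A
    M[:n, n] = M[n, :n] = 1.0
    b = np.zeros(n + 1)
    b[:n] = A.T @ v_grid
    b[n] = net_charge
    return np.linalg.solve(M, b)[:n]

# synthetic check: the potential generated by known charges should be recovered
rng = np.random.default_rng(0)
atoms = rng.normal(size=(4, 3))
true_q = np.array([0.4, -0.4, 0.25, -0.25])
grid = rng.normal(scale=3.0, size=(200, 3))
v = (1.0 / np.linalg.norm(grid[:, None, :] - atoms[None, :, :], axis=2)) @ true_q
q = fit_charges(atoms, grid, v)
```

When the sampled potential is exactly Coulombic and the constraint is consistent, the fit recovers the generating charges; real ESP fits face conditioning problems for buried atoms, which is part of the "ambiguity" the abstract mentions.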

  16. Computer simulation of solder joint failure

    SciTech Connect

    Burchett, S.N.; Frear, D.R.; Rashid, M.M.

    1997-04-01

    The thermomechanical fatigue failure of solder joints is increasingly becoming an important reliability issue for electronic packages. The purpose of this Laboratory Directed Research and Development (LDRD) project was to develop computational tools for simulating the behavior of solder joints under strain and temperature cycling, taking into account the microstructural heterogeneities that exist in as-solidified near eutectic Sn-Pb joints, as well as subsequent microstructural evolution. The authors present two computational constitutive models, a two-phase model and a single-phase model, that were developed to predict the behavior of near eutectic Sn-Pb solder joints under fatigue conditions. Unique metallurgical tests provide the fundamental input for the constitutive relations. The two-phase model mathematically predicts the heterogeneous coarsening behavior of near eutectic Sn-Pb solder. The finite element simulations with this model agree qualitatively with experimental thermomechanical fatigue tests. The simulations show that the presence of an initial heterogeneity in the solder microstructure could significantly degrade the fatigue lifetime. The single-phase model was developed to predict solder joint behavior using materials data for constitutive relation constants that could be determined through straightforward metallurgical experiments. Special thermomechanical fatigue tests were developed to give fundamental materials input to the models, and an in situ SEM thermomechanical fatigue test system was developed to characterize microstructural evolution and the mechanical behavior of solder joints during the test. A shear/torsion test sample was developed to impose strain in two different orientations. Materials constants were derived from these tests. The simulation results from the two-phase model showed good fit to the experimental test results.

  17. A Hybrid Sensitivity Analysis Approach for Agent-based Disease Spread Models

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui

    2012-01-01

    Agent-based models (ABM) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. Of particular interest lately is the application of agent-based and hybrid models to epidemiology, specifically Agent-based Disease Spread Models (ABDSM). Validation (one aspect of the means to achieve dependability) of ABDSM simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. In this report, we describe our preliminary efforts in ABDSM validation by using hybrid model fusion technology.

  18. Excellent approach to modeling urban expansion by fuzzy cellular automata: agent base model

    NASA Astrophysics Data System (ADS)

    Khajavigodellou, Yousef; Alesheikh, Ali A.; Mohammed, Abdulrazak A. S.; Chapi, Kamran

    2014-09-01

    The interaction between humans and their environment is one of the most important challenges in the world today. Land use/cover change (LUCC) is a complex process that includes actors and factors at different social and spatial levels. The complexity and dynamics of urban systems make the practice of urban modeling very difficult. With increased computational power and the greater availability of spatial data, micro-simulation methods such as agent-based and cellular automata models have been developed by geographers, planners, and scholars, and they have shown great potential for representing and simulating the complexity of the dynamic processes involved in urban growth and land use change. This paper presents a Fuzzy Cellular Automata (FCA) model, built on geospatial information systems and remote sensing, to simulate and predict urban expansion patterns. These FCA-based dynamic spatial urban models provide an improved ability to forecast and assess future urban growth and to create planning scenarios, allowing us to explore the potential impacts of simulations that correspond to urban planning and management policies. In this fuzzy-inference-guided cellular automata approach, semantic or linguistic knowledge about land use change is expressed as fuzzy rules, from which fuzzy inference determines the urban development potential of each pixel. The model integrates an ABM (agent-based model) and FCA to investigate a complex decision-making process and future urban dynamic processes. Based on this model, scenarios of rapid development and green-land protection, under the influence of the behaviors and decision modes of regional authority agents, real estate developer agents, resident agents, and non-resident agents and their interactions, are applied to predict the future development patterns of the Erbil metropolitan region.
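A minimal fuzzy-inference CA step of this kind might look as follows. The membership functions, the linguistic rules, and the crisp 0.5 development threshold are invented for illustration, and the paper's coupled agent decisions are omitted:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def step(grid, road_dist):
    """One synchronous CA update driven by two fuzzy linguistic rules."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j]:
                continue                            # already urban
            nbrs = sum(grid[(i + di) % n][(j + dj) % n]
                       for di in (-1, 0, 1) for dj in (-1, 0, 1))
            dense = tri(nbrs, 0, 3, 9)              # "has many urban neighbours"
            close = tri(road_dist[i][j], -1, 0, 4)  # "is close to the road"
            if min(dense, close) > 0.5:             # fuzzy AND, crisp threshold
                new[i][j] = 1
    return new

grid = [[0] * 10 for _ in range(10)]
for i in (4, 5, 6):
    grid[i][0] = grid[i][1] = 1                     # seed urban cluster beside the road
road_dist = [[j for j in range(10)] for _ in range(10)]  # road runs along column 0
new_grid = step(grid, road_dist)
```

Iterating the step grows the urban patch along the road, the qualitative behaviour such models are calibrated to reproduce from remote-sensing time series.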

  19. Computer Simulations in Science Education: Implications for Distance Education

    ERIC Educational Resources Information Center

    Sahin, Sami

    2006-01-01

    This paper is a review of literature about the use of computer simulations in science education. This review examines types and examples of computer simulations. The literature review indicated that although computer simulations cannot replace science classroom and laboratory activities completely, they offer various advantages both for classroom…

  20. The Learning Effects of Computer Simulations in Science Education

    ERIC Educational Resources Information Center

    Rutten, Nico; van Joolingen, Wouter R.; van der Veen, Jan T.

    2012-01-01

    This article reviews the (quasi)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on…

  1. Agent-based model of macrophage action on endocrine pancreas.

    PubMed

    Martínez, Ignacio V; Gómez, Enrique J; Hernando, M Elena; Villares, Ricardo; Mellado, Mario

    2012-01-01

    This paper proposes an agent-based model of the action of macrophages on the beta cells of the endocrine pancreas. The aim of this model is to simulate the processes of beta cell proliferation and apoptosis and also the process of phagocytosis of cell debris by macrophages, all of which are related to the onset of the autoimmune response in type 1 diabetes. We have used data from the scientific literature to design the model. The results show that the model obtains good approximations to real processes and could be used to shed light on some open questions concerning such processes. PMID:23155767

  2. An Agent Based Model for Social Class Emergence

    NASA Astrophysics Data System (ADS)

    Yang, Xiaoxiang; Rodriguez Segura, Daniel; Lin, Fei; Mazilu, Irina

    We present an open-system agent-based model to analyze the effects of education and society-specific wealth transactions on the emergence of social classes. Building on previous studies, we use realistic functions to model how years of education affect income level. Numerical simulations show that the fraction of an individual's total transactions that is invested rather than consumed can cause wealth gaps between different income brackets in the long run. In an attempt to incorporate network effects, we also explore how wealth distribution is affected when the probability of interaction among agents depends on the spread between their income brackets.
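A stripped-down transactional model of this kind (the classic "yard-sale" exchange rule, without the paper's education-dependent income term) already shows a wealth gap emerging from an equal start:

```python
import random

def gini(wealth):
    """Gini coefficient: 0 = perfect equality, approaching 1 = extreme inequality."""
    w = sorted(wealth)
    n = len(w)
    return 2 * sum((i + 1) * x for i, x in enumerate(w)) / (n * sum(w)) - (n + 1) / n

def simulate(alpha=0.1, n_agents=100, transactions=50000, seed=3):
    """Repeated pairwise wealth transactions with a multiplicative stake."""
    rng = random.Random(seed)
    wealth = [1.0] * n_agents                        # everyone starts equal
    for _ in range(transactions):
        a, b = rng.sample(range(n_agents), 2)
        stake = alpha * min(wealth[a], wealth[b])    # risk a slice of the poorer agent's wealth
        if rng.random() < 0.5:
            wealth[a] += stake; wealth[b] -= stake
        else:
            wealth[a] -= stake; wealth[b] += stake
    return gini(wealth)

g = simulate()
```

Even with fair coin-flip transactions, the multiplicative stake concentrates wealth over time; the paper's education and interaction-probability terms modulate this baseline dynamic.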

  3. A Computational Framework for Bioimaging Simulation

    PubMed Central

    Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi

    2015-01-01

    Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from the stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508

  4. Neural network computer simulation of medical aerosols.

    PubMed

    Richardson, C J; Barlow, D J

    1996-06-01

    Preliminary investigations have been conducted to assess the potential for using artificial neural networks to simulate aerosol behaviour, with a view to employing this type of methodology in the evaluation and design of pulmonary drug-delivery systems. Details are presented of the general purpose software developed for these tasks; it implements a feed-forward back-propagation algorithm with weight decay and connection pruning, the user having complete run-time control of the network architecture and mode of training. A series of exploratory investigations is then reported in which different network structures and training strategies are assessed in terms of their ability to simulate known patterns of fluid flow in simple model systems. The first of these involves simulations of cellular automata-generated data for fluid flow through a partially obstructed two-dimensional pipe. The artificial neural networks are shown to be highly successful in simulating the behaviour of this simple linear system, but with important provisos relating to the information content of the training data and the criteria used to judge when the network is properly trained. A second set of investigations is then reported in which similar networks are used to simulate patterns of fluid flow through aerosol generation devices, using training data furnished through rigorous computational fluid dynamics modelling. These more complex three-dimensional systems are modelled with equal success. It is concluded that carefully tailored, well trained networks could provide valuable tools not just for predicting but also for analysing the spatial dynamics of pharmaceutical aerosols. PMID:8832491
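The training machinery named in the abstract, a feed-forward network trained by back-propagation with weight decay, can be sketched as below. Connection pruning and the run-time architecture control are omitted, and a smooth one-dimensional curve stands in for the flow data; all sizes and rates are illustrative.

```python
import numpy as np

def train(X, Y, hidden=8, rate=0.3, decay=1e-5, epochs=5000, seed=0):
    """Feed-forward net: sigmoid hidden layer, linear output, L2 weight decay."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    Xb = np.hstack([X, np.ones((n, 1))])                  # input + bias unit
    W1 = rng.normal(scale=0.5, size=(Xb.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden + 1, Y.shape[1]))
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    losses = []
    for _ in range(epochs):
        H = np.hstack([sig(Xb @ W1), np.ones((n, 1))])    # hidden + bias unit
        out = H @ W2                                      # linear output layer
        err = out - Y
        losses.append(float(np.mean(err ** 2)))
        dW2 = H.T @ err / n                               # back-propagated gradients
        d_hid = (err @ W2[:-1].T) * H[:, :-1] * (1.0 - H[:, :-1])
        dW1 = Xb.T @ d_hid / n
        W2 -= rate * (dW2 + decay * W2)                   # gradient step plus weight decay
        W1 -= rate * (dW1 + decay * W1)
    return losses

x = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = x ** 2                        # stand-in for a measured flow response
losses = train(x, y)
```

The abstract's proviso about training data and stopping criteria applies here too: the loss trace, not a fixed epoch count, should decide when such a network counts as "properly trained".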

  5. Computational simulation for concurrent engineering of aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1993-01-01

    Results are summarized for an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties--fundamental to develop such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering of propulsion systems and systems in general. Benefits and issues needing early attention in the development are outlined.

  6. Computational simulation of concurrent engineering for aerospace propulsion systems

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Singhal, S. N.

    1992-01-01

    Results are summarized of an investigation to assess the infrastructure available and the technology readiness in order to develop computational simulation methods/software for concurrent engineering. These results demonstrate that development of computational simulation methods for concurrent engineering is timely. Extensive infrastructure, in terms of multi-discipline simulation, component-specific simulation, system simulators, fabrication process simulation, and simulation of uncertainties - fundamental in developing such methods, is available. An approach is recommended which can be used to develop computational simulation methods for concurrent engineering for propulsion systems and systems in general. Benefits and facets needing early attention in the development are outlined.

  7. NISAC Agent Based Laboratory for Economics

    SciTech Connect

    Downes, Paula; Davis, Chris; Eidson, Eric; Ehlen, Mark; Gieseler, Charles; Harris, Richard

    2006-10-11

    The software provides large-scale microeconomic simulation of complex economic and social systems (such as supply chain and market dynamics of businesses in the US economy) and their dependence on physical infrastructure systems. The system is based on agent simulation, where each entity of interest in the system to be modeled (for example, a bank, individual firms, or consumer households) is specified in a data-driven sense to be individually represented by an agent. The agents interact using rules of interaction appropriate to their roles, and through those interactions complex economic and social dynamics emerge. The software is implemented in three tiers: a Java-based visualization client, a C++ control mid-tier, and a C++ computational tier.

  9. An agent-based multilayer architecture for bioinformatics grids.

    PubMed

    Bartocci, Ezio; Cacciagrano, Diletta; Cannata, Nicola; Corradini, Flavio; Merelli, Emanuela; Milanesi, Luciano; Romano, Paolo

    2007-06-01

    Due to the huge volume and complexity of biological data available today, a fundamental component of biomedical research is now in silico analysis. This includes modelling and simulation of biological systems and processes, as well as automated bioinformatics analysis of high-throughput data. The quest for bioinformatics resources (including databases, tools, and knowledge) becomes therefore of extreme importance. Bioinformatics itself is in rapid evolution and dedicated Grid cyberinfrastructures already offer easier access and sharing of resources. Furthermore, the concept of the Grid is progressively interleaving with those of Web Services, semantics, and software agents. Agent-based systems can play a key role in learning, planning, interaction, and coordination. Agents constitute also a natural paradigm to engineer simulations of complex systems like the molecular ones. We present here an agent-based, multilayer architecture for bioinformatics Grids. It is intended to support both the execution of complex in silico experiments and the simulation of biological systems. In the architecture a pivotal role is assigned to an "alive" semantic index of resources, which is also expected to facilitate users' awareness of the bioinformatics domain. PMID:17695749

  10. From Compartmentalized to Agent-based Models of Epidemics

    NASA Astrophysics Data System (ADS)

    Macal, Charles

    Supporting decisions in the throes of an impending epidemic poses distinct technical challenges arising from the uncertainties in modeling disease propagation processes and the need for producing timely answers to policy questions. Compartmental models, because of their relative simplicity, produce timely information, but often do not include the level of fidelity needed to answer specific policy questions. Highly granular agent-based simulations produce an extensive amount of information on all aspects of a simulated epidemic, yet complex models often cannot produce this information in a timely manner. We propose a two-phased approach to addressing the tradeoff between model complexity and the speed at which models can be used to answer questions about an impending outbreak. In the first phase, in advance of an epidemic, ensembles of highly granular agent-based simulations are run over the entire parameter space, characterizing the space of possible model outcomes and uncertainties. Meta-models are derived that characterize model outcomes as dependent on uncertainties in disease parameters, data, and structural relationships. In the second phase, envisioned as during an epidemic, the meta-model is run in combination with compartmental models, which can be run very quickly. Model outcomes are compared as a basis for establishing uncertainties in model forecasts. This work is supported by the U.S. Department of Energy under Contract number DE-AC02-06CH11357 and National Science Foundation (NSF) RAPID Award DEB-1516428.
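The compartmental half of that pairing is indeed only a few lines. A discrete-time SIR model with illustrative rates:

```python
def sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, days=200):
    """Discrete-time SIR with mass-action incidence; state as population fractions."""
    s, i, r = s0, i0, 0.0
    history = []
    for _ in range(days):
        new_inf = beta * s * i        # S -> I transitions this step
        new_rec = gamma * i           # I -> R transitions this step
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        history.append((s, i, r))
    return history

history = sir()   # R0 = beta/gamma = 3, so a sizeable epidemic runs its course
```

The agent-based ensembles described above would constrain beta, gamma, and their uncertainties; the compartmental model then delivers forecasts in milliseconds during the outbreak itself.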

  11. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.; Halicioglu, M. T.

    1984-01-01

    All of the investigations employed, in one way or another, a computer simulation technique based on atomistic-level considerations. In general, three types of simulation methods were used for modeling systems with discrete particles that interact via well defined potential functions: molecular dynamics (a general method for solving the classical equations of motion of a model system); Monte Carlo (the use of a Markov chain ensemble averaging technique to model equilibrium properties of a system); and molecular statics (which provides properties of a system at T = 0 K). The effects of three-body forces on the vibrational frequencies of a triatomic cluster were investigated. The multilayer relaxation phenomena for low index planes of an fcc crystal were also analyzed as a function of the three-body interactions. Various surface properties for the Si and SiC systems were calculated. Results obtained from static simulation calculations for slip formation were presented. The more elaborate molecular dynamics calculations on the propagation of cracks in two-dimensional systems were outlined.
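Of the three methods listed, Monte Carlo is the quickest to sketch: a Metropolis chain sampling a single particle in a harmonic well, for which the exact equilibrium average <x^2> = kT/k is known (the constants here are illustrative, not from the report):

```python
import math
import random

def metropolis_msd(kT=1.0, k=2.0, step=0.8, n_samples=200000, seed=7):
    """Metropolis Markov chain for E(x) = k x^2 / 2; returns the estimate of <x^2>."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(n_samples):
        trial = x + rng.uniform(-step, step)        # symmetric proposal
        dE = 0.5 * k * (trial * trial - x * x)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if dE <= 0.0 or rng.random() < math.exp(-dE / kT):
            x = trial
        total += x * x
    return total / n_samples

msd = metropolis_msd()   # exact answer: kT / k = 0.5
```

The same accept/reject kernel, applied to many interacting particles under the potentials the report describes, is what the surface-property calculations average over.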

  12. Computer simulation of fatigue under diametrical compression

    SciTech Connect

    Carmona, H. A.; Kun, F.; Andrade, J. S. Jr.; Herrmann, H. J.

    2007-04-15

    We study the fatigue fracture of disordered materials by means of computer simulations of a discrete element model. We extend a two-dimensional fracture model to capture the microscopic mechanisms relevant for fatigue, and we simulate the diametrical compression of a disc-shaped specimen under a constant external force. The model allows us to follow the development of the fracture process on the macrolevel and the microlevel, varying the relative influence of the mechanisms of damage accumulation over the load history and healing of microcracks. As a specific example we consider recent experimental results on the fatigue fracture of asphalt. Our numerical simulations show that for intermediate applied loads the lifetime of the specimen exhibits power-law behavior. Under the effect of healing, more prominent for loads that are small compared to the tensile strength of the material, the lifetime of the sample increases and a fatigue limit emerges below which no macroscopic failure occurs. The numerical results are in good qualitative agreement with the experimental findings.

  13. Techniques and Issues in Agent-Based Modeling Validation

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui

    2012-01-01

    Validation of simulation models is extremely important. It ensures that the right model has been built and lends confidence to the use of that model to inform critical decisions. Agent-based models (ABM) have been widely deployed in different fields for studying the collective behavior of large numbers of interacting agents. However, researchers have only recently started to consider the issues of validation. Compared to other simulation models, ABMs have many differences in model development, usage and validation. An ABM is inherently easier to build than a classical simulation, but more difficult to describe formally, since it is closer to human cognition. Using multi-agent models to study complex systems has attracted criticisms because of the challenges involved in their validation [1]. In this report, we describe the challenge of ABM validation and present a novel approach we recently developed for an ABM system.

  14. Chip level simulation of fault tolerant computers

    NASA Technical Reports Server (NTRS)

    Armstrong, J. R.

    1983-01-01

    Chip level modeling techniques, functional fault simulation, simulation software development, a more efficient, high level version of GSP, and a parallel architecture for functional simulation are discussed.

  15. Space radiator simulation manual for computer code

    NASA Technical Reports Server (NTRS)

    Black, W. Z.; Wulff, W.

    1972-01-01

    A computer program that simulates the performance of a space radiator is presented. The program basically consists of a rigorous analysis, which analyzes a symmetrical fin panel, and an approximate analysis that predicts system characteristics for cases of non-symmetrical operation. The rigorous analysis accounts for both transient and steady state performance, including aerodynamic and radiant heating of the radiator system. The approximate analysis considers only steady state operation with no aerodynamic heating. A description of the radiator system and instructions to the user for program operation are included. The input required for the execution of all program options is described, and several examples of program output are provided. Sample output includes the radiator performance during ascent, reentry, and orbit.

  16. Computational simulation methods for composite fracture mechanics

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.

    1988-01-01

    Structural integrity, durability, and damage tolerance of advanced composites are assessed by studying damage initiation at various scales (micro, macro, and global) and accumulation and growth leading to global failure, quantitatively and qualitatively. In addition, various fracture toughness parameters associated with a typical damage and its growth must be determined. Computational structural analysis codes to aid the composite design engineer in performing these tasks were developed. CODSTRAN (COmposite Durability STRuctural ANalysis) is used to qualitatively and quantitatively assess the progressive damage occurring in composite structures due to mechanical and environmental loads. Next, methods are covered that are currently being developed and used at Lewis to predict interlaminar fracture toughness and related parameters of fiber composites given a prescribed damage. The general purpose finite element code MSC/NASTRAN was used to simulate the interlaminar fracture and the associated individual as well as mixed-mode strain energy release rates in fiber composites.

  17. Computational simulation of hot composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.; Singhal, S. N.

    1991-01-01

    Three different computer codes developed in-house are described for application to hot composite structures. These codes include capabilities for: (1) laminate behavior (METCAN); (2) thermal/structural analysis of hot structures made from high temperature metal matrix composites (HITCAN); and (3) laminate tailoring (MMLT). Results for select sample cases are described to demonstrate the versatility as well as the application of these codes to specific situations. The sample case results show that METCAN can be used to simulate cyclic life in high temperature metal matrix composites; HITCAN can be used to evaluate the structural performance of curved panels as well as respective sensitivities of various nonlinearities, and MMLT can be used to tailor the fabrication process in order to reduce residual stresses in the matrix upon cool-down.

  19. Miller experiments in atomistic computer simulations

    PubMed Central

    Saitta, Antonino Marco; Saija, Franz

    2014-01-01

    The celebrated Miller experiments reported on the spontaneous formation of amino acids from a mixture of simple molecules reacting under an electric discharge, giving birth to the research field of prebiotic chemistry. However, the chemical reactions involved in those experiments have never been studied at the atomic level. Here we report on, to our knowledge, the first ab initio computer simulations of Miller-like experiments in the condensed phase. Our study, based on the recent method of treatment of aqueous systems under electric fields and on metadynamics analysis of chemical reactions, shows that glycine spontaneously forms from mixtures of simple molecules once an electric field is switched on and identifies formic acid and formamide as key intermediate products of the early steps of the Miller reactions, and the crucible of formation of complex biological molecules. PMID:25201948

  20. A Mass Spectrometer Simulator in Your Computer

    NASA Astrophysics Data System (ADS)

    Gagnon, Michel

    2012-12-01

    Introduced to study components of ionized gas, the mass spectrometer has evolved into a highly accurate device now used in many undergraduate and research laboratories. Unfortunately, despite their importance in the formation of future scientists, mass spectrometers remain beyond the financial reach of many high schools and colleges. As a result, it is not possible for instructors to take full advantage of this equipment. Therefore, to facilitate accessibility to this tool, we have developed a realistic computer-based simulator. Using this software, students are able to practice their ability to identify the components of the original gas, thereby gaining a better understanding of the underlying physical laws. The software is available as a free download.
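
As a sketch of the physics such a simulator encodes (not the actual software, which is not described at this level): an ion accelerated through a potential V and then bent by a magnetic field B follows a circular path of radius r = mv/(qB), which gives m/q = B^2 r^2 / (2V). The helper names and values below are illustrative assumptions:

```python
import math

# Minimal mass-spectrometer physics: acceleration q*V = m*v^2/2, then
# magnetic deflection r = m*v/(q*B), so m/q = B^2 * r^2 / (2*V).
# Parameter values are illustrative, not taken from the software above.

E_CHARGE = 1.602176634e-19   # elementary charge, C
AMU = 1.66053906660e-27      # unified atomic mass unit, kg

def mass_from_radius(B, r, V, charge=E_CHARGE):
    """Infer ion mass (kg) from its deflection radius in a sector analyzer."""
    return charge * B**2 * r**2 / (2.0 * V)

def radius_from_mass(B, m, V, charge=E_CHARGE):
    """Deflection radius (m) for an ion of mass m accelerated through V."""
    v = math.sqrt(2.0 * charge * V / m)
    return m * v / (charge * B)

# Round-trip check for a singly charged 28 u ion (e.g. N2+), B = 0.1 T, V = 1 kV
m = 28 * AMU
r = radius_from_mass(0.1, m, 1000.0)
print(f"radius = {r * 100:.2f} cm, "
      f"recovered mass = {mass_from_radius(0.1, r, 1000.0) / AMU:.2f} u")
```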

  1. Experiential Learning through Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Maynes, Bill; And Others

    1992-01-01

    Describes experiential learning instructional model and simulation for student principals. Describes interactive laser videodisc simulation. Reports preliminary findings about student principal learning from simulation. Examines learning approaches by unsuccessful and successful students and learning levels of model learners. Simulation's success…

  2. Engineering Fracking Fluids with Computer Simulation

    NASA Astrophysics Data System (ADS)

    Shaqfeh, Eric

    2015-11-01

    There are no comprehensive simulation-based tools for engineering the flows of viscoelastic fluid-particle suspensions in fully three-dimensional geometries. On the other hand, the need for such a tool in engineering applications is immense. Suspensions of rigid particles in viscoelastic fluids play key roles in many energy applications. For example, in oil drilling the ``drilling mud'' is a very viscous, viscoelastic fluid designed to shear-thin during drilling, but thicken at stoppage so that the ``cuttings'' can remain suspended. In a related application known as hydraulic fracturing, suspensions of solids called ``proppant'' are used to prop open the fracture by pumping them into the well. It is well known that particle flow and settling in a viscoelastic fluid can be quite different from that observed in Newtonian fluids. First, it is now well known that the ``fluid-particle split'' at bifurcation cracks is controlled by fluid rheology in a manner that is not understood. Second, in Newtonian fluids, the presence of an imposed shear flow in the direction perpendicular to gravity (which we term a cross or orthogonal shear flow) has no effect on the settling of a spherical particle in Stokes flow (i.e., at vanishingly small Reynolds number). By contrast, in a non-Newtonian liquid, the complex rheological properties induce a nonlinear coupling between sedimentation and shear flow. Recent experimental data have shown that both the shear thinning and the elasticity of the suspending polymeric solutions significantly affect the fluid-particle split at bifurcations, as well as the settling rate of the solids. In the present work, we use the Immersed Boundary Method to develop computer simulations of viscoelastic flow in suspensions of spheres to study these problems. These simulations allow us to understand the detailed physical mechanisms for the remarkable physical behavior seen in practice, and actually suggest design rules for creating new fluid recipes.

  3. Proceedings 3rd NASA/IEEE Workshop on Formal Approaches to Agent-Based Systems (FAABS-III)

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael (Editor); Rash, James (Editor); Truszkowski, Walt (Editor); Rouff, Christopher (Editor)

    2004-01-01

    These proceedings contain 18 papers and 4 poster presentations, covering topics such as multi-agent systems, agent-based control, formalisms, norms, and physical and biological models of agent-based systems. Applications presented in the proceedings include systems analysis, software engineering, computer networks, and robot control.

  4. Duality quantum computer and the efficient quantum simulations

    NASA Astrophysics Data System (ADS)

    Wei, Shi-Jie; Long, Gui-Lu

    2016-03-01

    Duality quantum computing is a new mode of quantum computation that simulates a moving quantum computer passing through a multi-slit. It exploits the particle-wave duality property for computing. A quantum computer with n qubits and a qudit simulates a moving quantum computer with n qubits passing through a d-slit. Duality quantum computing can realize an arbitrary sum of unitaries and therefore a general quantum operator, which is called a generalized quantum gate. All linear bounded operators can be realized by the generalized quantum gates, and unitary operators are just the extreme points of the set of generalized quantum gates. Duality quantum computing provides flexibility and a clear physical picture in designing quantum algorithms, and serves as a powerful bridge between quantum and classical algorithms. In this paper, after a brief review of the theory of duality quantum computing, we concentrate on the applications of duality quantum computing in simulations of Hamiltonian systems. We show that duality quantum computing can efficiently simulate quantum systems, providing descriptions of the recent efficient quantum simulation algorithm of Childs and Wiebe (Quantum Inf Comput 12(11-12):901-924, 2012) for the fast simulation of quantum systems with a sparse Hamiltonian, and of the quantum simulation algorithm by Berry et al. (Phys Rev Lett 114:090502, 2015), which provides exponential improvement in precision for simulating systems with a sparse Hamiltonian.
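
The "arbitrary sum of unitaries" idea can be illustrated classically with a few lines of linear algebra (a toy sketch, not the duality-computer protocol itself; the Pauli matrices and weights below are arbitrary choices):

```python
import numpy as np

# Toy illustration of a "generalized quantum gate": a linear combination of
# unitaries A = sum_i c_i U_i applied to a state. Each term is unitary, but
# the combination generally is not, so a duality computer obtains the wanted
# output branch only with some success probability |A|psi>|^2.

X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli X
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli Z
I2 = np.eye(2, dtype=complex)

c = np.array([0.5, 0.3, 0.2])          # convex weights, sum to 1
A = c[0] * I2 + c[1] * X + c[2] * Z    # linear combination of unitaries

psi = np.array([1.0, 0.0], dtype=complex)    # input state |0>
phi = A @ psi                                # unnormalized output branch
p_success = np.linalg.norm(phi) ** 2         # probability of the wanted branch

print("success probability:", round(p_success, 4))
print("A unitary?", np.allclose(A.conj().T @ A, I2))
```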

  5. Using Computational Simulations to Confront Students' Mental Models

    ERIC Educational Resources Information Center

    Rodrigues, R.; Carvalho, P. Simeão

    2014-01-01

    In this paper we show an example of how to use a computational simulation to obtain visual feedback for students' mental models, and compare their predictions with the simulated system's behaviour. Additionally, we use the computational simulation to incrementally modify the students' mental models in order to accommodate new data,…

  6. Computer-aided simulation study of photomultiplier tubes

    NASA Technical Reports Server (NTRS)

    Zaghloul, Mona E.; Rhee, Do Jun

    1989-01-01

    A computer model that simulates the response of photomultiplier tubes (PMTs) and the associated voltage divider circuit is developed. An equivalent circuit that approximates the operation of the device is derived and then used to develop a computer simulation of the PMT. Simulation results are presented and discussed.

  7. Computer simulation and the features of novel empirical data.

    PubMed

    Lusk, Greg

    2016-04-01

    In an attempt to determine the epistemic status of computer simulation results, philosophers of science have recently explored the similarities and differences between computer simulations and experiments. One question that arises is whether and, if so, when, simulation results constitute novel empirical data. It is often supposed that computer simulation results could never be empirical or novel because simulations never interact with their targets, and cannot go beyond their programming. This paper argues against this position by examining whether, and under what conditions, the features of empiricality and novelty could be displayed by computer simulation data. I show that, to the extent that certain familiar measurement results have these features, so can some computer simulation results. PMID:27083094

  8. An agent-based approach to financial stylized facts

    NASA Astrophysics Data System (ADS)

    Shimokawa, Tetsuya; Suzuki, Kyoko; Misawa, Tadanobu

    2007-06-01

    An important challenge for financial theory in recent years is to construct more sophisticated models that are consistent with as many financial stylized facts as possible, including those that cannot be explained by traditional models. Recently, psychological studies on decision making under uncertainty, originating in Kahneman and Tversky's research, have attracted a lot of interest as key factors for explaining the financial stylized facts. These psychological results have been applied to the theory of investors' decision making and to financial equilibrium modeling. Following these behavioral financial studies, this paper proposes an agent-based equilibrium model with prospect-theoretical features of investors. Our goal is to point out the possibility that the loss-averse feature of investors explains a vast number of financial stylized facts and plays a crucial role in the price formation of financial markets. The price process endogenously generated by our model is consistent not only with the equity premium puzzle and the volatility puzzle, but also with high kurtosis, asymmetry of the return distribution, autocorrelation of return volatility, and cross-correlation between return volatility and trading volume. Moreover, using agent-based simulations, the paper also provides a rigorous explanation, from the viewpoint of a lack of market liquidity, of the size effect, whereby small-sized stocks enjoy excess returns compared to large-sized stocks.
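
As a hedged illustration of how simple agent interactions can generate non-Gaussian return series (this is a generic Kirman-style herding sketch, not the authors' model; all parameters are arbitrary):

```python
import numpy as np

# Minimal herding sketch: agents hold +1/-1 positions, occasionally imitate a
# randomly met agent or switch idiosyncratically, and the change in aggregate
# opinion drives returns. Herding of this kind tends to produce volatility
# clustering and fatter tails than i.i.d. Gaussian returns.

rng = np.random.default_rng(7)
N, T = 200, 2000
eps, delta = 0.01, 0.3                 # idiosyncratic switch / imitation probs
opinions = rng.choice([-1, 1], size=N)

returns = []
prev = opinions.mean()
for _ in range(T):
    i, j = rng.integers(N), rng.integers(N)
    if rng.random() < eps:
        opinions[i] = -opinions[i]     # idiosyncratic switch
    elif rng.random() < delta:
        opinions[i] = opinions[j]      # imitation (herding)
    cur = opinions.mean()
    returns.append(cur - prev + 0.001 * rng.standard_normal())
    prev = cur

returns = np.array(returns)
print("T =", len(returns), " std =", round(returns.std(), 5))
```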

  9. Exploring the Use of Computer Simulations in Unraveling Research and Development Governance Problems

    NASA Technical Reports Server (NTRS)

    Balaban, Mariusz A.; Hester, Patrick T.

    2012-01-01

    Understanding Research and Development (R&D) enterprise relationships and processes at a governance level is not a simple task, but valuable decision-making insight and evaluation capabilities can be gained from their exploration through computer simulations. This paper discusses current Modeling and Simulation (M&S) methods, addressing their applicability to R&D enterprise governance. Specifically, the authors analyze advantages and disadvantages of the four methodologies used most often by M&S practitioners: System Dynamics (SD), Discrete Event Simulation (DES), Agent Based Modeling (ABM), and formal Analytic Methods (AM) for modeling systems at the governance level. Moreover, the paper describes nesting models using a multi-method approach. Guidance is provided to those seeking to employ modeling techniques in an R&D enterprise for the purposes of understanding enterprise governance. Further, an example is modeled and explored for potential insight. The paper concludes with recommendations regarding opportunities for concentration of future work in modeling and simulating R&D governance relationships and processes.

  10. Computer simulation of FCC riser reactors.

    SciTech Connect

    Chang, S. L.; Golchert, B.; Lottes, S. A.; Petrick, M.; Zhou, C. Q.

    1999-04-20

    A three-dimensional computational fluid dynamics (CFD) code, ICRKFLO, was developed to simulate the multiphase reacting flow system in a fluid catalytic cracking (FCC) riser reactor. The code solves flow properties based on fundamental conservation laws of mass, momentum, and energy for gas, liquid, and solid phases. Useful phenomenological models were developed to represent the controlling FCC processes, including droplet dispersion and evaporation, particle-solid interactions, and interfacial heat transfer between gas, droplets, and particles. Techniques were also developed to facilitate numerical calculations. These techniques include a hybrid flow-kinetic treatment to include detailed kinetic calculations, a time-integral approach to overcome numerical stiffness problems of chemical reactions, and a sectional coupling and blocked-cell technique for handling complex geometry. The copyrighted ICRKFLO software has been validated with experimental data from pilot- and commercial-scale FCC units. The code can be used to evaluate the impacts of design and operating conditions on the production of gasoline and other oil products.

  11. Computational simulation of liquid rocket injector anomalies

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Singhal, A. K.; Tam, L. T.; Davidian, K.

    1986-01-01

    A computer model has been developed to analyze the three-dimensional two-phase reactive flows in liquid fueled rocket combustors. The model is designed to study the influence of liquid propellant injection nonuniformities on the flow pattern, combustion and heat transfer within the combustor. The Eulerian-Lagrangian approach for simulating polydisperse spray flow, evaporation and combustion has been used. Full coupling between the phases is accounted for. A nonorthogonal, body fitted coordinate system along with a conservative control volume formulation is employed. The physical models built into the model include a kappa-epsilon turbulence model, a two-step chemical reaction, and the six-flux radiation model. Semiempirical models are used to describe all interphase coupling terms as well as chemical reaction rates. The purpose of this study was to demonstrate an analytical capability to predict the effects of reactant injection nonuniformities (injection anomalies) on combustion and heat transfer within the rocket combustion chamber. The results show promising application of the model to comprehensive modeling of liquid propellant rocket engines.

  12. Quantitative and Qualitative Simulation in Computer Based Training.

    ERIC Educational Resources Information Center

    Stevens, Albert; Roberts, Bruce

    1983-01-01

    Computer-based systems combining quantitative simulation with qualitative tutorial techniques provide learners with sophisticated individualized training. The teaching capabilities and operating procedures of Steamer, a simulated steam plant, are described. (Author/MBR)

  13. Simulating complex intracellular processes using object-oriented computational modelling.

    PubMed

    Johnson, Colin G; Goldman, Jacki P; Gullick, William J

    2004-11-01

    The aim of this paper is to give an overview of computer modelling and simulation in cellular biology, in particular as applied to complex biochemical processes within the cell. This is illustrated by the use of the techniques of object-oriented modelling, where the computer is used to construct abstractions of objects in the domain being modelled, and these objects then interact within the computer to simulate the system and allow emergent properties to be observed. The paper also discusses the role of computer simulation in understanding complexity in biological systems, and the kinds of information which can be obtained about biology via simulation. PMID:15302205

  14. Computer Simulation Methods for Defect Configurations and Nanoscale Structures

    SciTech Connect

    Gao, Fei

    2010-01-01

    This chapter will describe general computer simulation methods, including ab initio calculations, molecular dynamics and the kinetic Monte Carlo method, and their applications to the calculations of defect configurations in various materials (metals, ceramics and oxides) and the simulations of nanoscale structures due to ion-solid interactions. The multiscale theory, modeling, and simulation techniques (both time scale and space scale) will be emphasized, and comparisons between computer simulation results and experimental observations will be made.

  15. Emergence of a snake-like structure in mobile distributed agents: an exploratory agent-based modeling approach.

    PubMed

    Niazi, Muaz A

    2014-01-01

    The body structure of snakes is composed of numerous natural components, thereby making it resilient, flexible, adaptive, and dynamic. In contrast, current computer animations as well as physical implementations of snake-like autonomous structures are typically designed to use either a single component or a relatively small number of components. As a result, not only are these artificial structures constrained by the dimensions of the constituent components, but they often also require relatively computationally intensive algorithms to model and animate. Still, these animations often lack life-like resilience and adaptation. This paper presents a solution to the problem of modeling snake-like structures by proposing an agent-based, self-organizing algorithm resulting in an emergent and surprisingly resilient dynamic structure involving minimal interagent communication. Extensive simulation experiments demonstrate the effectiveness as well as resilience of the proposed approach. The ideas originating from the proposed algorithm can not only be used for developing self-organizing animations but can also have practical applications such as in the form of complex, autonomous, evolvable robots with self-organizing, mobile components with minimal individual computational capabilities. The work also demonstrates the utility of exploratory agent-based modeling (EABM) in the engineering of artificial life-like complex adaptive systems. PMID:24701135
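
A deliberately minimal 1-D follow-the-leader sketch conveys the flavor of such self-organization (this is not the paper's algorithm; the gap d, gain k, and agent count are illustrative):

```python
# Each segment agent observes only its immediate predecessor and relaxes
# toward a fixed gap d behind it. Purely local rules produce a stable
# snake-like chain; no agent knows the global shape.

def settle_chain(n_agents=8, d=1.0, k=0.5, steps=200):
    # leader fixed at 0; followers start bunched close together
    x = [0.0] + [-0.1 * i for i in range(1, n_agents)]
    for _ in range(steps):
        for i in range(1, n_agents):
            error = (x[i - 1] - x[i]) - d   # deviation from the desired gap
            x[i] += k * error               # move to restore the gap
    return x

x = settle_chain()
gaps = [x[i - 1] - x[i] for i in range(1, len(x))]
print("gaps:", [round(g, 3) for g in gaps])
```

Because each update shrinks the gap error geometrically, the chain settles to uniform spacing d after a few hundred sweeps.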

  16. An Agent-Based Model for Studying Child Maltreatment and Child Maltreatment Prevention

    NASA Astrophysics Data System (ADS)

    Hu, Xiaolin; Puddy, Richard W.

    This paper presents an agent-based model that simulates the dynamics of child maltreatment and child maltreatment prevention. The developed model follows the principles of complex systems science and explicitly models a community and its families with multi-level factors and interconnections across the social ecology. This makes it possible to experiment with how different factors and prevention strategies can affect the rate of child maltreatment. We present the background of this work, give an overview of the agent-based model, and show some simulation results.

  17. Simulation of reliability in multiserver computer networks

    NASA Astrophysics Data System (ADS)

    Minkevičius, Saulius

    2012-11-01

    This paper is motivated by the reliability performance of multiserver computer networks. A probability limit theorem is derived for the extreme queue length in open multiserver queueing networks in heavy traffic and applied to a reliability model for multiserver computer networks, relating the time of failure of a multiserver computer network to the system parameters.
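
The extreme queue statistic studied here can be illustrated with a tiny simulation. For clarity, the sketch below uses the single-server M/M/1 special case via the Lindley recursion W_{n+1} = max(0, W_n + S_n - A_n), whereas the paper treats open multiserver networks; the rates and seed are arbitrary:

```python
import random

# Maximum waiting time over a run of an M/M/1 queue in heavy traffic
# (utilization lam/mu = 0.95), via the Lindley recursion. This is only an
# illustrative single-server analogue of the multiserver extremal statistic.

def max_wait_mm1(lam=0.95, mu=1.0, n=20000, seed=1):
    rng = random.Random(seed)
    w, w_max = 0.0, 0.0
    for _ in range(n):
        s = rng.expovariate(mu)     # service time of the current job
        a = rng.expovariate(lam)    # gap until the next arrival
        w = max(0.0, w + s - a)     # Lindley recursion for waiting time
        w_max = max(w_max, w)
    return w_max

print("max waiting time over 20000 jobs:", round(max_wait_mm1(), 2))
```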

  18. Computational simulations of vorticity enhanced diffusion

    NASA Astrophysics Data System (ADS)

    Vold, Erik L.

    1999-11-01

    Computer simulations are used to investigate a phenomenon of vorticity enhanced diffusion (VED), a net transport and mixing of a passive scalar across a prescribed vortex flow field driven by a background gradient in the scalar quantity. The central issue under study here is the increase in scalar flux down the gradient and across the vortex field. The numerical scheme uses cylindrical coordinates centered with the vortex flow which allows an exact advective solution and 1D or 2D diffusion using simple numerical methods. In the results, the ratio of transport across a localized vortex region in the presence of the vortex flow over that expected for diffusion alone is evaluated as a measure of VED. This ratio is seen to increase dramatically while the absolute flux across the vortex decreases slowly as the diffusion coefficient is decreased. Similar results are found and compared for varying diffusion coefficient, D, or vortex rotation time, τ_v, for a constant background gradient in the transported scalar vs an interface in the transported quantity, and for vortex flow fields constant in time vs flow which evolves in time from an initial state and with a Schmidt number of order unity. A simple analysis shows that for a small diffusion coefficient, the flux ratio measure of VED scales as the vortex radius over the thickness for mass diffusion in a viscous shear layer within the vortex characterized by (Dτ_v)^(1/2). The phenomenon is linear as investigated here and suggests that a significant enhancement of mixing in fluids may be a relatively simple linear process. Discussion touches on how this vorticity enhanced diffusion may be related to mixing in nonlinear turbulent flows.
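
The closing scaling argument says the flux-enhancement ratio behaves like the vortex radius R over the diffusive shear-layer thickness (D τ_v)^(1/2), so for example quartering D doubles the ratio. A quick illustrative check with hypothetical values:

```python
import math

# Order-of-magnitude estimate of the vorticity-enhanced-diffusion flux ratio,
# ratio ~ R / sqrt(D * tau_v). All numerical values below are hypothetical.

def ved_flux_ratio(R, D, tau_v):
    """Flux-enhancement estimate: vortex radius over shear-layer thickness."""
    return R / math.sqrt(D * tau_v)

# ratio scales as D^(-1/2): quartering D should double the ratio
r1 = ved_flux_ratio(R=1.0, D=1e-4, tau_v=10.0)
r2 = ved_flux_ratio(R=1.0, D=2.5e-5, tau_v=10.0)
print(round(r1, 2), round(r2, 2), "ratio:", round(r2 / r1, 2))
```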

  19. Validating agent based models through virtual worlds.

    SciTech Connect

    Lakkaraju, Kiran; Whetzel, Jonathan H.; Lee, Jina; Bier, Asmeret Brooke; Cardona-Rivera, Rogelio E.; Bernstein, Jeremy Ray Rhythm

    2014-01-01

    As the US continues its vigilance against distributed, embedded threats, understanding the political and social structure of these groups becomes paramount for predicting and disrupting their attacks. Agent-based models (ABMs) serve as a powerful tool to study these groups. While the popularity of social network tools (e.g., Facebook, Twitter) has provided extensive communication data, there is a lack of fine-grained behavioral data with which to inform and validate existing ABMs. Virtual worlds, in particular massively multiplayer online games (MMOGs), where large numbers of people interact within a complex environment for long periods of time, provide an alternative source of data. These environments provide a rich social setting where players engage in a variety of activities observed between real-world groups: collaborating and/or competing with other groups, conducting battles for scarce resources, and trading in a market economy. Strategies employed by player groups surprisingly reflect those seen in present-day conflicts, where players use diplomacy or espionage as their means for accomplishing their goals. In this project, we propose to address the need for fine-grained behavioral data by acquiring and analyzing game data from a commercial MMOG, referred to within this report as Game X. The goals of this research were: (1) devising toolsets for analyzing virtual world data to better inform the rules that govern a social ABM and (2) exploring how virtual worlds could serve as a source of data to validate ABMs established for analogous real-world phenomena. During this research, we studied certain patterns of group behavior to complement social modeling efforts where a significant lack of detailed examples of observed phenomena exists. This report outlines our work examining group behaviors that underlie what we have termed the Expression-to-Action (E2A) problem: determining the changes in social contact that lead individuals and groups to engage in a particular behavior.

  20. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.; Ziegler, C.

    1983-01-01

    A software simulator was developed to help NASA in the design of the LMSS. The simulator will be used to study the characteristics and implementation requirements of the LMSS configuration, with specifications as outlined by NASA.

  1. Hypercompetitive Environments: An Agent-based model approach

    NASA Astrophysics Data System (ADS)

    Dias, Manuel; Araújo, Tanya

    Information technology (IT) environments are characterized by complex changes and rapid evolution. Globalization and the spread of technological innovation have increased the need for new strategic information resources, both from individual firms and management environments. Improvements in multidisciplinary methods and, particularly, the availability of powerful computational tools, are giving researchers an increasing opportunity to investigate management environments in their true complex nature. The adoption of a complex systems approach allows for modeling business strategies from a bottom-up perspective — understood as resulting from repeated and local interaction of economic agents — without disregarding the consequences of the business strategies themselves for the individual behavior of enterprises and the emergence of interaction patterns between firms and management environments. Agent-based models are the leading approach in this effort.

  2. Simulation of the stress computation in shells

    NASA Technical Reports Server (NTRS)

    Salama, M.; Utku, S.

    1978-01-01

    A self-teaching computer program is described, whereby the stresses in thin shells can be computed with good accuracy using the best fit approach. The program is designed for use in interactive game mode to allow the structural engineer to learn about (1) the major sources of difficulties and associated errors in the computation of stresses in thin shells, (2) possible ways to reduce the errors, and (3) trade-off between computational cost and accuracy. Included are derivation of the computational approach, program description, and several examples illustrating the program usage.

  3. Convergence and optimization of agent-based coalition formation

    NASA Astrophysics Data System (ADS)

    Wang, Yuanshi; Wu, Hong

    2005-03-01

    In this paper, we analyze a model of agent-based coalition formation in markets. Our goal is to study the convergence of the coalition formation and to optimize agents' strategies. We show that the model has a unique steady state (equilibrium) and prove that all solutions converge to it when the maximum coalition size is no larger than three. The stability of the steady state in other cases is not analyzed, but numerical simulations are given to show the convergence. The steady state, which determines both the global system gain and the average gain per agent, is expressed in terms of the agents' strategies in the coalition formation. Through the steady state, we give the relationship between the gains and the agents' strategies, and present a series of results for the optimization of agents' strategies.

  4. Agent-based modeling in ecological economics.

    PubMed

    Heckbert, Scott; Baynes, Tim; Reeson, Andrew

    2010-01-01

    Interconnected social and environmental systems are the domain of ecological economics, and models can be used to explore feedbacks and adaptations inherent in these systems. Agent-based modeling (ABM) represents autonomous entities, each with dynamic behavior and heterogeneous characteristics. Agents interact with each other and their environment, resulting in emergent outcomes at the macroscale that can be used to quantitatively analyze complex systems. ABM is contributing to research questions in ecological economics in the areas of natural resource management and land-use change, urban systems modeling, market dynamics, changes in consumer attitudes, innovation, and diffusion of technology and management practices, commons dilemmas and self-governance, and psychological aspects to human decision making and behavior change. Frontiers for ABM research in ecological economics involve advancing the empirical calibration and validation of models through mixed methods, including surveys, interviews, participatory modeling, and, notably, experimental economics to test specific decision-making hypotheses. Linking ABM with other modeling techniques at the level of emergent properties will further advance efforts to understand dynamics of social-environmental systems. PMID:20146761

  5. Agent Based Model of Livestock Movements

    NASA Astrophysics Data System (ADS)

    Miron, D. J.; Emelyanova, I. V.; Donald, G. E.; Garner, G. M.

    The modelling of livestock movements within Australia is of national importance for the purposes of the management and control of exotic disease spread, infrastructure development and the economic forecasting of livestock markets. In this paper an agent based model for the forecasting of livestock movements is presented. The model simulates livestock movements from farm to farm through a saleyard. The decision of farmers to sell or buy cattle is often complex and involves many factors such as climate forecast, commodity prices, the type of farm enterprise, the number of animals available and associated off-shore effects. In this model the farm agent's intelligence is implemented using a fuzzy decision tree that utilises two of these factors: the livestock price fetched at the last sale and the number of stock on the farm. On each iteration of the model farms choose either to buy, sell or abstain from the market, thus creating an artificial supply and demand. The buyers and sellers then congregate at the saleyard where livestock are auctioned using a second-price sealed-bid auction. The price time series output by the model exhibits properties similar to those found in real livestock markets.
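
The saleyard mechanism described above is a second-price sealed-bid (Vickrey) auction: the highest bidder wins but pays the second-highest bid. A minimal sketch (the farm names and bid values are invented):

```python
# Second-price sealed-bid (Vickrey) auction: bidders submit sealed bids,
# the highest bidder wins, and the price paid is the second-highest bid.

def second_price_auction(bids):
    """bids: dict of bidder -> sealed bid. Returns (winner, price paid)."""
    if len(bids) < 2:
        raise ValueError("need at least two bidders")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]          # winner pays the second-highest bid
    return winner, price

bids = {"farm_a": 620.0, "farm_b": 700.0, "farm_c": 655.0}
winner, price = second_price_auction(bids)
print(winner, "wins and pays", price)   # farm_b wins and pays 655.0
```

A useful property of this design is that truthful bidding is a dominant strategy, which simplifies the modelling of agent behavior.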

  6. Cognitive Effects from Process Learning with Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Breuer, Klaus; Kummer, Ruediger

    1990-01-01

    Discusses content learning versus process learning, describes process learning with computer-based simulations, and highlights an empirical study on the effects of process learning with problem-oriented, computer-managed simulations in technical vocational education classes in West Germany. Process learning within a model of the cognitive system…

  7. Computer Simulation Models of Economic Systems in Higher Education.

    ERIC Educational Resources Information Center

    Smith, Lester Sanford

    The increasing complexity of educational operations make analytical tools, such as computer simulation models, especially desirable for educational administrators. This MA thesis examined the feasibility of developing computer simulation models of economic systems in higher education to assist decision makers in allocating resources. The report…

  8. Explore Effective Use of Computer Simulations for Physics Education

    ERIC Educational Resources Information Center

    Lee, Yu-Fen; Guo, Yuying

    2008-01-01

    The dual purpose of this article is to provide a synthesis of the findings related to the use of computer simulations in physics education and to present implications for teachers and researchers in science education. We try to establish a conceptual framework for the utilization of computer simulations as a tool for learning and instruction in…

  9. The Link between Computer Simulations and Social Studies Learning: Debriefing.

    ERIC Educational Resources Information Center

    Chiodo, John J.; Flaim, Mary L.

    1993-01-01

    Asserts that debriefing is the missing link between learning achievement and simulations in social studies. Maintains that teachers who employ computer-assisted instruction must utilize effective debriefing activities. Provides a four-step debriefing model using the computer simulation, Oregon Trail. (CFR)

  10. The Role of Computer Simulations in Engineering Education.

    ERIC Educational Resources Information Center

    Smith, P. R.; Pollard, D.

    1986-01-01

    Discusses role of computer simulation in complementing and extending conventional components of undergraduate engineering education process in United Kingdom universities and polytechnics. Aspects of computer-based learning are reviewed (laboratory simulation, lecture and tutorial support, inservice teacher education) with reference to programs in…

  11. How Effective Is Instructional Support for Learning with Computer Simulations?

    ERIC Educational Resources Information Center

    Eckhardt, Marc; Urhahne, Detlef; Conrad, Olaf; Harms, Ute

    2013-01-01

    The study examined the effects of two different instructional interventions as support for scientific discovery learning using computer simulations. In two well-known categories of difficulty, data interpretation and self-regulation, instructional interventions for learning with computer simulations on the topic "ecosystem water" were developed…

  12. Evaluation of Computer Simulations for Teaching Apparel Merchandising Concepts.

    ERIC Educational Resources Information Center

    Jolly, Laura D.; Sisler, Grovalynn

    1988-01-01

    The study developed and evaluated computer simulations for teaching apparel merchandising concepts. Evaluation results indicated that teaching method (computer simulation versus case study) does not significantly affect cognitive learning. Student attitudes varied, however, according to topic (profitable merchandising analysis versus retailing…

  13. New Pedagogies on Teaching Science with Computer Simulations

    ERIC Educational Resources Information Center

    Khan, Samia

    2011-01-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data was analyzed for (1)…

  14. Computers for real time flight simulation: A market survey

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.; Karplus, W. J.

    1977-01-01

    An extensive computer market survey was made to determine those available systems suitable for current and future flight simulation studies at Ames Research Center. The primary requirement is for the computation of relatively high frequency content (5 Hz) math models representing powered lift flight vehicles. The Rotor Systems Research Aircraft (RSRA) was used as a benchmark vehicle for computation comparison studies. The general nature of helicopter simulations and a description of the benchmark model are presented, and some of the sources of simulation difficulties are examined. A description of various applicable computer architectures is presented, along with detailed discussions of leading candidate systems and comparisons between them.

  15. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
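    A discrete event simulation of the kind described above can be illustrated, at its simplest, by a single FIFO server drawing exponential inter-arrival and service times; the rates and request counts below are arbitrary stand-ins, not parameters from the NASA model:

    ```python
    import random

    def simulate_queue(arrival_rate, service_rate, n_requests, seed=1):
        """Minimal discrete event simulation of one FIFO server: requests
        arrive with exponential inter-arrival times and need exponential
        service times.  Returns the mean time-in-system per request."""
        rng = random.Random(seed)
        clock, server_free, total_time = 0.0, 0.0, 0.0
        for _ in range(n_requests):
            clock += rng.expovariate(arrival_rate)   # next arrival event
            start = max(clock, server_free)          # wait if server is busy
            server_free = start + rng.expovariate(service_rate)
            total_time += server_free - clock        # waiting + service
        return total_time / n_requests

    # With arrival rate 0.5 and service rate 1.0, M/M/1 theory predicts a
    # mean time-in-system of 1 / (1.0 - 0.5) = 2.0.
    mean_time = simulate_queue(0.5, 1.0, 2000)
    ```

    A full model in the spirit of the paper would draw request types from a fitted demand distribution and add resource constraints, but the event-driven skeleton is the same.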

  16. Parameter estimation and sensitivity analysis in an agent-based model of Leishmania major infection

    PubMed Central

    Jones, Douglas E.; Dorman, Karin S.

    2009-01-01

    Computer models of disease take a systems biology approach toward understanding host-pathogen interactions. In particular, data driven computer model calibration is the basis for inference of immunological and pathogen parameters, assessment of model validity, and comparison between alternative models of immune or pathogen behavior. In this paper we describe the calibration and analysis of an agent-based model of Leishmania major infection. A model of macrophage loss following uptake of necrotic tissue is proposed to explain macrophage depletion following peak infection. Using Gaussian processes to approximate the computer code, we perform a sensitivity analysis to identify important parameters and to characterize their influence on the simulated infection. The analysis indicates that increasing growth rate can favor or suppress pathogen loads, depending on the infection stage and the pathogen’s ability to avoid detection. Subsequent calibration of the model against previously published biological observations suggests that L. major has a relatively slow growth rate and can replicate for an extended period of time before damaging the host cell. PMID:19837088

  17. GPU-accelerated micromagnetic simulations using cloud computing

    NASA Astrophysics Data System (ADS)

    Jermain, C. L.; Rowlands, G. E.; Buhrman, R. A.; Ralph, D. C.

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics.

  18. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    ERIC Educational Resources Information Center

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  19. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  20. Computer simulated plant design for waste minimization/pollution prevention

    SciTech Connect

    Bumble, S.

    2000-07-01

    The book discusses several paths to pollution prevention and waste minimization by using computer simulation programs. It explains new computer technologies used in the field of pollution prevention and waste management; provides information on overcoming technical, economic, and environmental barriers to waste reduction; gives case studies from industry; and covers computer-aided flow sheet design and analysis for nuclear fuel reprocessing.

  1. Creating Science Simulations through Computational Thinking Patterns

    ERIC Educational Resources Information Center

    Basawapatna, Ashok Ram

    2012-01-01

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction.…

  2. Computer Simulations as an Integral Part of Intermediate Macroeconomics.

    ERIC Educational Resources Information Center

    Millerd, Frank W.; Robertson, Alastair R.

    1987-01-01

    Describes the development of two interactive computer simulations which were fully integrated with other course materials. The simulations illustrate the effects of various real and monetary "demand shocks" on aggregate income, interest rates, and components of spending and economic output. Includes an evaluation of the simulations' effects on…

  3. Genetic Crossing vs Cloning by Computer Simulation

    NASA Astrophysics Data System (ADS)

    Dasgupta, Subinay

    We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.

  4. Genetic crossing vs cloning by computer simulation

    SciTech Connect

    Dasgupta, S.

    1997-06-01

    We perform Monte Carlo simulation using Penna's bit string model, and compare the process of asexual reproduction by cloning with that by genetic crossover. We find them to be comparable as regards survival of a species, and also if a natural disaster is simulated.
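    The Penna bit-string model used in the two records above encodes age-specific deleterious mutations as bits of an integer genome. A minimal asexual (cloning) variant might look like the following; all parameter values here are invented for illustration and are not taken from the papers:

    ```python
    import random

    def penna_step(population, genome_bits=32, threshold=3,
                   birth_age=8, rng=random):
        """One time step of a minimal asexual (cloning) Penna model.

        An individual is (age, genome); bit i of genome means a deleterious
        mutation is expressed from age i on.  Death occurs when `threshold`
        mutations are expressed or the maximum age is reached; survivors
        past `birth_age` clone themselves with one fresh random mutation."""
        next_pop = []
        for age, genome in population:
            age += 1
            expressed = bin(genome & ((1 << age) - 1)).count("1")
            if expressed >= threshold or age >= genome_bits:
                continue                           # death
            next_pop.append((age, genome))
            if age >= birth_age:                   # cloning with mutation
                next_pop.append((0, genome | (1 << rng.randrange(genome_bits))))
        return next_pop

    random.seed(1)
    population = [(0, 0)] * 50                     # 50 mutation-free newborns
    for _ in range(12):
        population = penna_step(population)
    ```

    The genetic-crossover variant compared in the papers would instead combine bit strings from two parents before applying new mutations.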

  5. Spatial Learning and Computer Simulations in Science

    ERIC Educational Resources Information Center

    Lindgren, Robb; Schwartz, Daniel L.

    2009-01-01

    Interactive simulations are entering mainstream science education. Their effects on cognition and learning are often framed by the legacy of information processing, which emphasized amodal problem solving and conceptual organization. In contrast, this paper reviews simulations from the vantage of research on perception and spatial learning,…

  6. Agent-Based Learning Environments as a Research Tool for Investigating Teaching and Learning.

    ERIC Educational Resources Information Center

    Baylor, Amy L.

    2002-01-01

    Discusses intelligent learning environments for computer-based learning, such as agent-based learning environments, and their advantages over human-based instruction. Considers the effects of multiple agents; agents and research design; the use of Multiple Intelligent Mentors Instructing Collaboratively (MIMIC) for instructional design for…

  7. Permutations of Control: Cognitive Considerations for Agent-Based Learning Environments.

    ERIC Educational Resources Information Center

    Baylor, Amy L.

    2001-01-01

    Discussion of intelligent agents and their use in computer learning environments focuses on cognitive considerations. Presents four dimension of control that should be considered in designing agent-based learning environments: learner control, from constructivist to instructivist; feedback; relationship of learner to agent; and learner confidence…

  8. Numerical Problems and Agent-Based Models for a Mass Transfer Course

    ERIC Educational Resources Information Center

    Murthi, Manohar; Shea, Lonnie D.; Snurr, Randall Q.

    2009-01-01

    Problems requiring numerical solutions of differential equations or the use of agent-based modeling are presented for use in a course on mass transfer. These problems were solved using the popular technical computing language MATLAB. Students were introduced to MATLAB via a problem with an analytical solution. A more complex problem to which no…

  9. High Fidelity Simulation of a Computer Room

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim; Chan, William; Chaderjian, Neal; Pandya, Shishir

    2005-01-01

    This viewgraph presentation reviews NASA's Columbia supercomputer and the mesh technology used to assess the adequacy of the air flow and cooling of a computer room. A technical description of the Columbia supercomputer is also presented, along with its performance capability.

  10. Some theoretical issues on computer simulations

    SciTech Connect

    Barrett, C.L.; Reidys, C.M.

    1998-02-01

    The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In the development of the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among entities of a simulation and a graph representing the equivalence classes of systems obtained by all possible updates.
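    A sequentially updated cellular automaton over a graph, the paradigm named above, can be made concrete in a few lines. The tiny path graph and parity rule below are invented; the example shows how different update orders yield different results, which is the order dependence that the paper's equivalence classes of update schedules capture:

    ```python
    def sca_step(adj, state, local_fn, order):
        """One sweep of a sequentially updated CA over a graph: vertices are
        updated one at a time in `order`, so later updates already see the
        new values of earlier ones (unlike a parallel/synchronous CA)."""
        s = list(state)
        for v in order:
            neighborhood = [s[u] for u in adj[v]] + [s[v]]
            s[v] = local_fn(neighborhood)
        return s

    # Path graph 0-1-2 with a parity rule: new value = sum of the
    # neighborhood values modulo 2.
    adj = {0: [1], 1: [0, 2], 2: [1]}
    parity = lambda nb: sum(nb) % 2
    a = sca_step(adj, [1, 0, 0], parity, [0, 1, 2])   # -> [1, 1, 1]
    b = sca_step(adj, [1, 0, 0], parity, [2, 1, 0])   # -> [0, 1, 0]
    ```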

  11. Use of advanced computers for aerodynamic flow simulation

    NASA Technical Reports Server (NTRS)

    Bailey, F. R.; Ballhaus, W. F.

    1980-01-01

    The current and projected use of advanced computers for large-scale aerodynamic flow simulation applied to engineering design and research is discussed. The design use of mature codes run on conventional, serial computers is compared with the fluid research use of new codes run on parallel and vector computers. The role of flow simulations in design is illustrated by the application of a three dimensional, inviscid, transonic code to the Sabreliner 60 wing redesign. Research computations that include a more complete description of the fluid physics by use of Reynolds averaged Navier-Stokes and large-eddy simulation formulations are also presented. Results of studies for a numerical aerodynamic simulation facility are used to project the feasibility of design applications employing these more advanced three dimensional viscous flow simulations.

  12. Parallel Computing Environments and Methods for Power Distribution System Simulation

    SciTech Connect

    Lu, Ning; Taylor, Zachary T.; Chassin, David P.; Guttromson, Ross T.; Studham, Scott S.

    2005-11-10

    The development of cost-effective high-performance parallel computing on multi-processor supercomputers makes it attractive to port excessively time-consuming simulation software from personal computers (PC) to supercomputers. The power distribution system simulator (PDSS) takes a bottom-up approach and simulates load at the appliance level, where detailed thermal models for appliances are used. This approach works well for a small power distribution system consisting of a few thousand appliances. When the number of appliances increases, the simulation uses up the PC memory and its run time increases to a point where the approach is no longer feasible for modeling a practical large power distribution system. This paper presents an effort to port a PC-based power distribution system simulator (PDSS) to a 128-processor shared-memory supercomputer. The paper offers an overview of the parallel computing environment and a description of the modifications made to the PDSS model. The performance of the PDSS running on a standalone PC and on the supercomputer is compared. Future research directions for utilizing parallel computing in power distribution system simulation are also addressed.

  13. Computational field simulation of temporally deforming geometries

    SciTech Connect

    Boyalakuntla, K.; Soni, B.K.; Thornburg, H.J.

    1996-12-31

    A NURBS based moving grid generation technique is presented to simulate temporally deforming geometries. Grid generation for a complex configuration can be a time-consuming process, and temporally varying geometries necessitate the regeneration of such a grid for every time step. The Non-Uniform Rational B-Spline (NURBS) based control point information is used for geometry description. The parametric definition of the NURBS is utilized in the development of the methodology to generate a well-distributed grid in a timely manner. The numerical simulation involving temporally deforming geometry is accomplished by appropriately linking to an unsteady, multi-block, thin-layer Navier-Stokes solver. The present method greatly reduces CPU requirements for time-dependent remeshing, facilitating the simulation of more complex unsteady problems. The current effort is the first step toward multidisciplinary design optimization, which involves coupling aerodynamics, heat transfer, and structural analysis. Applications include simulation of temporally deforming bodies.

  14. Computer simulation of water reclamation processors

    NASA Technical Reports Server (NTRS)

    Fisher, John W.; Hightower, T. M.; Flynn, Michael T.

    1991-01-01

    The development of detailed simulation models of water reclamation processors based on the ASPEN PLUS simulation program is discussed. Individual models have been developed for vapor compression distillation, vapor phase catalytic ammonia removal, and supercritical water oxidation. These models are used for predicting the process behavior. Particular attention is given to methodology which is used to complete this work, and the insights which are gained by this type of model development.

  15. Improving Agent Based Models and Validation through Data Fusion

    PubMed Central

    Laskowski, Marek; Demianyk, Bryan C.P.; Friesen, Marcia R.; McLeod, Robert D.; Mukhi, Shamir N.

    2011-01-01

    This work is contextualized in research in modeling and simulation of infection spread within a community or population, with the objective to provide a public health and policy tool for assessing the dynamics of infection spread and the qualitative impacts of public health interventions. This work uses the integration of real data sources into an Agent Based Model (ABM) to simulate respiratory infection spread within a small municipality. The novelty lies in the integration of data sources that are not necessarily obvious candidates for ABM infection spread models. The ABM is a spatial-temporal model inclusive of behavioral and interaction patterns between individual agents on a real topography. The agent behaviours (movements and interactions) are fed by census/demographic data, integrated with real data from a telecommunication service provider (cellular records) and person-person contact data obtained via a custom 3G Smartphone application that logs Bluetooth connectivity between devices. Each source provides data of varying type and granularity, thereby enhancing the robustness of the model. The work demonstrates opportunities in data mining and fusion that can be used by policy and decision makers. The data become real-world inputs into individual SIR disease spread models and variants, thereby building credible and non-intrusive models to qualitatively simulate and assess public health interventions at the population level. PMID:23569606
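    The class of agent-based SIR models the record above builds on can be reduced to a short sketch. The version below uses uniform random mixing in place of the paper's real topography, cellular, and Bluetooth contact data, and every parameter value is invented for illustration:

    ```python
    import random

    def abm_sir(n_agents=500, n_contacts=4, p_transmit=0.1,
                recovery_days=7, steps=100, seed=42):
        """Toy agent-based SIR model: each day every infected agent meets a
        few random contacts and may transmit; infection lasts a fixed number
        of days.  Returns daily (S, I, R) counts."""
        rng = random.Random(seed)
        state = ["S"] * n_agents          # susceptible / infected / recovered
        days_sick = [0] * n_agents
        state[0] = "I"                    # index case
        history = []
        for _ in range(steps):
            newly_infected = []
            for i in range(n_agents):
                if state[i] != "I":
                    continue
                for _ in range(n_contacts):          # random-mixing contacts
                    j = rng.randrange(n_agents)
                    if state[j] == "S" and rng.random() < p_transmit:
                        newly_infected.append(j)
                days_sick[i] += 1
                if days_sick[i] >= recovery_days:
                    state[i] = "R"
            for j in newly_infected:
                state[j] = "I"
            history.append((state.count("S"), state.count("I"), state.count("R")))
        return history

    history = abm_sir()
    ```

    The data-fusion contribution of the paper amounts to replacing the uniform `rng.randrange` contact choice with empirically grounded movement and contact structure.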

  16. Estimating computer communication network performance using network simulations

    SciTech Connect

    Garcia, A.B.

    1985-01-01

    A generalized queuing model simulation of store-and-forward computer communication networks is developed and implemented using Simulation Language for Alternative Modeling (SLAM). A baseline simulation model is validated by comparison with published analytic models. The baseline model is expanded to include an ACK/NAK data link protocol, four-level message precedence, finite queues, and a response traffic scenario. Network performance, as indicated by average message delay and message throughput, is estimated using the simulation model.

  17. Computer simulation results of attitude estimation of earth orbiting satellites

    NASA Technical Reports Server (NTRS)

    Kou, S. R.

    1976-01-01

    Computer simulation results of attitude estimation of Earth-orbiting satellites (including Space Telescope) subjected to environmental disturbances and noises are presented. A decomposed linear recursive filter and a Kalman filter were used as estimation tools. Six programs were developed for this simulation; all were written in the BASIC language and were run on HP 9830A and HP 9866A computers. Simulation results show that the decomposed linear recursive filter is accurate in estimation and fast in response time. Furthermore, for higher order systems, this filter has computational advantages (i.e., less integration error and roundoff error) over a Kalman filter.
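    The recursive structure shared by both estimators above can be shown with a one-dimensional Kalman filter for a (nearly) constant state; the attitude filters in the record were of course multi-dimensional, and the noise variances and measurement data here are invented:

    ```python
    import random

    def scalar_kalman(measurements, q=1e-4, r=0.05, x0=0.0, p0=1.0):
        """One-dimensional Kalman filter for a (nearly) constant state.
        q: process noise variance, r: measurement noise variance."""
        x, p = x0, p0
        estimates = []
        for z in measurements:
            p = p + q                  # predict: state modeled as constant
            k = p / (p + r)            # Kalman gain
            x = x + k * (z - x)        # correct with the measurement residual
            p = (1.0 - k) * p
            estimates.append(x)
        return estimates

    # Hypothetical noisy measurements of a true value of 1.0.
    rng = random.Random(0)
    zs = [1.0 + rng.gauss(0.0, 0.2) for _ in range(50)]
    estimates = scalar_kalman(zs)      # converges toward 1.0
    ```

    Each new measurement only updates the running estimate and its variance, which is what makes recursive filters attractive for on-board attitude estimation with limited memory.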

  18. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia system (10,240 Intel Itanium processors). The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general capability for air flow analyses in any modern computer room.

  19. Avoiding pitfalls in simulating real-time computer systems

    NASA Technical Reports Server (NTRS)

    Smith, R. S.

    1984-01-01

    The software simulation of a computer target system on a computer host system, known as an interpretive computer simulator (ICS), functionally models and implements the action of the target hardware. For an ICS to function as efficiently as possible and to avoid certain pitfalls in designing an ICS, it is important that the details of the hardware architectural design of both the target and the host computers be known. This paper discusses both host selection considerations and ICS design features that, without proper consideration, could make the resulting ICS too slow to use or too costly to maintain and expand.

  20. Pattern-Oriented Modeling of Agent-Based Complex Systems: Lessons from Ecology

    NASA Astrophysics Data System (ADS)

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-11-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  1. Demeter, persephone, and the search for emergence in agent-based models.

    SciTech Connect

    North, M. J.; Howe, T. R.; Collier, N. T.; Vos, J. R.; Decision and Information Sciences; Univ. of Chicago; PantaRei Corp.; Univ. of Illinois

    2006-01-01

    In Greek mythology, the earth goddess Demeter was unable to find her daughter Persephone after Persephone was abducted by Hades, the god of the underworld. Demeter is said to have embarked on a long and frustrating, but ultimately successful, search to find her daughter. Unfortunately, long and frustrating searches are not confined to Greek mythology. In modern times, agent-based modelers often face similar troubles when searching for agents that are to be connected to one another and when seeking appropriate target agents while defining agent behaviors. The result is a 'search for emergence' in that many emergent or potentially emergent behaviors in agent-based models of complex adaptive systems either implicitly or explicitly require search functions. This paper considers a new nested querying approach to simplifying such agent-based modeling and multi-agent simulation search problems.

  2. Pattern-oriented modeling of agent-based complex systems: Lessons from ecology

    USGS Publications Warehouse

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M.; Railsback, Steven F.; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L.

    2005-01-01

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity.

  3. Pattern-oriented modeling of agent-based complex systems: lessons from ecology.

    PubMed

    Grimm, Volker; Revilla, Eloy; Berger, Uta; Jeltsch, Florian; Mooij, Wolf M; Railsback, Steven F; Thulke, Hans-Hermann; Weiner, Jacob; Wiegand, Thorsten; DeAngelis, Donald L

    2005-11-11

    Agent-based complex systems are dynamic networks of many interacting agents; examples include ecosystems, financial markets, and cities. The search for general principles underlying the internal organization of such systems often uses bottom-up simulation models such as cellular automata and agent-based models. No general framework for designing, testing, and analyzing bottom-up models has yet been established, but recent advances in ecological modeling have come together in a general strategy we call pattern-oriented modeling. This strategy provides a unifying framework for decoding the internal organization of agent-based complex systems and may lead toward unifying algorithmic theories of the relation between adaptive behavior and system complexity. PMID:16284171

  4. Icing simulation: A survey of computer models and experimental facilities

    NASA Technical Reports Server (NTRS)

    Potapczuk, M. G.; Reinmann, J. J.

    1991-01-01

    A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading the simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.

  5. Two inviscid computational simulations of separated flow about airfoils

    NASA Technical Reports Server (NTRS)

    Barnwell, R. W.

    1976-01-01

    Two inviscid computational simulations of separated flow about airfoils are described. The basic computational method is the line relaxation finite-difference method. Viscous separation is approximated with inviscid free-streamline separation. The point of separation is specified, and the pressure in the separation region is calculated. In the first simulation, the empiricism of constant pressure in the separation region is employed. This empiricism is easier to implement with the present method than with singularity methods. In the second simulation, acoustic theory is used to determine the pressure in the separation region. The results of both simulations are compared with experiment.

  6. Computer simulator for a mobile telephone system

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1981-01-01

    A software simulator was developed to assist NASA in the design of the land mobile satellite service. Structured programming techniques were used by developing the algorithm in an ALGOL-like pseudo language and then encoding the algorithm in FORTRAN IV. The basic input data to the system is a sine wave signal, although future plans call for actual sampled voice as the input signal. The simulator is capable of studying all the possible combinations of types and modes of calls through the use of five communication scenarios: single hop system; double hop, single gateway system; double hop, double gateway system; mobile to wireline system; and wireline to mobile system. The transmitter, fading channel, and interference source simulation are also discussed.

  7. Access Control for Agent-based Computing: A Distributed Approach.

    ERIC Educational Resources Information Center

    Antonopoulos, Nick; Koukoumpetsos, Kyriakos; Shafarenko, Alex

    2001-01-01

    Discusses the mobile software agent paradigm that provides a foundation for the development of high performance distributed applications and presents a simple, distributed access control architecture based on the concept of distributed, active authorization entities (lock cells), any combination of which can be referenced by an agent to provide…

  8. An Active Learning Exercise for Introducing Agent-Based Modeling

    ERIC Educational Resources Information Center

    Pinder, Jonathan P.

    2013-01-01

    Recent developments in agent-based modeling as a method of systems analysis and optimization indicate that students in business analytics need an introduction to the terminology, concepts, and framework of agent-based modeling. This article presents an active learning exercise for MBA students in business analytics that demonstrates agent-based…

  9. Computer simulation of bounded plasma systems

    SciTech Connect

    Lawson, W.S.

    1987-03-05

    The physical and numerical problems of kinetic simulation of a bounded electrostatic plasma system in one planar dimension are examined, and solutions to them are presented. These problems include particle absorption, reflection and emission at boundaries, the solution of Poisson's equation under non-periodic boundary conditions, and the treatment of an external circuit connecting the boundaries. Some comments are also made regarding the problems of higher dimensions. The methods which are described here are implemented in a code named PDW1, which is available from Professor C.K. Birdsall, Plasma Theory and Simulation Group, Cory Hall, University of California, Berkeley, CA 94720.
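In one planar dimension, the non-periodic Poisson solve mentioned above reduces to a tridiagonal linear system. The sketch below is not taken from PDW1; the function name and unit normalization are illustrative. It solves the finite-difference form of d²φ/dx² = -ρ with Dirichlet boundary potentials using the Thomas algorithm:

```python
def solve_poisson_dirichlet(rho, dx, phi_left, phi_right):
    """Solve d2phi/dx2 = -rho on the interior nodes of a 1-D grid
    with fixed boundary potentials, via the Thomas (tridiagonal)
    algorithm. rho holds the charge density at the interior nodes."""
    n = len(rho)
    # Stencil: phi[i-1] - 2*phi[i] + phi[i+1] = -rho[i]*dx**2
    b = [-2.0] * n                     # main diagonal
    d = [-r * dx * dx for r in rho]    # right-hand side
    d[0] -= phi_left                   # fold boundary values into RHS
    d[-1] -= phi_right
    # Forward elimination (sub- and super-diagonals are all 1)
    for i in range(1, n):
        m = 1.0 / b[i - 1]
        b[i] -= m
        d[i] -= m * d[i - 1]
    # Back substitution
    phi = [0.0] * n
    phi[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        phi[i] = (d[i] - phi[i + 1]) / b[i]
    return phi
```

For a uniform charge density on [0, 1] with grounded walls, the discrete solution reproduces the analytic parabola φ(x) = x(1-x)/2 exactly, since the second-order stencil has no truncation error on quadratics.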

  10. Computer Simulations in the Science Classroom.

    ERIC Educational Resources Information Center

    Richards, John; And Others

    1992-01-01

    Explorer is an interactive environment based on a constructivist epistemology of learning that integrates animated computer models with analytic capabilities for learning science. The system includes graphs, a spreadsheet, scripting, and interactive tools. Two examples involving the dynamics of colliding objects and electric circuits illustrate…

  11. Computer Simulation of Electric Field Lines.

    ERIC Educational Resources Information Center

    Kirkup, L.

    1985-01-01

Describes a computer program that plots electric field lines. Includes program listing, sample diagrams produced on a BBC model B microcomputer (which could be produced on other microcomputers by modifying the program), and a discussion of the properties of field lines. (JN)
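The program listing itself is not reproduced in this record. As a hedged illustration, a field-line plotter of this kind typically normalizes the local field and takes fixed-length steps along it; the sketch below (function names, step size, and the termination rule are assumptions, not Kirkup's code) traces one line through a 2-D point-charge field:

```python
import math

def e_field(x, y, charges):
    """Superposed 2-D Coulomb field of point charges [(q, x0, y0), ...]
    in arbitrary units (the constant factor cancels when normalizing)."""
    ex = ey = 0.0
    for q, x0, y0 in charges:
        dx, dy = x - x0, y - y0
        r3 = (dx * dx + dy * dy) ** 1.5
        ex += q * dx / r3
        ey += q * dy / r3
    return ex, ey

def trace_field_line(x, y, charges, step=0.01, n_steps=2000):
    """Follow the field direction with fixed-length Euler steps,
    returning the visited points for plotting."""
    pts = [(x, y)]
    for _ in range(n_steps):
        ex, ey = e_field(x, y, charges)
        norm = math.hypot(ex, ey)
        if norm < 1e-12:
            break
        x += step * ex / norm
        y += step * ey / norm
        pts.append((x, y))
        # Stop when the line reaches a negative charge.
        if any(q < 0 and math.hypot(x - x0, y - y0) < step
               for q, x0, y0 in charges):
            break
    return pts
```

For a dipole, a line started just outside the positive charge follows the axis and terminates at the negative charge, which is the qualitative behavior the article's diagrams illustrate.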

  12. Using Computer Simulations to Integrate Learning.

    ERIC Educational Resources Information Center

    Liao, Thomas T.

    1983-01-01

    Describes the primary design criteria and the classroom activities involved in "The Yellow Light Problem," a minicourse on decision making in the secondary school Mathematics, Engineering and Science Achievement (MESA) program in California. Activities include lectures, discussions, science and math labs, computer labs, and development of…

  13. BSim: an agent-based tool for modeling bacterial populations in systems and synthetic biology.

    PubMed

    Gorochowski, Thomas E; Matyjaszkiewicz, Antoni; Todd, Thomas; Oak, Neeraj; Kowalska, Kira; Reid, Stephen; Tsaneva-Atanasova, Krasimira T; Savery, Nigel J; Grierson, Claire S; di Bernardo, Mario

    2012-01-01

Large-scale collective behaviors such as synchronization and coordination spontaneously arise in many bacterial populations. With systems biology attempting to understand these phenomena, and synthetic biology opening up the possibility of engineering them for our own benefit, there is growing interest in how bacterial populations are best modeled. Here we introduce BSim, a highly flexible agent-based computational tool for analyzing the relationships between single-cell dynamics and population-level features. BSim includes reference implementations of many bacterial traits to enable the quick development of new models partially built from existing ones. Unlike existing modeling tools, BSim fully considers spatial aspects of a model, allowing for the description of intricate micro-scale structures and enabling the modeling of bacterial behavior in more realistic three-dimensional, complex environments. The new opportunities that BSim opens are illustrated through several diverse examples covering: spatial multicellular computing, modeling complex environments, population dynamics of the lac operon, and the synchronization of genetic oscillators. BSim is open source software that is freely available from http://bsim-bccs.sf.net and distributed under the Open Source Initiative (OSI) recognized MIT license. Developer documentation and a wide range of example simulations are also available from the website. BSim requires Java version 1.6 or higher. PMID:22936991
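BSim's own API is not shown in this record. The minimal sketch below (all names, parameters, and the run-and-tumble rule are illustrative assumptions, not BSim code) shows the kind of single-cell motility rule from which an agent-based tool of this sort builds population-level behavior in 3-D space:

```python
import math
import random

def random_unit_vector(rng):
    """Uniformly random direction on the unit sphere."""
    while True:
        v = [rng.gauss(0.0, 1.0) for _ in range(3)]
        n = math.sqrt(sum(c * c for c in v))
        if n > 1e-12:
            return [c / n for c in v]

def run_and_tumble(n_cells=50, steps=200, dt=0.1, speed=1.0,
                   tumble_p=0.1, seed=0):
    """Each cell 'runs' along its heading and occasionally 'tumbles'
    to a new random direction; population-level spreading emerges
    from this purely local, single-cell rule."""
    rng = random.Random(seed)
    cells = [{"pos": [0.0, 0.0, 0.0],
              "dir": random_unit_vector(rng)} for _ in range(n_cells)]
    for _ in range(steps):
        for c in cells:
            if rng.random() < tumble_p:
                c["dir"] = random_unit_vector(rng)
            c["pos"] = [p + speed * dt * d
                        for p, d in zip(c["pos"], c["dir"])]
    return cells
```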

  14. Numerical simulation of supersonic wake flow with parallel computers

    SciTech Connect

    Wong, C.C.; Soetrisno, M.

    1995-07-01

Simulating a supersonic wake flow field behind a conical body is a computing-intensive task. It requires a large number of computational cells to capture the dominant flow physics and a robust numerical algorithm to obtain a reliable solution. High performance parallel computers, with their unique distributed processing and data storage capability, can meet this need: they have larger computational memory and faster computing times than conventional vector computers. We apply the PINCA Navier-Stokes code to simulate a wind-tunnel supersonic wake experiment on Intel Gamma, Intel Paragon, and IBM SP2 parallel computers. These simulations are performed to study the mean flow in the near wake region of a sharp, 7-degree half-angle, adiabatic cone at Mach number 4.3 and freestream Reynolds number of 40,600. Overall, the numerical solutions capture the general features of the supersonic laminar wake flow and compare favorably with the wind tunnel data. With a refined and clustered grid distribution in the recirculation zone, the calculated location of the rear stagnation point is consistent with the 2D axisymmetric and 3D experiments. In this study, we also demonstrate the importance of having a large local memory capacity within a computer node, and of effective utilization of the number of computer nodes, to achieve good parallel performance when simulating a complex, large-scale wake flow problem.

  15. Computer simulation of on-orbit manned maneuvering unit operations

    NASA Technical Reports Server (NTRS)

    Stuart, G. M.; Garcia, K. D.

    1986-01-01

Simulation of spacecraft on-orbit operations is discussed in reference to Martin Marietta's Space Operations Simulation laboratory's use of computer software models to drive a six-degree-of-freedom moving-base carriage and two target gimbal systems. In particular, key simulation issues and related computer software models associated with providing real-time, man-in-the-loop simulations of the Manned Maneuvering Unit (MMU) are addressed, with special attention given to how effectively these models and motion systems simulate the MMU's actual on-orbit operations. The weightless effects of the space environment require the development of entirely new devices for locomotion. Since access to space is very limited, it is necessary to design, build, and test these new devices within the physical constraints of earth using simulators. The simulation method discussed here is the technique of using computer software models to drive a Moving Base Carriage (MBC) that is capable of providing simultaneous six-degree-of-freedom motions. This method, utilized at Martin Marietta's Space Operations Simulation (SOS) laboratory, provides the ability to simulate the operation of manned spacecraft, provides the pilot with proper three-dimensional visual cues, and allows training of on-orbit operations. The purpose here is to discuss significant MMU simulation issues, the related models that were developed in response to these issues, and how effectively these models simulate the MMU's actual on-orbit operations.

  16. Advanced Simulation and Computing Business Plan

    SciTech Connect

    Rummel, E.

    2015-07-09

To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners—partners upon whom the ASC Program relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  17. Multidimensional computer simulation of Stirling cycle engines

    NASA Astrophysics Data System (ADS)

    Hall, Charles A.; Porsching, Thomas A.

    1992-07-01

    This report summarizes the activities performed under NASA-Grant NAG3-1097 during 1991. During that period, work centered on the following tasks: (1) to investigate more effective solvers for ALGAE; (2) to modify the plotting package for ALGAE; and (3) to validate ALGAE by simulating oscillating flow problems similar to those studied by Kurzweg and Ibrahim.

  18. Multidimensional computer simulation of Stirling cycle engines

    NASA Technical Reports Server (NTRS)

    Hall, Charles A.; Porsching, Thomas A.

    1992-01-01

    This report summarizes the activities performed under NASA-Grant NAG3-1097 during 1991. During that period, work centered on the following tasks: (1) to investigate more effective solvers for ALGAE; (2) to modify the plotting package for ALGAE; and (3) to validate ALGAE by simulating oscillating flow problems similar to those studied by Kurzweg and Ibrahim.

  19. Student Ecosystems Problem Solving Using Computer Simulation.

    ERIC Educational Resources Information Center

    Howse, Melissa A.

    The purpose of this study was to determine the procedural knowledge brought to, and created within, a pond ecology simulation by students. Environmental Decision Making (EDM) is an ecosystems modeling tool that allows users to pose their own problems and seek satisfying solutions. Of specific interest was the performance of biology majors who had…

  20. Process Training Derived from a Computer Simulation Theory

    ERIC Educational Resources Information Center

    Holzman, Thomas G.; And Others

    1976-01-01

    Discusses a study which investigated whether a computer simulation model could suggest subroutines that were instructable and whether instruction on these subroutines could facilitate subjects' solutions to the problem task. (JM)

  1. Development and Formative Evaluation of Computer Simulated College Chemistry Experiments.

    ERIC Educational Resources Information Center

    Cavin, Claudia S.; Cavin, E. D.

    1978-01-01

This article describes the design, preparation, and initial evaluation of a set of computer-simulated chemistry experiments. The experiments entailed the use of an atomic emission spectroscope and a single-beam visible absorption spectrophotometer. (Author/IRT)

  2. Computer Simulations as a Teaching Tool in Community Colleges

    ERIC Educational Resources Information Center

    Grimm, Floyd M., III

    1978-01-01

    Describes the implementation of a computer assisted instruction program at Harford Community College. Eight different biology simulation programs are used covering topics in ecology, genetics, biochemistry, and sociobiology. (MA)

  3. MINEXP, A Computer-Simulated Mineral Exploration Program

    ERIC Educational Resources Information Center

    Smith, Michael J.; And Others

    1978-01-01

    This computer simulation is designed to put students into a realistic decision making situation in mineral exploration. This program can be used with different exploration situations such as ore deposits, petroleum, ground water, etc. (MR)

  4. Issues in Computer Simulation in Military Maintenance Training

    ERIC Educational Resources Information Center

    Brock, John F.

    1978-01-01

    This article discusses the state of computer-based simulation, reviews the early phases of ISD, suggests that current ISD approaches are missing critical inputs, and proposes a research and development program. (Author)

  5. An Exercise in Biometrical Genetics Based on a Computer Simulation.

    ERIC Educational Resources Information Center

    Murphy, P. J.

    1983-01-01

Describes an exercise in biometrical genetics based on the noninteractive use of a computer simulation of a wheat hybridization program. Advantages of using the material in this way are also discussed. (Author/JN)

  6. Synthesized Population Databases: A US Geospatial Database for Agent-Based Models

    PubMed Central

    Wheaton, William D.; Cajka, James C.; Chasteen, Bernadette M.; Wagener, Diane K.; Cooley, Philip C.; Ganapathi, Laxminarayana; Roberts, Douglas J.; Allpress, Justine L.

    2010-01-01

Agent-based models simulate large-scale social systems. They assign behaviors and activities to “agents” (individuals) within the population being modeled and then allow the agents to interact with the environment and each other in complex simulations. Agent-based models are frequently used to simulate infectious disease outbreaks, among other uses. RTI used and extended an iterative proportional fitting method to generate a synthesized, geospatially explicit, human agent database that represents the US population in the 50 states and the District of Columbia in the year 2000. Each agent is assigned to a household; other agents make up the household occupants. For this database, RTI developed methods for (1) generating synthesized households and persons; (2) assigning agents to schools and workplaces so that complex interactions among agents as they go about their daily activities can be taken into account; and (3) generating synthesized human agents who occupy group quarters (military bases, college dormitories, prisons, nursing homes). In this report, we describe both the methods used to generate the synthesized population database and the final data structure and data content of the database. This information will provide researchers with the information they need to use the database in developing agent-based models. Portions of the synthesized agent database are available to any user upon request. RTI will extract a portion (a county, region, or state) of the database for users who wish to use this database in their own agent-based models. PMID:20505787
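The abstract names iterative proportional fitting as the core method. RTI's production pipeline is far more elaborate; the two-way sketch below (function name and convergence rule are assumptions for illustration) shows the basic idea of alternately rescaling a seed contingency table until its margins match target totals:

```python
def ipf(seed, row_targets, col_targets, tol=1e-9, max_iter=1000):
    """Iterative proportional fitting: rescale a seed table until
    its row and column sums match the target marginal totals."""
    table = [row[:] for row in seed]
    for _ in range(max_iter):
        # Scale each row to its target total.
        for i, target in enumerate(row_targets):
            s = sum(table[i])
            if s > 0:
                table[i] = [v * target / s for v in table[i]]
        # Scale each column to its target total.
        for j, target in enumerate(col_targets):
            s = sum(row[j] for row in table)
            if s > 0:
                for row in table:
                    row[j] *= target / s
        # Converged when row margins still hold after the column pass.
        if all(abs(sum(table[i]) - t) < tol
               for i, t in enumerate(row_targets)):
            return table
    return table
```

In the population-synthesis setting, the seed would come from census microdata samples and the targets from small-area marginal counts, so the fitted cells give household counts consistent with both.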

  7. GATE Monte Carlo simulation in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Rowedder, Blake Austin

The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform that provides a single code library for simulating specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic because of its lengthy calculation time. By accessing the computational resources of a cloud computing environment, GATE's runtime can be reduced to clinically feasible levels without the sizable investment of a local high-performance cluster. This study investigated reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data was initially broken up on the local computer and then uploaded to the worker nodes in the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to published values and error margins indicated that the simulation results were not affected by cluster size, and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve simulation speed suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to fall in price and grow in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will become increasingly attractive.
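An inverse power model of the kind described can be fit with ordinary least squares in log-log space. The sketch below uses only the standard library (the function name is an assumption); note that the reported figures of 53 minutes on 1 node and 3.11 minutes on 20 nodes imply an exponent of log(3.11/53)/log(20), roughly -0.95, close to the ideal -1 of perfect scaling:

```python
import math

def fit_power_law(sizes, runtimes):
    """Least-squares fit of runtime = a * n**b in log-log space,
    where n is the cluster size. Returns (a, b); b near -1 means
    near-ideal parallel scaling."""
    xs = [math.log(n) for n in sizes]
    ys = [math.log(t) for t in runtimes]
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b
```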

  8. E-laboratories : agent-based modeling of electricity markets.

    SciTech Connect

    North, M.; Conzelmann, G.; Koritarov, V.; Macal, C.; Thimmapuram, P.; Veselka, T.

    2002-05-03

    Electricity markets are complex adaptive systems that operate under a wide range of rules that span a variety of time scales. These rules are imposed both from above by society and below by physics. Many electricity markets are undergoing or are about to undergo a transition from centrally regulated systems to decentralized markets. Furthermore, several electricity markets have recently undergone this transition with extremely unsatisfactory results, most notably in California. These high stakes transitions require the introduction of largely untested regulatory structures. Suitable laboratories that can be used to test regulatory structures before they are applied to real systems are needed. Agent-based models can provide such electronic laboratories or ''e-laboratories.'' To better understand the requirements of an electricity market e-laboratory, a live electricity market simulation was created. This experience helped to shape the development of the Electricity Market Complex Adaptive Systems (EMCAS) model. To explore EMCAS' potential as an e-laboratory, several variations of the live simulation were created. These variations probed the possible effects of changing power plant outages and price setting rules on electricity market prices.

  9. An agent-based mathematical model about carp aggregation

    NASA Astrophysics Data System (ADS)

    Liang, Yu; Wu, Chao

    2005-05-01

This work presents an agent-based mathematical model to simulate the aggregation of carp, a harmful fish in North America. The model is derived from the following assumptions: (1) rather than arising from consensus among the carp involved, aggregation is a completely random and spontaneous physical behavior of numerous independent carp; (2) carp aggregation is a collective effect of inter-carp and carp-environment interactions; (3) the inter-carp interaction can be derived from statistical analysis of large-scale observed data. The proposed mathematical model is mainly based on an empirical inter-carp force field characterized by repulsion, parallel-orientation, attraction, out-of-perception, and blind zones. Based on this model, the aggregation behavior of carp is formulated, and preliminary simulation results for the aggregation of a small number of carp within a simple environment are provided. Further experiment-based validation of the model will be carried out in future work.
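The zone structure described (repulsion, parallel orientation, attraction, out-of-perception) is the classic ingredient list of such models. A minimal 2-D steering rule under those assumptions might look like the sketch below; the radii, the function name, and the equal zone weights are invented for illustration and are not taken from the paper:

```python
import math

def steer(agent, neighbors, r_repel=1.0, r_orient=3.0, r_attract=6.0):
    """Zone-based interaction rule: repel from very close neighbors,
    align with mid-range ones, move toward distant ones; neighbors
    beyond r_attract are outside the perception zone and ignored.
    Each agent is a tuple (x, y, heading). Returns the new heading."""
    x, y, h = agent
    fx = fy = 0.0
    for nx, ny, nh in neighbors:
        dx, dy = nx - x, ny - y
        d = math.hypot(dx, dy)
        if d == 0 or d > r_attract:
            continue                # self, or out of perception zone
        if d < r_repel:             # repulsion zone
            fx -= dx / d
            fy -= dy / d
        elif d < r_orient:          # parallel-orientation zone
            fx += math.cos(nh)
            fy += math.sin(nh)
        else:                       # attraction zone
            fx += dx / d
            fy += dy / d
    return math.atan2(fy, fx) if (fx or fy) else h
```

A real model of this family would also weight the zones, add a blind angle behind each fish, and calibrate the force field against the observed data the abstract mentions.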

  10. Launch Site Computer Simulation and its Application to Processes

    NASA Technical Reports Server (NTRS)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  11. Computer Simulated Growth of Icosahedral Glass

    NASA Astrophysics Data System (ADS)

    Leino, Y. A. J.; Salomaa, M. M.

    1990-01-01

One possible model for materials displaying classically forbidden symmetry properties (apart from perfect quasicrystals) is the icosahedral glass model. We simulate the random growth of two types of two-dimensional icosahedral glasses consisting of the Penrose tiles. First we restrict the growth with the arrow rules; then we let the structure develop totally freely. The diffraction patterns have a clear five-fold symmetry in both cases. The diffraction peak intensities do not differ, but the shapes of the central peaks vary depending on whether the arrow rules are imposed or not. Finally, we show that the half-width of the central peak decreases as the size of the simulation increases, until a finite disorder-limited value is reached. This behaviour agrees with that of physical quasicrystallites and contradicts perfect mathematical quasicrystals, which have Bragg peaks of zero width.

  12. Computational Aerothermodynamic Simulation Issues on Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; White, Jeffery A.

    2004-01-01

The synthesis of physical models for gas chemistry and turbulence from the structured grid codes LAURA and VULCAN into the unstructured grid code FUN3D is described. A directionally Symmetric, Total Variation Diminishing (STVD) algorithm and an entropy fix (eigenvalue limiter) keyed to the local cell Reynolds number are introduced to improve solution quality for hypersonic aeroheating applications. A simple grid-adaptation procedure is incorporated within the flow solver. Simulations of flow over an ellipsoid (perfect gas, inviscid) and the Shuttle Orbiter (viscous, chemical nonequilibrium), and comparisons to the structured grid solvers LAURA (cylinder, Shuttle Orbiter) and VULCAN (flat plate), are presented to show current capabilities. The quality of heating in 3D stagnation regions is very sensitive to algorithm options; in general, high-aspect-ratio tetrahedral elements complicate the simulation of high-Reynolds-number viscous flow compared to locally structured meshes aligned with the flow.

  13. Computer simulation of a geomagnetic substorm

    NASA Technical Reports Server (NTRS)

    Lyon, J. G.; Brecht, S. H.; Huba, J. D.; Fedder, J. A.; Palmadesso, P. J.

    1981-01-01

    A global two-dimensional simulation of a substormlike process occurring in earth's magnetosphere is presented. The results are consistent with an empirical substorm model - the neutral-line model. Specifically, the introduction of a southward interplanetary magnetic field forms an open magnetosphere. Subsequently, a substorm neutral line forms at about 15 earth radii or closer in the magnetotail, and plasma sheet thinning and plasma acceleration occur. Eventually the substorm neutral line moves tailward toward its presubstorm position.

  14. Computer simulation of surface and film processes

    NASA Technical Reports Server (NTRS)

    Tiller, W. A.

    1981-01-01

    A molecular dynamics technique based upon Lennard-Jones type pair interactions is used to investigate time-dependent as well as equilibrium properties. The case study deals with systems containing Si and O atoms. In this case a more involved potential energy function (PEF) is employed and the system is simulated via a Monte-Carlo procedure. This furnishes the equilibrium properties of the system at its interfaces and surfaces as well as in the bulk.

  15. Evaluation of a Computer Simulation in a Therapeutics Case Discussion.

    ERIC Educational Resources Information Center

    Kinkade, Raenel E.; And Others

    1995-01-01

    A computer program was used to simulate a case presentation in pharmacotherapeutics. Students (n=24) used their knowledge of the disease (glaucoma) and various topical agents on the computer program's formulary to "treat" the patient. Comparison of results with a control group found the method as effective as traditional case presentation on…

  16. Use of Computer Simulations in Microbial and Molecular Genetics.

    ERIC Educational Resources Information Center

    Wood, Peter

    1984-01-01

    Describes five computer programs: four simulations of genetic and physical mapping experiments and one interactive learning program on the genetic coding mechanism. The programs were originally written in BASIC for the VAX-11/750 V.3. mainframe computer and have been translated into Applesoft BASIC for Apple IIe microcomputers. (JN)

  17. COFLO: A Computer Aid for Teaching Ecological Simulation.

    ERIC Educational Resources Information Center

Levow, Roy B.

A computer-assisted course was designed to provide students with an understanding of modeling and simulation techniques in quantitative ecology. It deals with continuous systems and has two segments. One develops mathematical and computer tools, beginning with abstract systems and their relation to physical systems. Modeling principles are next…

  18. Coached, Interactive Computer Simulations: A New Technology for Training.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    This paper provides an overview of a prototype simulation-centered intelligent computer-based training (CBT) system--implemented using expert system technology--which provides: (1) an environment in which trainees can learn and practice complex skills; (2) a computer-based coach or mentor to critique performance, suggest improvements, and provide…

  19. Frontiers in the Teaching of Physiology. Computer Literacy and Simulation.

    ERIC Educational Resources Information Center

    Tidball, Charles S., Ed.; Shelesnyak, M. C., Ed.

    Provided is a collection of papers on computer literacy and simulation originally published in The Physiology Teacher, supplemented by additional papers and a glossary of terms relevant to the field. The 12 papers are presented in five sections. An affirmation of conventional physiology laboratory exercises, coping with computer terminology, and…

  20. Generating dynamic simulations of movement using computed muscle control.

    PubMed

    Thelen, Darryl G; Anderson, Frank C; Delp, Scott L

    2003-03-01

Computation of muscle excitation patterns that produce coordinated movements of muscle-actuated dynamic models is an important and challenging problem. Using dynamic optimization to compute excitation patterns comes at a large computational cost, which has limited the use of muscle-actuated simulations. This paper introduces a new algorithm, which we call computed muscle control, that uses static optimization along with feedforward and feedback controls to drive the kinematic trajectory of a musculoskeletal model toward a set of desired kinematics. We illustrate the algorithm by computing a set of muscle excitations that drive a 30-muscle, 3-degree-of-freedom model of pedaling to track measured pedaling kinematics and forces. Only 10 min of computer time were required to compute muscle excitations that reproduced the measured pedaling dynamics, which is over two orders of magnitude faster than conventional dynamic optimization techniques. Simulated kinematics were within 1 degree of experimental values, simulated pedal forces were within one standard deviation of measured pedal forces for nearly all of the crank cycle, and computed muscle excitations were similar in timing to measured electromyographic patterns. The speed and accuracy of this new algorithm improve the feasibility of using detailed musculoskeletal models to simulate and analyze movement. PMID:12594980
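The structure of the algorithm (feedforward plus feedback terms to obtain a desired acceleration, then static optimization to resolve muscle redundancy at each instant) can be sketched for a single joint. The gains, the sum-of-squares cost, and the closed-form solution below are illustrative assumptions, not Thelen et al.'s actual formulation:

```python
def computed_control(q, qd, q_des, qd_des, qdd_des, kp=100.0, kv=20.0):
    """Feedback-corrected desired acceleration, in the spirit of
    computed muscle control: feedforward qdd_des plus PD terms on
    the position and velocity tracking errors."""
    return qdd_des + kv * (qd_des - qd) + kp * (q_des - q)

def static_optimization(tau, moment_arms, max_forces):
    """Distribute a required joint torque across muscles by minimizing
    the sum of squared activations subject to the torque constraint.
    For this unclipped quadratic case the Lagrange-multiplier solution
    is closed form: a_i proportional to r_i * F_i."""
    w = [r * f for r, f in zip(moment_arms, max_forces)]
    lam = tau / sum(v * v for v in w)
    return [max(0.0, min(1.0, lam * v)) for v in w]
```

In the paper's full method, a musculoskeletal dynamics model maps the optimized activations back to accelerations, so the solve repeats at each time step as the simulated trajectory advances.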

  1. Teaching Macroeconomics with a Computer Simulation. Final Report.

    ERIC Educational Resources Information Center

    Dolbear, F. Trenery, Jr.

The study of macroeconomics--the determination and control of aggregative variables such as gross national product, unemployment, and inflation--may be facilitated by the use of a computer simulation policy game. An aggregative model of the economy was constructed and programmed for a computer, and (hypothetical) historical data were generated. The…

  2. The Use of Computer Simulations in High School Curricula.

    ERIC Educational Resources Information Center

    Visich, Marian, Jr.; Braun, Ludwig

    The Huntington Computer Project has developed 17 simulation games which can be used for instructional purposes in high schools. These games were designed to run on digital computers and to deal with material from either biology, physics, or social studies. Distribution was achieved through the Digital Equipment Corporation, which disseminated…

  3. A computer simulator for development of engineering system design methodologies

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  4. A New Boundary Condition for Computer Simulations of Interfacial Systems

    SciTech Connect

    Wong, Ka-Yiu; Pettitt, Bernard M.; Montgomery, B.

    2000-08-18

A new boundary condition for computer simulations of interfacial systems is presented. The simulation box used in this boundary condition is the asymmetric unit of space group Pb, and it contains only one interface. Compared to a simulation box using common periodic boundary conditions, which contains two interfaces, the number of particles in the simulation is reduced by half. This boundary condition was tested against common periodic boundary conditions in molecular dynamics simulations of liquid water interacting with hydroxylated silica surfaces. It yielded results essentially identical to those of periodic boundary conditions and consumed less CPU time for comparable statistics.

  5. A new boundary condition for computer simulations of interfacial systems

    NASA Astrophysics Data System (ADS)

    Wong, Ka-Yiu; Pettitt, B. Montgomery

    2000-08-01

A new boundary condition for computer simulations of interfacial systems is presented. The simulation box used in this boundary condition is the asymmetric unit of space group Pb, and it contains only one interface. Compared to a simulation box using common periodic boundary conditions, which contains two interfaces, the number of particles in the simulation is reduced by half. This boundary condition was tested against common periodic boundary conditions in molecular dynamics simulations of liquid water interacting with hydroxylated silica surfaces. It yielded results essentially identical to those of periodic boundary conditions and consumed less CPU time for comparable statistics.

  6. Developing Computer-Based Interactive Video Simulations on Questioning Strategies.

    ERIC Educational Resources Information Center

    Rogers, Randall; Rieff, Judith

    1989-01-01

    This article presents a rationale for development and implementation of computer based interactive videotape (CBIV) in preservice teacher education; identifies advantages of CBIV simulations over other practice exercises; describes economical production procedures; discusses implications and importance of these simulations; and makes…

  7. Computer Simulation of Incomplete-Data Interpretation Exercise.

    ERIC Educational Resources Information Center

    Robertson, Douglas Frederick

    1987-01-01

    Described is a computer simulation that was used to help general education students enrolled in a large introductory geology course. The purpose of the simulation is to learn to interpret incomplete data. Students design a plan to collect bathymetric data for an area of the ocean. Procedures used by the students and instructor are included.…

  8. Learner Perceptions of Realism and Magic in Computer Simulations.

    ERIC Educational Resources Information Center

    Hennessy, Sara; O'Shea, Tim

    1993-01-01

    Discusses the possible lack of credibility in educational interactive computer simulations. Topics addressed include "Shopping on Mars," a collaborative adventure game for arithmetic calculation that uses direct manipulation in the microworld; the Alternative Reality Kit, a graphical animated environment for creating interactive simulations; and…

  9. Preliminary Evaluation of a Computer Simulation of Long Cane Use.

    ERIC Educational Resources Information Center

    Chubon, Robert A.; Keith, Ashley D.

    1989-01-01

    Developed and evaluated long cane mobility computer simulation as visual rehabilitation training device and research tool in graduate students assigned to instruction (BI) (N=10) or enhanced instruction (EI) (N=9). Found higher percentage of EI students completed simulation task. Concluded that students registered positive understanding changes,…

  10. Enhancing Computer Science Education with a Wireless Intelligent Simulation Environment

    ERIC Educational Resources Information Center

    Cook, Diane J.; Huber, Manfred; Yerraballi, Ramesh; Holder, Lawrence B.

    2004-01-01

    The goal of this project is to develop a unique simulation environment that can be used to increase students' interest and expertise in Computer Science curriculum. Hands-on experience with physical or simulated equipment is an essential ingredient for learning, but many approaches to training develop a separate piece of equipment or software for…

  11. Plant Closings and Capital Flight: A Computer-Assisted Simulation.

    ERIC Educational Resources Information Center

    Warner, Stanley; Breitbart, Myrna M.

    1989-01-01

    A course at Hampshire College was designed to simulate the decision-making environment in which constituencies in a medium-sized city would respond to the closing and relocation of a major corporate plant. The project, constructed as a role simulation with a computer component, is described. (MLW)

  12. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

    ERIC Educational Resources Information Center

    Magin, D. J.; Reizes, J. A.

    1990-01-01

    Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

  13. Design Model for Learner-Centered, Computer-Based Simulations.

    ERIC Educational Resources Information Center

    Hawley, Chandra L.; Duffy, Thomas M.

    This paper presents a model for designing computer-based simulation environments within a constructivist framework for the K-12 school setting. The following primary criteria for the development of simulations are proposed: (1) the problem needs to be authentic; (2) the cognitive demand in learning should be authentic; (3) scaffolding supports a…

  14. Effectiveness of an Endodontic Diagnosis Computer Simulation Program.

    ERIC Educational Resources Information Center

    Fouad, Ashraf F.; Burleson, Joseph A.

    1997-01-01

    Effectiveness of a computer simulation to teach endodontic diagnosis was assessed using three groups (n=34,32,24) of dental students. All were lectured on diagnosis, pathology, and radiographic interpretation. One group then used the simulation, another had a seminar on the same material, and the third group had no further instruction. Results…

  15. The Design, Development, and Evaluation of an Evaluative Computer Simulation.

    ERIC Educational Resources Information Center

    Ehrlich, Lisa R.

    This paper discusses evaluation design considerations for a computer based evaluation simulation developed at the University of Iowa College of Medicine in Cardiology to assess the diagnostic skills of primary care physicians and medical students. The simulation developed allows for the assessment of diagnostic skills of physicians in the…

  16. Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.

    ERIC Educational Resources Information Center

    Daley, Michael; Hillier, Douglas

    1981-01-01

    Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables as…

  17. Computer simulation program is adaptable to industrial processes

    NASA Technical Reports Server (NTRS)

    Schultz, F. E.

    1966-01-01

    The Reaction kinetics ablation program /REKAP/, developed to simulate ablation of various materials, provides mathematical formulations for computer programs which can simulate certain industrial processes. The programs are based on the use of nonsymmetrical difference equations that are employed to solve complex partial differential equation systems.

  18. Investigating the Effectiveness of Computer Simulations for Chemistry Learning

    ERIC Educational Resources Information Center

    Plass, Jan L.; Milne, Catherine; Homer, Bruce D.; Schwartz, Ruth N.; Hayward, Elizabeth O.; Jordan, Trace; Verkuilen, Jay; Ng, Florrie; Wang, Yan; Barrientos, Juan

    2012-01-01

    Are well-designed computer simulations an effective tool to support student understanding of complex concepts in chemistry when integrated into high school science classrooms? We investigated scaling up the use of a sequence of simulations of kinetic molecular theory and associated topics of diffusion, gas laws, and phase change, which we designed…

  19. Simulation of Robot Kinematics Using Interactive Computer Graphics.

    ERIC Educational Resources Information Center

    Leu, M. C.; Mahajan, R.

    1984-01-01

    Development of a robot simulation program based on geometric transformation softwares available in most computer graphics systems and program features are described. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…

  20. Agent-based modeling of complex infrastructures

    SciTech Connect

    North, M. J.

    2001-06-01

    Complex Adaptive Systems (CAS) can be applied to investigate complex infrastructures and infrastructure interdependencies. The CAS model agents within the Spot Market Agent Research Tool (SMART) and Flexible Agent Simulation Toolkit (FAST) allow investigation of the electric power infrastructure, the natural gas infrastructure and their interdependencies.

  1. Computer Models Simulate Fine Particle Dispersion

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Through a NASA Seed Fund partnership with DEM Solutions Inc., of Lebanon, New Hampshire, scientists at Kennedy Space Center refined existing software to study the electrostatic phenomena of granular and bulk materials as they apply to planetary surfaces. The software, EDEM, allows users to import particles and obtain accurate representations of their shapes for modeling purposes, such as simulating bulk solids behavior, and was enhanced to be able to more accurately model fine, abrasive, cohesive particles. These new EDEM capabilities can be applied in many industries unrelated to space exploration and have been adopted by several prominent U.S. companies, including John Deere, Pfizer, and Procter & Gamble.

  2. Computer simulator for training operators of thermal cameras

    NASA Astrophysics Data System (ADS)

    Chrzanowski, Krzysztof; Krupski, Marcin

    2004-08-01

    A PC-based image generator SIMTERM developed for training operators of non-airborne military thermal imaging systems is presented in this paper. SIMTERM allows its users to generate images closely resembling thermal images of many military type targets at different scenarios obtained with the simulated thermal camera. High fidelity of simulation was achieved due to use of measurable parameters of thermal camera as input data. Two modified versions of this computer simulator developed for designers and test teams are presented, too.

  3. Computer simulation of vasectomy for wolf control

    USGS Publications Warehouse

    Haight, R.G.; Mech, L.D.

    1997-01-01

    Recovering gray wolf (Canis lupus) populations in the Lake Superior region of the United States are prompting state management agencies to consider strategies to control population growth. In addition to wolf removal, vasectomy has been proposed. To predict the population effects of different sterilization and removal strategies, we developed a simulation model of wolf dynamics using simple rules for demography and dispersal. Simulations suggested that the effects of vasectomy and removal in a disjunct population depend largely on the degree of annual immigration. With low immigration, periodic sterilization reduced pup production and resulted in lower rates of territory recolonization. Consequently, average pack size, number of packs, and population size were significantly less than those for an untreated population. Periodically removing a proportion of the population produced roughly the same trends as did sterilization; however, more than twice as many wolves had to be removed than sterilized. With high immigration, periodic sterilization reduced pup production but not territory recolonization and produced only moderate reductions in population size relative to an untreated population. Similar reductions in population size were obtained by periodically removing large numbers of wolves. Our analysis does not address the possible effects of vasectomy on larger wolf populations, but it suggests that the subject should be considered through modeling or field testing.
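
    The demographic rules described lend themselves to a toy sketch. The model below is illustrative only: the rates and treatment fractions are assumptions, not the values from Haight and Mech's model, and pack structure and immigration are omitted:

```python
def simulate(years, n0=50, litter_size=4, breeding_frac=0.3,
             mortality=0.25, sterilize_frac=0.0, remove_frac=0.0):
    """Toy wolf-population sketch with assumed rates (not the paper's):
    each year fertile animals breed, treatment is applied, then mortality."""
    n = float(n0)        # total population
    fertile = float(n0)  # animals still able to breed
    for _ in range(years):
        pups = fertile * breeding_frac * litter_size
        n += pups
        fertile += pups
        n -= n * remove_frac                 # removal culls animals outright
        fertile -= fertile * remove_frac
        fertile -= fertile * sterilize_frac  # vasectomy leaves animals in place
        n -= n * mortality
        fertile -= fertile * mortality
    return round(n)

untreated = simulate(10)
sterilized = simulate(10, sterilize_frac=0.3)
removed = simulate(10, remove_frac=0.1)
```

    Even this crude version reproduces the qualitative point: both treatments depress growth relative to the untreated population, with sterilization acting only through pup production.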

  4. Advanced computer graphic techniques for laser range finder (LRF) simulation

    NASA Astrophysics Data System (ADS)

    Bedkowski, Janusz; Jankowski, Stanislaw

    2008-11-01

    This paper shows advanced computer graphic techniques for laser range finder (LRF) simulation. The LRF is a common sensor for unmanned ground vehicles, autonomous mobile robots and security applications. The cost of the measurement system is extremely high, therefore a simulation tool was designed. The simulation gives an opportunity to execute algorithms such as obstacle avoidance [1], SLAM for robot localization [2], detection of vegetation and water obstacles in the surroundings of the robot chassis [3], and LRF measurement in a crowd of people [1]. The Axis Aligned Bounding Box (AABB) technique and an alternative technique based on CUDA (NVIDIA Compute Unified Device Architecture) are presented.

  5. Computer Simulations of Supercooled Liquids and Glasses

    NASA Astrophysics Data System (ADS)

    Kob, Walter

    Glasses are materials that are ubiquitous in our daily life. We find them in such diverse items as window panes, optical fibers, computer chips, and ceramics, all of which are oxide glasses, as well as in food, foams, polymers, and gels, which are mainly of organic nature. Roughly speaking, glasses are solid materials that have no translational or orientational order on the scale beyond O(10) diameters of the constituent particles (atoms, colloids, …) [1]. Note that these materials are not necessarily homogeneous since, e.g., alkali glasses such as Na2O-SiO2 show (disordered!) structural features on the length scale of 6-10 Å (compare to the interatomic distance of 1-2 Å) and gels can have structural inhomogeneities that extend up to macroscopic length scales.

  6. Computer simulation of digital signal modulation techniques in satellite communications

    NASA Astrophysics Data System (ADS)

    Carlson, C. D.

    1985-09-01

    A tutorial on digital signal modulation techniques used in satellite communications is presented, including computer simulation of the techniques introduced. The purpose is to introduce digital signal modulation techniques and, through computer simulation, generate statistics which represent the characteristics of the FFT for the respective signal type. Further, an analysis of the statistics of the FFTs is conducted to determine whether there is any relationship between the components of the FFTs of the different signals. The statistic used to investigate this possible relationship is the F-distribution. The computer simulation is written and conducted in the FORTRAN programming language. A copy of the program, the results of the simulation, and the statistical analysis conducted are included in the appendices.
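
    A hedged sketch of the kind of analysis described, assuming NumPy and random BPSK symbols (the original program is in FORTRAN and covers several modulation types):

```python
import numpy as np

rng = np.random.default_rng(0)

def bpsk_spectrum(n_symbols=256, samples_per_symbol=8):
    """Magnitude of the FFT of a random BPSK baseband signal."""
    symbols = rng.choice([-1.0, 1.0], size=n_symbols)
    signal = np.repeat(symbols, samples_per_symbol)
    return np.abs(np.fft.fft(signal))

spec_a = bpsk_spectrum()
spec_b = bpsk_spectrum()

# Variance ratio of the two sets of FFT magnitudes -- the F-statistic
# the abstract uses to compare spectra across signal types.
f_stat = np.var(spec_a, ddof=1) / np.var(spec_b, ddof=1)
```

    Comparing `f_stat` against F-distribution critical values would then test whether the two spectra plausibly share a common variance.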

  7. Computational methods for coupling microstructural and micromechanical materials response simulations

    SciTech Connect

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.; FANG,HUEI ELIOT; RINTOUL,MARK DANIEL; VEDULA,VENKATA R.; GLASS,S. JILL; KNOROVSKY,GERALD A.; NEILSEN,MICHAEL K.; WELLMAN,GERALD W.; SULSKY,DEBORAH; SHEN,YU-LIN; SCHREYER,H. BUCK

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  8. Computational Simulations and the Scientific Method

    NASA Technical Reports Server (NTRS)

    Kleb, Bil; Wood, Bill

    2005-01-01

    As scientific simulation software becomes more complicated, the scientific-software implementor's need for component tests from new model developers becomes more crucial. The community's ability to follow the basic premise of the Scientific Method requires independently repeatable experiments, and model innovators are in the best position to create these test fixtures. Scientific software developers also need to quickly judge the value of the new model, i.e., its cost-to-benefit ratio in terms of gains provided by the new model and implementation risks such as cost, time, and quality. This paper asks two questions. The first is whether other scientific software developers would find published component tests useful, and the second is whether model innovators think publishing test fixtures is a feasible approach.

  9. Computer simulations of adsorbed liquid crystal films

    NASA Astrophysics Data System (ADS)

    Wall, Greg D.; Cleaver, Douglas J.

    2003-01-01

    The structures adopted by adsorbed thin films of Gay-Berne particles in the presence of a coexisting vapour phase are investigated by molecular dynamics simulation. The films are adsorbed at a flat substrate which favours planar anchoring, whereas the nematic-vapour interface favours normal alignment. On cooling, a system with a high molecule-substrate interaction strength exhibits substrate-induced planar orientational ordering and considerable stratification is observed in the density profiles. In contrast, a system with weak molecule-substrate coupling adopts a director orientation orthogonal to the substrate plane, owing to the increased influence of the nematic-vapour interface. There are significant differences between the structures adopted at the two interfaces, in contrast with the predictions of density functional treatments of such systems.

  10. Computing abstraction hierarchies by numerical simulation

    SciTech Connect

    Bundy, A.; Giunchiglia, F.; Sebastiani, R.; Walsh, T.

    1996-12-31

    We present a novel method for building ABSTRIPS-style abstraction hierarchies in planning. The aim of this method is to minimize the amount of backtracking between abstraction levels. Previous approaches have determined the criticality of operator preconditions by reasoning about plans directly. Here, we adopt a simpler and faster approach where we use numerical simulation of the planning process. We demonstrate the theoretical advantages of our approach by identifying some simple properties lacking in previous approaches but possessed by our method. We demonstrate the empirical advantages of our approach by a set of four benchmark experiments using the ABTWEAK system. We compare the quality of the abstraction hierarchies generated with those built by the ALPINE and HIGHPOINT algorithms.

  11. Osmosis : a molecular dynamics computer simulation study

    NASA Astrophysics Data System (ADS)

    Lion, Thomas

    Osmosis is a phenomenon of critical importance in a variety of processes ranging from the transport of ions across cell membranes and the regulation of blood salt levels by the kidneys to the desalination of water and the production of clean energy using potential osmotic power plants. However, despite its importance and over one hundred years of study, there is ongoing confusion concerning the nature of the microscopic dynamics of the solvent particles in their transfer across the membrane. In this thesis the microscopic dynamical processes underlying osmotic pressure and concentration gradients are investigated using molecular dynamics (MD) simulations. I first present a new derivation for the local pressure that can be used for determining osmotic pressure gradients. Using this result, the steady-state osmotic pressure is studied in a minimal model for an osmotic system and the steady-state density gradients are explained using a simple mechanistic hopping model for the solvent particles. The simulation setup is then modified, allowing us to explore the timescales involved in the relaxation dynamics of the system in the period preceding the steady state. Further consideration is also given to the relative roles of diffusive and non-diffusive solvent transport in this period. Finally, in a novel modification to the classic osmosis experiment, the solute particles are driven out of equilibrium by the input of energy. The effect of this modification on the osmotic pressure and the osmotic flow is studied and we find that active solute particles can cause reverse osmosis to occur. The possibility of defining a new "osmotic effective temperature" is also considered and compared to the results of diffusive and kinetic temperatures.

  12. The computer scene generation for star simulator hardware-in-the-loop simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Yu, Hong; Du, Huijie; Lei, Jie

    2011-08-01

    The star sensor simulation system is used to test star sensor performance on the ground; it is designed for star identification and spacecraft attitude determination. A computer star scene based on an astronomical star chart is generated using OpenGL for hardware-in-the-loop simulation of the star sensor simulation system.

  13. Computation simulation of the nonlinear response of suspension bridges

    SciTech Connect

    McCallen, D.B.; Astaneh-Asl, A.

    1997-10-01

    Accurate computational simulation of the dynamic response of long-span bridges presents one of the greatest challenges facing the earthquake engineering community. The size of these structures, in terms of physical dimensions and number of main load-bearing members, makes computational simulation of transient response an arduous task. Discretization of a large bridge with general-purpose finite element software often results in a computational model of such size that excessive computational effort is required for three-dimensional nonlinear analyses. The aim of the current study was the development of efficient, computationally based methodologies for the nonlinear analysis of cable-supported bridge systems which would allow accurate characterization of a bridge with a relatively small number of degrees of freedom. This work has led to the development of a special-purpose software program for the nonlinear analysis of cable-supported bridges, and the methodologies and software are described and illustrated in this paper.

  14. Coupled computational simulation and empirical research into the foraging system of Pharaoh's ant (Monomorium pharaonis).

    PubMed

    Jackson, Duncan; Holcombe, Mike; Ratnieks, Francis

    2004-01-01

    The Pharaoh's ant (Monomorium pharaonis), a significant pest in many human environments, is phenomenally successful at locating and exploiting available food resources. Several pheromones are utilized in the self-organized foraging of this ant but most aspects of the overall system are poorly characterised. Agent-based modelling of ants as individual complex X-machines facilitates study of the mechanisms underlying the emergence of trails and aids understanding of the process. Conducting simultaneous modelling, and simulation, alongside empirical biological studies is shown to drive the research by formulating hypotheses that must be tested before the model can be verified and extended. Integration of newly characterised behavioural processes into the overall model will enable testing of general theories giving insight into division of labour within insect societies. This study aims to establish a new paradigm in computational modelling applicable to all types of multi-agent biological systems, from tissues to animal societies, as a powerful tool to accelerate basic research. PMID:15351134
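
    A minimal 1-D pheromone-trail sketch can illustrate the self-organized foraging the abstract refers to. It is not the X-machine model itself, and every rate and cell count below is an assumption: ants random-walk from the nest, biased toward pheromone; finding food lays trail, which evaporates each step.

```python
import random

def forage(steps=300, n_cells=20, food_cell=15, deposit=1.0,
           evaporation=0.02, n_ants=30, seed=3):
    """Toy 1-D trail model (illustrative parameters, not the paper's)."""
    rng = random.Random(seed)
    pher = [0.0] * n_cells
    ants = [0] * n_ants              # all ants start at the nest (cell 0)
    found = 0
    for _ in range(steps):
        for i, pos in enumerate(ants):
            # Move toward the neighbour with more pheromone (plus a floor
            # of 0.1 so the walk stays stochastic when no trail exists).
            right = pher[min(pos + 1, n_cells - 1)] + 0.1
            left = pher[max(pos - 1, 0)] + 0.1
            step = 1 if rng.random() < right / (right + left) else -1
            pos = min(max(pos + step, 0), n_cells - 1)
            if pos == food_cell:
                found += 1
                for c in range(pos + 1):  # lay trail back to the nest
                    pher[c] += deposit
                pos = 0
            ants[i] = pos
        pher = [p * (1 - evaporation) for p in pher]
    return found, pher

found, pher = forage()
```

    Once one ant hits the food, the deposited trail biases later walks, so discoveries snowball, which is the emergence-of-trails effect the coupled empirical work characterises.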

  15. Endogenizing geopolitical boundaries with agent-based modeling.

    PubMed

    Cederman, Lars-Erik

    2002-05-14

    Agent-based modeling promises to overcome the reification of actors. Whereas this common, but limiting, assumption makes a lot of sense during periods characterized by stable actor boundaries, other historical junctures, such as the end of the Cold War, exhibit far-reaching and swift transformations of actors' spatial and organizational existence. Moreover, because actors cannot be assumed to remain constant in the long run, analysis of macrohistorical processes virtually always requires "sociational" endogenization. This paper presents a series of computational models, implemented with the software package REPAST, which trace complex macrohistorical transformations of actors, be they hierarchically organized as relational networks or as collections of symbolic categories. With respect to the former, dynamic networks featuring emergent compound actors with agent compartments represented in a spatial grid capture organizational domination of the territorial state. In addition, models of "tagged" social processes allow the analyst to show how democratic states predicate their behavior on categorical traits. Finally, categorical schemata that select out politically relevant cultural traits in ethnic landscapes formalize a constructivist notion of national identity in conformance with the qualitative literature on nationalism. This "finite-agent method", representing both states and nations as higher-level structures superimposed on a lower-level grid of primitive agents or cultural traits, avoids reification of agency. Furthermore, it opens the door to explicit analysis of entity processes, such as the integration and disintegration of actors as well as boundary transformations. PMID:12011409

  16. Endogenizing geopolitical boundaries with agent-based modeling

    PubMed Central

    Cederman, Lars-Erik

    2002-01-01

    Agent-based modeling promises to overcome the reification of actors. Whereas this common, but limiting, assumption makes a lot of sense during periods characterized by stable actor boundaries, other historical junctures, such as the end of the Cold War, exhibit far-reaching and swift transformations of actors' spatial and organizational existence. Moreover, because actors cannot be assumed to remain constant in the long run, analysis of macrohistorical processes virtually always requires “sociational” endogenization. This paper presents a series of computational models, implemented with the software package REPAST, which trace complex macrohistorical transformations of actors, be they hierarchically organized as relational networks or as collections of symbolic categories. With respect to the former, dynamic networks featuring emergent compound actors with agent compartments represented in a spatial grid capture organizational domination of the territorial state. In addition, models of “tagged” social processes allow the analyst to show how democratic states predicate their behavior on categorical traits. Finally, categorical schemata that select out politically relevant cultural traits in ethnic landscapes formalize a constructivist notion of national identity in conformance with the qualitative literature on nationalism. This “finite-agent method”, representing both states and nations as higher-level structures superimposed on a lower-level grid of primitive agents or cultural traits, avoids reification of agency. Furthermore, it opens the door to explicit analysis of entity processes, such as the integration and disintegration of actors as well as boundary transformations. PMID:12011409

  17. Agent-based reasoning for distributed multi-INT analysis

    NASA Astrophysics Data System (ADS)

    Inchiosa, Mario E.; Parker, Miles T.; Perline, Richard

    2006-05-01

    Fully exploiting the intelligence community's exponentially growing data resources will require computational approaches differing radically from those currently available. Intelligence data is massive, distributed, and heterogeneous. Conventional approaches requiring highly structured and centralized data will not meet this challenge. We report on a new approach, Agent-Based Reasoning (ABR). In NIST evaluations, the use of ABR software tripled analysts' solution speed, doubled accuracy, and halved perceived difficulty. ABR makes use of populations of fine-grained, locally interacting agents that collectively reason about intelligence scenarios in a self-organizing, "bottom-up" process akin to those found in biological and other complex systems. Reproduction rules allow agents to make inferences from multi-INT data, while movement rules organize information and optimize reasoning. Complementary deterministic and stochastic agent behaviors enhance reasoning power and flexibility. Agent interaction via small-world networks - such as are found in nervous systems, social networks, and power distribution grids - dramatically increases the rate of discovering intelligence fragments that usefully connect to yield new inferences. Small-world networks also support the distributed processing necessary to address intelligence community data challenges. In addition, we have found that ABR pre-processing can boost the performance of commercial text clustering software. Finally, we have demonstrated interoperability with Knowledge Engineering systems and seen that reasoning across diverse data sources can be a rich source of inferences.
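
    The small-world interaction topology mentioned can be sketched with a Watts-Strogatz-style construction: a ring lattice whose edges are randomly rewired with small probability. The parameters below are illustrative, not those of the ABR system:

```python
import random

def watts_strogatz(n=20, k=4, p=0.1, seed=42):
    """Small-world graph sketch: each node starts linked to its k nearest
    ring neighbours, then each edge is rewired to a random target with
    probability p (illustrative parameters)."""
    rng = random.Random(seed)
    edges = set()
    for i in range(n):
        for j in range(1, k // 2 + 1):
            edges.add((i, (i + j) % n))   # directed tuples for simplicity
    rewired = set()
    for (a, b) in edges:
        if rng.random() < p:
            b = rng.randrange(n)
            while b == a or (a, b) in rewired:
                b = rng.randrange(n)      # avoid self-loops and duplicates
        rewired.add((a, b))
    return rewired

g = watts_strogatz()
```

    Even a few long-range rewirings collapse the average path length, which is why such networks spread "intelligence fragments" between agents so quickly.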

  18. Traffic simulations on parallel computers using domain decomposition techniques

    SciTech Connect

    Hanebutte, U.R.; Tentner, A.M.

    1995-12-31

    Large-scale simulations of Intelligent Transportation Systems (ITS) can only be achieved by using the computing resources offered by parallel computing architectures. Domain decomposition techniques are proposed which allow the performance of traffic simulations with the standard simulation package TRAF-NETSIM on a 128-node IBM SPx parallel supercomputer as well as on a cluster of SUN workstations. Whilst this particular parallel implementation is based on NETSIM, a microscopic traffic simulation model, the presented strategy is applicable to a broad class of traffic simulations. An outer iteration loop must be introduced in order to converge to a global solution. A performance study that utilizes a scalable test network consisting of square grids is presented, which addresses the performance penalty introduced by the additional iteration loop.
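
    The outer iteration loop described can be illustrated with a toy Schwarz-style sweep on a 1-D diffusion problem: two "subdomains" are relaxed separately (as two nodes would do in parallel) and their shared boundary values are exchanged until the global solution converges. This is a sketch of the decomposition idea, not TRAF-NETSIM's actual scheme:

```python
def solve_decomposed(n=16, iters=500):
    """Relax u'' = 0 on a 1-D grid split into two subdomains.
    Fixed ends u[0]=1, u[-1]=0; the exact solution is linear."""
    u = [0.0] * n
    u[0], u[-1] = 1.0, 0.0
    mid = n // 2
    for _ in range(iters):            # outer iteration loop
        for i in range(1, mid):       # subdomain A (would run on node A)
            u[i] = 0.5 * (u[i - 1] + u[i + 1])
        for i in range(mid, n - 1):   # subdomain B (would run on node B)
            u[i] = 0.5 * (u[i - 1] + u[i + 1])
    return u

u = solve_decomposed()
```

    Without the outer loop each subdomain would only see stale boundary data from its neighbour; iterating the exchange is exactly the convergence cost the performance study measures.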

  19. Numerical simulation of polymer flows: A parallel computing approach

    SciTech Connect

    Aggarwal, R.; Keunings, R.; Roux, F.X.

    1993-12-31

    We present a parallel algorithm for the numerical simulation of viscoelastic fluids on distributed memory computers. The algorithm has been implemented within a general-purpose commercial finite element package used in polymer processing applications. Results obtained on the Intel iPSC/860 computer demonstrate high parallel efficiency in complex flow problems. However, since the computational load is unknown a priori, load balancing is a challenging issue. We have developed an adaptive allocation strategy which dynamically reallocates the work load to the processors based upon the history of the computational procedure. We compare the results obtained with the adaptive and static scheduling schemes.
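
    The history-based reallocation idea can be sketched as follows. The proportional-to-observed-speed rule is an assumption for illustration, not the authors' published scheme:

```python
def rebalance(loads, times):
    """Reassign work in proportion to each processor's observed speed
    (elements processed per second last round), keeping the total fixed.
    Illustrative rule, not the paper's exact algorithm."""
    total = sum(loads)
    speeds = [l / t for l, t in zip(loads, times)]
    share = sum(speeds)
    new = [max(1, round(total * s / share)) for s in speeds]
    new[-1] += total - sum(new)   # absorb rounding so the total is preserved
    return new

# A processor that took 3x as long gets a third of the work next round.
new = rebalance([100, 100], [1.0, 3.0])   # -> [150, 50]
```
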

  20. Large Scale Computer Simulation of Erythrocyte Membranes

    NASA Astrophysics Data System (ADS)

    Harvey, Cameron; Revalee, Joel; Laradji, Mohamed

    2007-11-01

    The cell membrane is crucial to the life of the cell. Apart from partitioning the inner and outer environments of the cell, it also acts as a support of complex and specialized molecular machinery, important both for the mechanical integrity of the cell and for its multitude of physiological functions. Due to its relative simplicity, the red blood cell has been a favorite experimental prototype for investigations of the structural and functional properties of the cell membrane. The erythrocyte membrane is a composite quasi two-dimensional structure composed essentially of a self-assembled fluid lipid bilayer and a polymerized protein meshwork, referred to as the cytoskeleton or membrane skeleton. In the case of the erythrocyte, the polymer meshwork is mainly composed of spectrin, anchored to the bilayer through specialized proteins. Using a coarse-grained model, recently developed by us, of self-assembled lipid membranes with implicit solvent and using soft-core potentials, we simulated large-scale red-blood-cell bilayers with dimensions ~10^-1 μm^2, with explicit cytoskeleton. Our aim is to investigate the renormalization of the elastic properties of the bilayer due to the underlying spectrin meshwork.

  1. Measure of Landscape Heterogeneity by Agent-Based Methodology

    NASA Astrophysics Data System (ADS)

    Wirth, E.; Szabó, Gy.; Czinkóczky, A.

    2016-06-01

    With the rapid increase of the world's population, efficient food production is one of the key factors of human survival. Since biodiversity and heterogeneity are the basis of sustainable agriculture, the authors tried to measure the heterogeneity of a chosen landscape. The EU farming and subsidizing policies (EEA, 2014) support landscape heterogeneity and diversity; nevertheless, exact measurements and calculations, apart from statistical parameters (standard deviation, mean), do not really exist. In the present paper the authors' goal is to find an objective, dynamic method that measures landscape heterogeneity. It is achieved with so-called agent-based modelling, where randomly dispatched dynamic scouts record the observed land cover parameters and sum up the features of a new type of land. During the simulation the agents collect a Monte Carlo integral as a diversity landscape potential which can be considered the unit of the 'greening' measure. As a final product of the ABM method, a landscape potential map is obtained that can serve as a tool for objective decision making to support agricultural diversity.
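
    A hedged sketch of the scout idea: random agents sample land-cover cells, and a Shannon-entropy estimate accumulates as the Monte Carlo diversity measure. The paper's exact landscape potential is not specified here; this substitutes Shannon entropy as the heterogeneity statistic:

```python
import math
import random

def scout_diversity(landscape, n_scouts=2000, seed=7):
    """Monte Carlo diversity estimate: 'scout' agents sample random cells
    and the Shannon entropy of observed cover classes is returned
    (illustrative statistic, not the paper's exact potential)."""
    rng = random.Random(seed)
    rows, cols = len(landscape), len(landscape[0])
    counts = {}
    for _ in range(n_scouts):
        cell = landscape[rng.randrange(rows)][rng.randrange(cols)]
        counts[cell] = counts.get(cell, 0) + 1
    return -sum((c / n_scouts) * math.log(c / n_scouts)
                for c in counts.values())

# Monoculture vs. mixed land cover on a 10x10 grid.
uniform = [["wheat"] * 10 for _ in range(10)]
mixed = [["wheat", "forest", "water", "meadow"] * 2 + ["wheat", "forest"]
         for _ in range(10)]
```

    A monoculture scores zero while a mixed landscape scores high, giving the kind of objective, map-ready heterogeneity value the abstract proposes.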

  2. Agent-based Transaction management for Mobile Multidatabase

    SciTech Connect

    Ongtang, Machigar; Hurson, Ali R.; Jiao, Yu; Potok, Thomas E

    2007-01-01

    The requirements to access and manipulate data across multiple heterogeneous existing databases and the proliferation of mobile technologies have propelled the development of mobile multidatabase systems (MDBS). In that environment, transaction management is not a trivial task due to the technological constraints. Agent technology is an evolving research area which has been applied to several application domains. This paper proposes an Agent-based Transaction Management for Mobile Multidatabase (AT3M) system. AT3M applies static and mobile agents to manage transaction processing in a mobile multidatabase system. It enables fully distributed transaction management, accommodates the mobility of mobile clients, and allows global subtransactions to be processed in parallel. The proposed algorithm utilizes the hierarchical metadata structure of the Summary Schema Model (SSM), which captures semantic information about data objects in the underlying local databases at different levels of abstraction. It is shown by simulation that AT3M is well suited to the mobile multidatabase environment and outperforms, in many respects, the existing V-Locking algorithm designed for the same environment.

  3. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    PubMed

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways for incorporating essential human elements in decision making processes for modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating the perceptions of stakeholders (qualitative) into formal simulation models (quantitative), with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. PMID:25622296

  4. Evaluating Water Demand Using Agent-Based Modeling

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.

    2004-12-01

    The supply and demand of water resources are functions of complex, inter-related systems including hydrology, climate, demographics, economics, and policy. To assess the safety and sustainability of water resources, planners often rely on complex numerical models that relate some or all of these systems using mathematical abstractions. The accuracy of these models relies on how well the abstractions capture the true nature of the system interactions. Typically, these abstractions are based on analyses of observations and/or experiments that account only for the statistical mean behavior of each system. This limits the approach in two important ways: (1) it cannot capture cross-system disruptive events, such as major drought, significant policy change, or terrorist attack, and (2) it cannot resolve sub-system level responses. To overcome these limitations, we are developing an agent-based water resources model that includes the systems of hydrology, climate, demographics, economics, and policy, to examine water demand during normal and extraordinary conditions. Agent-based modeling (ABM) develops functional relationships between systems by modeling the interaction between individuals (agents), who behave according to a probabilistic set of rules. ABM is a "bottom-up" modeling approach in that it defines macro-system behavior by modeling the micro-behavior of individual agents. While each agent's behavior is often simple and predictable, the aggregate behavior of all agents in each system can be complex, unpredictable, and different from the behaviors observed in mean-behavior models. Furthermore, the ABM approach creates a virtual laboratory where the effects of policy changes and/or extraordinary events can be simulated. Our model, which is based on the demographics and hydrology of the Middle Rio Grande Basin in the state of New Mexico, includes agent groups of residential, agricultural, and industrial users. Each agent within each group determines its water usage
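    The "bottom-up" aggregation described above can be sketched in a few lines: individual users draw stochastic daily demands, and macro-level demand emerges from summing them, with a policy event modeled as a restriction factor. The agent classes, base usages and the uniform variation rule are illustrative assumptions, not the Middle Rio Grande model itself.

```python
import random

class WaterUser:
    """An agent whose daily water demand follows a simple probabilistic rule."""
    def __init__(self, base_usage, rng):
        self.base_usage = base_usage  # litres per day
        self.rng = rng
    def demand(self, restriction=1.0):
        # Stochastic day-to-day variation around the base usage, scaled by
        # any policy-imposed restriction factor.
        return self.base_usage * self.rng.uniform(0.8, 1.2) * restriction

def total_demand(agents, restriction=1.0):
    """Macro-level demand emerges from aggregating the individual agents."""
    return sum(a.demand(restriction) for a in agents)

rng = random.Random(42)
residential = [WaterUser(300, rng) for _ in range(50)]
agricultural = [WaterUser(5000, rng) for _ in range(10)]
industrial = [WaterUser(2000, rng) for _ in range(5)]
agents = residential + agricultural + industrial

normal = total_demand(agents)
drought = total_demand(agents, restriction=0.6)  # a 40% mandated cut
print(drought < normal)  # True
```

Replacing the uniform rule with richer behavioral rules (price response, conservation habits, neighbor imitation) is what makes the aggregate dynamics diverge from a mean-behavior model.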

  5. Agent Based Intelligence in a Tetrahedral Rover

    NASA Technical Reports Server (NTRS)

    Phelps, Peter; Truszkowski, Walt

    2007-01-01

    A tetrahedron is a 4-node, 6-strut pyramid structure which is being used by the NASA Goddard Space Flight Center as the basic building block for a new approach to robotic motion. The struts are extendable; it is through the sequence of activities (strut extension, shifting the center of gravity, and falling) that the tetrahedron "moves". Currently, strut extension is handled by human remote control. There is an effort underway to make the movement of the tetrahedron autonomous, driven by an attempt to achieve a goal. The approach being taken is to associate an intelligent agent with each node. Thus, the autonomous tetrahedron is realized as a constrained multi-agent system, where the constraints arise from the fact that between any two agents there is an extendable strut. The hypothesis of this work is that, by proper composition of such automated tetrahedra, robotic structures of various levels of complexity can be developed that will support more complex dynamic motions. This is the basis of the new approach to robotic motion under investigation. A Java-based simulator for the single tetrahedron, realized as a constrained multi-agent system, has been developed and evaluated. This paper reports on this project and presents a discussion of the structure and dynamics of the simulator.

  6. New Pedagogies on Teaching Science with Computer Simulations

    NASA Astrophysics Data System (ADS)

    Khan, Samia

    2011-06-01

    Teaching science with computer simulations is a complex undertaking. This case study examines how an experienced science teacher taught chemistry using computer simulations and the impact of his teaching on his students. Classroom observations over 3 semesters, teacher interviews, and student surveys were collected. The data were analyzed for (1) patterns in teacher-student-computer interactions, and (2) the outcomes of these interactions on student learning. Using Technological Pedagogical Content Knowledge (TPCK) as a theoretical framework, analysis of the data indicates that computer simulations were employed in a unique instructional cycle across 11 topics in the science curriculum and that several teacher-developed heuristics were important in guiding the pedagogical approach. The teacher followed a pattern of "generate-evaluate-modify" (GEM) to teach chemistry, and simulation technology (T) was integrated in every stage of GEM (or T-GEM). Analysis of the student survey suggested that engagement with T-GEM enhanced conceptual understanding of chemistry. The author postulates the affordances of computer simulations and suggests T-GEM and its heuristics as an effective and viable pedagogy for teaching science with technology.

  7. Computer simulation tests of optimized neutron powder diffractometer configurations

    NASA Astrophysics Data System (ADS)

    Cussen, L. D.; Lieutenant, K.

    2016-06-01

    Recent work has developed a new mathematical approach to optimally choose beam elements for constant wavelength neutron powder diffractometers. This article compares Monte Carlo computer simulations of existing instruments with simulations of instruments using configurations chosen using the new approach. The simulations show that large performance improvements over current best practice are possible. The tests here are limited to instruments optimized for samples with a cubic structure which differs from the optimization for triclinic structure samples. A novel primary spectrometer design is discussed and simulation tests show that it performs as expected and allows a single instrument to operate flexibly over a wide range of measurement resolution.

  8. A computer-based simulator of the atmospheric turbulence

    NASA Astrophysics Data System (ADS)

    Konyaev, Petr A.

    2015-11-01

    Computer software for modeling the atmospheric turbulence is developed on the basis of a time-varying random medium simulation algorithm and a split-step Fourier transform method for solving a wave propagation equation. A judicious choice of the simulator parameters, like the velocity of the evolution and motion of the medium, turbulence spectrum and scales, enables different effects of a random medium on the optical wavefront to be simulated. The implementation of the simulation software is shown to be simple and efficient due to parallel programming functions from the MKL Intel ® Parallel Studio libraries.
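    The propagation scheme named above alternates two operations per slab of medium: a diffraction step applied in the Fourier domain and a phase perturbation applied in the spatial domain. The sketch below is a minimal 1-D stand-in, using a naive DFT to stay dependency-free and a Gaussian random phase screen as a crude placeholder for a proper turbulence spectrum; the grid, wavelength and screen strength are invented parameters.

```python
import cmath
import math
import random

def dft(x, inverse=False):
    """Naive discrete Fourier transform (stands in for an FFT to stay stdlib-only)."""
    n = len(x)
    s = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(s * 2j * math.pi * j * k / n) for k in range(n))
           for j in range(n)]
    return [v / n for v in out] if inverse else out

def split_step(field, n_steps, dz, k0, dx, rng, phase_sigma=0.05):
    """Split-step propagation: diffraction in the Fourier domain, then a
    random phase screen standing in for one slab of turbulent medium."""
    n = len(field)
    # Spatial frequencies in DFT ordering (positive, then negative).
    kx = [2 * math.pi * (j if j < n // 2 else j - n) / (n * dx) for j in range(n)]
    for _ in range(n_steps):
        spectrum = dft(field)
        spectrum = [spectrum[j] * cmath.exp(-1j * kx[j] ** 2 * dz / (2 * k0))
                    for j in range(n)]
        field = dft(spectrum, inverse=True)
        field = [f * cmath.exp(1j * rng.gauss(0.0, phase_sigma)) for f in field]
    return field

rng = random.Random(0)
n = 32
beam = [cmath.exp(-((j - n / 2) * 0.2) ** 2) for j in range(n)]  # Gaussian beam
power_in = sum(abs(f) ** 2 for f in beam)
out = split_step(beam, n_steps=4, dz=10.0, k0=2 * math.pi / 1.55e-6, dx=1e-3, rng=rng)
power_out = sum(abs(f) ** 2 for f in out)
print(abs(power_in - power_out) < 1e-6 * power_in)  # phase-only steps conserve power
```

A real simulator would use a 2-D FFT and draw phase screens from a Kolmogorov-type turbulence spectrum, which is where the simulator parameters mentioned in the abstract enter.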

  9. Executive Summary: Special Section on Credible Computational Fluid Dynamics Simulations

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.

    1998-01-01

    This summary presents the motivation for the Special Section on the credibility of computational fluid dynamics (CFD) simulations, its objective, its background and context, its content, and its major conclusions. Verification and validation (V&V) are the processes for establishing the credibility of CFD simulations. Validation assesses whether correct things are performed and verification assesses whether they are performed correctly. Various aspects of V&V are discussed. Progress is made in verification of simulation models. Considerable effort is still needed for developing a systematic validation method that can assess the credibility of simulated reality.

  10. Usage of a reconfigurable computer to simulate multiparticle systems

    NASA Astrophysics Data System (ADS)

    Fragner, Heinrich

    2007-03-01

    In this article we focus on the implementation of a Lattice Monte Carlo simulation for a generic pair potential within a reconfigurable computing platform. The approach presented was used to simulate a specific soft matter system. We found the performed simulations to be in excellent accordance with previous theoretical and simulation studies. By taking advantage of the shortened processing time, we were also able to find new micro- and macroscopic properties of this system. Furthermore we analyzed analytically the effects of the spatial discretization introduced by the Lattice Monte Carlo algorithm.
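    A Lattice Monte Carlo simulation for a generic pair potential follows the Metropolis recipe: propose a particle move on the lattice, compute the energy change under the pair potential, and accept with probability min(1, exp(-βΔE)). The sketch below illustrates that kernel in software; the hardware-offload aspect, the specific potential and all parameters here are illustrative, not the article's system.

```python
import math
import random

def soft_potential(d):
    """An illustrative soft repulsive pair potential on lattice distance."""
    return 1.0 / (1.0 + d)

def lattice_mc(L, n_particles, beta, steps, pair_potential, seed=0):
    """Metropolis Lattice Monte Carlo: particles on an L x L periodic lattice
    interacting through a generic pair potential."""
    rng = random.Random(seed)
    occ = set()
    while len(occ) < n_particles:
        occ.add((rng.randrange(L), rng.randrange(L)))

    def energy_of(site, others):
        # Minimum-image Manhattan distance on the periodic lattice.
        e = 0.0
        for o in others:
            dx = min(abs(site[0] - o[0]), L - abs(site[0] - o[0]))
            dy = min(abs(site[1] - o[1]), L - abs(site[1] - o[1]))
            e += pair_potential(dx + dy)
        return e

    accepted = 0
    for _ in range(steps):
        old = rng.choice(sorted(occ))          # sorted() keeps runs reproducible
        new = (rng.randrange(L), rng.randrange(L))
        if new in occ:
            continue
        rest = occ - {old}
        d_e = energy_of(new, rest) - energy_of(old, rest)
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            occ.remove(old)
            occ.add(new)
            accepted += 1
    return occ, accepted

final, acc = lattice_mc(L=10, n_particles=5, beta=1.0, steps=200,
                        pair_potential=soft_potential)
print(len(final))  # 5
```

The energy-difference evaluation in the inner loop is exactly the part that benefits from a reconfigurable-hardware implementation, since it dominates the cost per Monte Carlo step.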

  11. Advanced ERS design using computer simulation

    SciTech Connect

    Melhem, G.A.

    1995-12-31

    There are two schools of thought regarding pressure relief design, shortcut/simplified methods and detailed methods. The shortcut/simplified methods are mostly applicable to non-reactive systems. These methods use direct scale-up techniques to obtain a vent size. Little useful information can be obtained for reaction data such as onset temperatures, activation energy, decomposition stoichiometry, etc. In addition, this approach does not readily provide the ability to perform what-if and sensitivity analysis or data that can be used for post-release mitigation design. The detailed approach advocates a more fundamental approach to pressure relief design, especially for reactive systems. First, the reaction chemistry is qualified using small scale experiments and then this data is coupled with fluid dynamics to design the emergency relief system. In addition to vent sizing information, this approach provides insights into process modification and refinement as well as the establishment of a safe operating envelope. This approach provides necessary flow data for vent containment design (if required), structural support, etc. This approach also allows the direct evaluation of design sensitivity to variables such as temperature, pressure, composition, fill level, etc. on vent sizing while the shortcut approach requires an additional experiment per what-if scenario. This approach meets DIERS technology requirements for two-phase flow and vapor/liquid disengagement and exceeds it in many key areas for reacting systems such as stoichiometry estimation for decomposition reactions, non-ideal solution effects, continuing reactions in piping and vent containment systems, etc. This paper provides an overview of our proposed equation-of-state-based modeling approach and its computer code implementation. Numerous examples and model validations are also described. 42 refs., 23 figs., 9 tabs.

  12. Agent-Based Modeling of Growth Processes

    ERIC Educational Resources Information Center

    Abraham, Ralph

    2014-01-01

    Growth processes abound in nature, and are frequently the target of modeling exercises in the sciences. In this article we illustrate an agent-based approach to modeling, in the case of a single example from the social sciences: bullying.

  13. Combining high performance simulation, data acquisition, and graphics display computers

    NASA Technical Reports Server (NTRS)

    Hickman, Robert J.

    1989-01-01

    Issues involved in the continuing development of an advanced simulation complex are discussed. This approach provides the capability to perform the majority of tests on advanced systems, non-destructively. The controlled test environments can be replicated to examine the response of the systems under test to alternative treatments of the system control design, or test the function and qualification of specific hardware. Field tests verify that the elements simulated in the laboratories are sufficient. The digital computer is hosted by a Digital Equipment Corp. MicroVAX computer with an Aptec Computer Systems Model 24 I/O computer performing the communication function. An Applied Dynamics International AD100 performs the high speed simulation computing and an Evans and Sutherland PS350 performs on-line graphics display. A Scientific Computer Systems SCS40 acts as a high performance FORTRAN program processor to support the complex, by generating numerous large files from programs coded in FORTRAN that are required for the real time processing. Four programming languages are involved in the process: FORTRAN, ADSIM, ADRIO, and STAPLE. FORTRAN is employed on the MicroVAX host to initialize and terminate the simulation runs on the system. The generation of the data files on the SCS40 also is performed with FORTRAN programs. ADSIM and ADRIO are used to program the processing elements of the AD100 and its IOCP processor. STAPLE is used to program the Aptec DIP and DIA processors.

  14. Modeling and Computer Simulation: Molecular Dynamics and Kinetic Monte Carlo

    SciTech Connect

    Wirth, B.D.; Caturla, M.J.; Diaz de la Rubia, T.

    2000-10-10

    Recent years have witnessed tremendous advances in the realistic multiscale simulation of complex physical phenomena, such as irradiation and aging effects of materials, made possible by the enormous progress achieved in computational physics for calculating reliable, yet tractable interatomic potentials and the vast improvements in computational power and parallel computing. As a result, computational materials science is emerging as an important complement to theory and experiment to provide fundamental materials science insight. This article describes the atomistic modeling techniques of molecular dynamics (MD) and kinetic Monte Carlo (KMC), and an example of their application to radiation damage production and accumulation in metals. It is important to note at the outset that the primary objective of atomistic computer simulation should be obtaining physical insight into atomic-level processes. Classical molecular dynamics is a powerful method for obtaining insight about the dynamics of physical processes that occur on relatively short time scales. Current computational capability allows treatment of atomic systems containing as many as 10^9 atoms for times on the order of 100 ns (10^-7 s). The main limitation of classical MD simulation is the relatively short times accessible. Kinetic Monte Carlo provides the ability to reach macroscopic times by modeling diffusional processes and time-scales rather than individual atomic vibrations. Coupling MD and KMC has developed into a powerful, multiscale tool for the simulation of radiation damage in metals.
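    KMC reaches long times because each iteration advances the clock by the stochastic waiting time between events rather than by a fixed atomic-vibration timestep. The residence-time algorithm can be sketched as below; the single-vacancy 1-D hopping system and its rate are illustrative assumptions, not the article's radiation-damage model.

```python
import math
import random

def kmc_step(rates, rng):
    """One residence-time KMC step: pick an event with probability
    proportional to its rate, and advance the clock by an exponentially
    distributed waiting time."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, rate in enumerate(rates):
        acc += rate
        if r < acc:
            break
    dt = -math.log(rng.random()) / total
    return i, dt

def simulate_hops(n_steps, hop_rate, seed=0):
    """A single vacancy hopping on a 1-D lattice with equal left/right rates."""
    rng = random.Random(seed)
    x, t = 0, 0.0
    for _ in range(n_steps):
        event, dt = kmc_step([hop_rate, hop_rate], rng)
        x += 1 if event == 0 else -1
        t += dt
    return x, t

x, t = simulate_hops(1000, hop_rate=1.0)
print(x, t)  # net displacement and total simulated time
```

With both rates equal to 1, the mean waiting time per step is 1/2, so 1000 steps cover roughly 500 time units regardless of how slow the individual events are; this decoupling of wall-clock cost from physical time is the point of KMC.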

  15. Mesoscopic effects in an agent-based bargaining model in regular lattices.

    PubMed

    Poza, David J; Santos, José I; Galán, José M; López-Paredes, Adolfo

    2011-01-01

    The effect of spatial structure has been proved very relevant in repeated games. In this work we propose an agent based model where a fixed finite population of tagged agents play iteratively the Nash demand game in a regular lattice. The model extends the multiagent bargaining model by Axtell, Epstein and Young modifying the assumption of global interaction. Each agent is endowed with a memory and plays the best reply against the opponent's most frequent demand. We focus our analysis on the transient dynamics of the system, studying by computer simulation the set of states in which the system spends a considerable fraction of the time. The results show that all the possible persistent regimes in the global interaction model can also be observed in this spatial version. We also find that the mesoscopic properties of the interaction networks that the spatial distribution induces in the model have a significant impact on the diffusion of strategies, and can lead to new persistent regimes different from those found in previous research. In particular, community structure in the intratype interaction networks may cause that communities reach different persistent regimes as a consequence of the hindering diffusion effect of fluctuating agents at their borders. PMID:21408019
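    The core dynamic above, memory-based best reply to a lattice neighbour's most frequent demand in the Nash demand game, can be sketched as follows. The three demand levels, memory length, lattice size and random neighbour pairing are illustrative choices (the paper's tagged agents and exact matching protocol are not reproduced).

```python
import random
from collections import Counter, deque

DEMANDS = (30, 50, 70)  # low, medium, high shares of a pie of 100

def best_reply(memory):
    """Best reply to the opponent's most frequent remembered demand:
    demand exactly the remainder of the pie."""
    most_frequent = Counter(memory).most_common(1)[0][0]
    return 100 - most_frequent

def play_lattice(L, mem_len, rounds, seed=0):
    """Agents on an L x L torus repeatedly play the Nash demand game
    against random lattice neighbours, each remembering recent demands."""
    rng = random.Random(seed)
    memory = {(i, j): deque([rng.choice(DEMANDS)], maxlen=mem_len)
              for i in range(L) for j in range(L)}
    for _ in range(rounds):
        for (i, j) in memory:
            ni, nj = rng.choice([((i + 1) % L, j), ((i - 1) % L, j),
                                 (i, (j + 1) % L), (i, (j - 1) % L)])
            d1 = best_reply(memory[(i, j)])
            d2 = best_reply(memory[(ni, nj)])
            memory[(i, j)].append(d2)       # each side records the other's demand
            memory[(ni, nj)].append(d1)
    return memory

mem = play_lattice(L=6, mem_len=5, rounds=50)
demands = [best_reply(m) for m in mem.values()]
print(sorted(Counter(demands).items()))  # distribution of current demands
```

Persistent regimes correspond to the demand distributions the lattice settles into, e.g. an equity regime where medium demands dominate, or fractious patches of alternating high and low demands at community borders.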

  16. Mesoscopic Effects in an Agent-Based Bargaining Model in Regular Lattices

    PubMed Central

    Poza, David J.; Santos, José I.; Galán, José M.; López-Paredes, Adolfo

    2011-01-01

    The effect of spatial structure has been proved very relevant in repeated games. In this work we propose an agent based model where a fixed finite population of tagged agents play iteratively the Nash demand game in a regular lattice. The model extends the multiagent bargaining model by Axtell, Epstein and Young [1] modifying the assumption of global interaction. Each agent is endowed with a memory and plays the best reply against the opponent's most frequent demand. We focus our analysis on the transient dynamics of the system, studying by computer simulation the set of states in which the system spends a considerable fraction of the time. The results show that all the possible persistent regimes in the global interaction model can also be observed in this spatial version. We also find that the mesoscopic properties of the interaction networks that the spatial distribution induces in the model have a significant impact on the diffusion of strategies, and can lead to new persistent regimes different from those found in previous research. In particular, community structure in the intratype interaction networks may cause that communities reach different persistent regimes as a consequence of the hindering diffusion effect of fluctuating agents at their borders. PMID:21408019

  17. Development and evaluation of liquid embolic agents based on liquid crystalline material of glyceryl monooleate.

    PubMed

    Du, Ling-Ran; Lu, Xiao-Jing; Guan, Hai-Tao; Yang, Yong-Jie; Gu, Meng-Jie; Zheng, Zhuo-Zhao; Lv, Tian-Shi; Yan, Zi-Guang; Song, Li; Zou, Ying-Hua; Fu, Nai-Qi; Qi, Xian-Rong; Fan, Tian-Yuan

    2014-08-25

    A new type of liquid embolic agent based on the liquid crystalline material glyceryl monooleate (GMO) was developed and evaluated in this study. A ternary phase diagram of GMO, water and ethanol was constructed, and three isotropic liquids (ILs; GMO:ethanol:water = 49:21:30, 60:20:20 and 72:18:10 (w/w/w)) were selected as potential liquid embolic agents, which could spontaneously form a viscous gel cast on contact with water or physiological fluid. The ILs exhibited excellent microcatheter deliverability due to their low viscosity, and were shown to successfully block saline flow when tested in a device simulating embolization in vitro. The ILs also showed good cytocompatibility with the L929 mouse fibroblast cell line. Embolization of rabbit kidneys with the ILs was performed successfully under digital subtraction angiography (DSA) monitoring, and the embolic degree was affected by the initial formulation composition and the volume used. At the 5th week after embolization, DSA and computed tomography (CT) confirmed that the renal arteries embolized with IL did not recanalize during the follow-up period, and an obvious atrophy of the embolized kidney was observed. Therefore, the GMO-based liquid embolic agents proved feasible and effective for embolization, with potential use in clinical interventional embolization therapy. PMID:24858389

  18. Computers vs. wind tunnels for aerodynamic flow simulations

    NASA Technical Reports Server (NTRS)

    Chapman, D. R.; Mark, H.; Pirtle, M. W.

    1975-01-01

    It is pointed out that in other fields of computational physics, such as ballistics, celestial mechanics, and neutronics, computations have already displaced experiments as the principal means of obtaining dynamic simulations. In the case of aerodynamic investigations, the complexity of the computational work involved in solving the Navier-Stokes equations is the reason that such investigations currently rely mainly on wind-tunnel testing. However, because of inherent limitations of the wind-tunnel approach and economic considerations, it appears that at some time in the future aerodynamic studies will rely chiefly on computational flow data provided by the computer. Taking into account projected development trends, it is estimated that computers with the capabilities required for a solution of the complete viscous, time-dependent Navier-Stokes equations will be available in the mid-1980s.

  19. A computer simulation of aircraft evacuation with fire

    NASA Technical Reports Server (NTRS)

    Middleton, V. E.

    1983-01-01

    A computer simulation was developed to assess passenger survival during the post-crash evacuation of a transport category aircraft when fire is a major threat. The computer code, FIREVAC, computes individual passenger exit paths and times to exit, taking into account delays and congestion caused by the interaction among the passengers and changing cabin conditions. Simple models for the physiological effects of the toxic cabin atmosphere are included with provision for including more sophisticated models as they become available. Both wide-body and standard-body aircraft may be simulated. Passenger characteristics are assigned stochastically from experimentally derived distributions. Results of simulations of evacuation trials and hypothetical evacuations under fire conditions are presented.
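    The congestion effect described above, exit times lengthening as passengers queue, can be illustrated with a minimal queueing sketch. This is not the FIREVAC model: the round-robin exit assignment, the uniform walk times and the fixed per-passenger service time are all simplifying assumptions for the example.

```python
import random

def evacuate(arrival_times, n_exits, service_time):
    """Queueing sketch: each exit clears one passenger per service interval;
    a passenger leaves at the later of reaching the exit and the exit
    becoming free, so congestion emerges from agent interaction."""
    exit_free_at = [0.0] * n_exits
    exit_times = []
    for i, arrival in enumerate(sorted(arrival_times)):
        e = i % n_exits                      # simplistic round-robin assignment
        start = max(arrival, exit_free_at[e])
        done = start + service_time
        exit_free_at[e] = done
        exit_times.append(done)
    return exit_times

rng = random.Random(1)
# Seconds for each of 60 passengers to reach an exit, drawn stochastically
# (FIREVAC similarly assigns passenger characteristics from distributions).
walks = [rng.uniform(2.0, 15.0) for _ in range(60)]
times = evacuate(walks, n_exits=4, service_time=1.5)
print(max(times) > max(walks))  # True: queueing delays the last passengers
```

A fire scenario would add time-varying cabin conditions, e.g. degrading each passenger's speed or incapacitating them as toxic-atmosphere doses accumulate, which is where the physiological models plug in.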

  20. Computer simulation of two-phase flow in nuclear reactors

    SciTech Connect

    Wulff, W.

    1992-09-01

    Two-phase flow models dominate the economic resource requirements for development and use of computer codes for analyzing thermohydraulic transients in nuclear power plants. Six principles are presented on mathematical modeling and selection of numerical methods, along with suggestions on programming and machine selection, all aimed at reducing the cost of analysis. Computer simulation is contrasted with traditional computer calculation. The advantages of run-time interactive access operation in a simulation environment are demonstrated. It is explained that the drift-flux model is better suited for two-phase flow analysis in nuclear reactors than the two-fluid model, because of the latter's closure problem. The advantage of analytical over numerical integration is demonstrated. Modeling and programming techniques are presented which minimize the number of needed arithmetical and logical operations and thereby increase the simulation speed, while decreasing the cost.