Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2010-01-01
Purpose: Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods: A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results: The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion: A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933
Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2011-07-01
Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
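A minimal sketch of the decoupled-simulation idea described in this abstract, assuming a toy shared state (illustrative Python; not the authors' C++ framework): a haptics thread updates the model near 1 kHz while a viewer thread renders at a much lower rate.

```python
import threading
import time

class SharedModel:
    """Toy shared state touched by both the haptics and viewer loops."""
    def __init__(self):
        self.lock = threading.Lock()
        self.position = 0.0

    def step_haptics(self, dt):
        with self.lock:
            self.position += 0.001 * dt   # placeholder force/deformation update

    def snapshot(self):
        with self.lock:
            return self.position

def run_loop(fn, rate_hz, duration_s):
    """Call fn(dt) at approximately rate_hz for duration_s seconds."""
    period = 1.0 / rate_hz
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        t0 = time.monotonic()
        fn(period)
        time.sleep(max(0.0, period - (time.monotonic() - t0)))

model = SharedModel()
haptics = threading.Thread(target=run_loop, args=(model.step_haptics, 1000, 1.0))
viewer = threading.Thread(
    target=run_loop,
    args=(lambda dt: print(f"render at position {model.snapshot():.5f}"), 30, 1.0))
haptics.start(); viewer.start()
haptics.join(); viewer.join()
```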
Developing interprofessional health competencies in a virtual world
King, Sharla; Chodos, David; Stroulia, Eleni; Carbonaro, Mike; MacKenzie, Mark; Reid, Andrew; Torres, Lisa; Greidanus, Elaine
2012-01-01
Background: Virtual worlds provide a promising means of delivering simulations for developing interprofessional health skills. However, developing and implementing a virtual world simulation is a challenging process, in part because of the novelty of virtual worlds as a simulation platform and also because of the degree of collaboration required among technical and subject experts. Thus, it can be difficult to ensure that the simulation is both technically satisfactory and educationally appropriate. Methods: To address this challenge, we propose the use of de Freitas and Oliver's four-dimensional framework as a means of guiding the development process. We give an overview of the framework and describe how its principles can be applied to the development of virtual world simulations. Results: We present two virtual world simulation pilot projects that adopted this approach, and describe our development experience in these projects. We directly connect this experience to the four-dimensional framework, thus validating the framework's applicability to the projects and to the context of virtual world simulations in general. Conclusions: We present a series of recommendations for developing virtual world simulations for interprofessional health education. These recommendations are based on the four-dimensional framework and are also informed by our experience with the pilot projects. PMID:23195649
Haji, Faizal A; Da Silva, Celina; Daigle, Delton T; Dubrowski, Adam
2014-08-01
Presently, health care simulation research is largely conducted on a study-by-study basis. Although such "project-based" research generates a plethora of evidence, it can be chaotic and contradictory. A move toward sustained, thematic, theory-based programs of research is necessary to advance knowledge in the field. Recognizing that simulation is a complex intervention, we present a framework for developing research programs in simulation-based education adapted from the Medical Research Council (MRC) guidance. This framework calls for an iterative approach to developing, refining, evaluating, and implementing simulation interventions. The adapted framework guidance emphasizes: (1) identification of theory and existing evidence; (2) modeling and piloting interventions to clarify active ingredients and identify mechanisms linking the context, intervention, and outcomes; and (3) evaluation of intervention processes and outcomes in both the laboratory and real-world setting. The proposed framework will aid simulation researchers in developing more robust interventions that optimize simulation-based education and advance our understanding of simulation pedagogy.
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
The following reports are presented on this project: a first-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; a second-year progress report on Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.
Extending the FairRoot framework to allow for simulation and reconstruction of free streaming data
NASA Astrophysics Data System (ADS)
Al-Turany, M.; Klein, D.; Manafov, A.; Rybalchenko, A.; Uhlig, F.
2014-06-01
The FairRoot framework is the standard framework for simulation, reconstruction and data analysis for the FAIR experiments. The framework is designed to optimise accessibility for beginners and developers, to be flexible and to cope with future developments. FairRoot enhances the synergy between the different physics experiments. As a first step toward simulation of free streaming data, time-based simulation was introduced to the framework. The next step is the event source simulation, which is achieved via a client-server system. After digitization, the so-called "samplers" can be started; each sampler reads the data of the corresponding detector from the simulation files and makes it available to the reconstruction clients. The system makes it possible to develop and validate the online reconstruction algorithms. In this work, the design and implementation of the new architecture and the communication layer are described.
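A hedged sketch of the sampler/client idea in Python (the actual FairRoot system is C++, and the queue-based transport here is a hypothetical stand-in): a sampler process serves digitized detector data, and a reconstruction client consumes it.

```python
import multiprocessing as mp

def sampler(queue, n_events):
    # Stand-in for reading digitized data of one detector from simulation files.
    for event_id in range(n_events):
        queue.put({"event": event_id, "hits": [event_id * 0.1, event_id * 0.2]})
    queue.put(None)  # end-of-stream marker

def reconstruction_client(queue):
    while True:
        data = queue.get()
        if data is None:
            break
        # Placeholder for an online reconstruction algorithm under validation.
        track = sum(data["hits"])
        print(f"event {data['event']}: reconstructed value {track:.2f}")

if __name__ == "__main__":
    q = mp.Queue()
    procs = [mp.Process(target=sampler, args=(q, 5)),
             mp.Process(target=reconstruction_client, args=(q,))]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```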
Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando
2017-01-01
Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16), (2) training facility design using Distributed Simulation, (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally. This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.
A simulation framework for the CMS Track Trigger electronics
NASA Astrophysics Data System (ADS)
Amstutz, C.; Magazzù, G.; Weber, M.; Palla, F.
2015-03-01
A simulation framework has been developed to test and characterize algorithms, architectures and hardware implementations of the vastly complex CMS Track Trigger for the high luminosity upgrade of the CMS experiment at the Large Hadron Collider in Geneva. High-level SystemC models of all system components have been developed to simulate a portion of the track trigger. The simulation of the system components together with input data from physics simulations allows evaluating figures of merit, like delays or bandwidths, under realistic conditions. The use of SystemC for high-level modelling allows co-simulation with models developed in Hardware Description Languages, e.g. VHDL or Verilog. Therefore, the simulation framework can also be used as a test bench for digital modules developed for the final system.
Template-Based Geometric Simulation of Flexible Frameworks
Wells, Stephen A.; Sartbaeva, Asel
2012-01-01
Specialised modelling and simulation methods implementing simplified physical models are valuable generators of insight. Template-based geometric simulation is a specialised method for modelling flexible framework structures made up of rigid units. We review the background, development and implementation of the method, and its applications to the study of framework materials such as zeolites and perovskites. The “flexibility window” property of zeolite frameworks is a particularly significant discovery made using geometric simulation. Software implementing geometric simulation of framework materials, “GASP”, is freely available to researchers. PMID:28817055
Simulation Framework for Intelligent Transportation Systems
DOT National Transportation Integrated Search
1996-10-01
A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System. The simulator is designed for running on parallel computers and distributed (networked) computer systems, but ca...
Argonne Simulation Framework for Intelligent Transportation Systems
DOT National Transportation Integrated Search
1996-01-01
A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distribu...
Games and Simulations in Online Learning: Research and Development Frameworks
ERIC Educational Resources Information Center
Gibson, David; Aldrich, Clark; Prensky, Marc
2007-01-01
Games and Simulations in Online Learning: Research and Development Frameworks examines the potential of games and simulations in online learning, and how the future could look as developers learn to use the emerging capabilities of the Semantic Web. It presents a general understanding of how the Semantic Web will impact education and how games and…
A Modular Simulation Framework for Assessing Swarm Search Models
2014-09-01
Wanier, Blake M.
Numerical studies demonstrate the ability to leverage the developed simulation and analysis framework to investigate three canonical swarm search models ... as benchmarks for future exploration of more sophisticated swarm search scenarios. Subject terms: swarm search, search theory, modeling framework.
A Software Framework for Aircraft Simulation
NASA Technical Reports Server (NTRS)
Curlett, Brian P.
2008-01-01
The National Aeronautics and Space Administration Dryden Flight Research Center has a long history in developing simulations of experimental fixed-wing aircraft from gliders to suborbital vehicles on platforms ranging from desktop simulators to pilot-in-the-loop/aircraft-in-the-loop simulators. Regardless of the aircraft or simulator hardware, much of the software framework is common to all NASA Dryden simulators. Some of this software has withstood the test of time, but in recent years the push toward high-fidelity user-friendly simulations has resulted in some significant changes. This report presents an overview of the current NASA Dryden simulation software framework and capabilities with an emphasis on the new features that have permitted NASA to develop more capable simulations while maintaining the same staffing levels.
Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 2; Appendices
NASA Technical Reports Server (NTRS)
Murri, Daniel G.
2010-01-01
The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures that are required to enable higher mass robotic and human missions to Mars. The appendices to the original report are contained in this document.
Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis, Phase 2 Results
NASA Technical Reports Server (NTRS)
Murri, Daniel G.
2011-01-01
The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures that are required to enable human and higher mass robotic missions to Mars. The findings, observations, and recommendations from the NESC are provided in this report.
Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis. Volume 1
NASA Technical Reports Server (NTRS)
Murri, Daniel G.
2010-01-01
The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II (POST2) simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL Systems Analysis (EDL-SA) team, which is conducting studies of the technologies and architectures that are required to enable higher mass robotic and human missions to Mars. The findings of the assessment are contained in this report.
Onyx-Advanced Aeropropulsion Simulation Framework Created
NASA Technical Reports Server (NTRS)
Reed, John A.
2001-01-01
The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.
A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2005-07-01
The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 concerns validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.
A Framework to Design and Optimize Chemical Flooding Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2006-08-31
The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 concerns validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.
A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2004-11-01
The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 concerns validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.
The Unified Behavior Framework for the Simulation of Autonomous Agents
2015-03-01
Since the 1980s, researchers have designed a variety of robot control architectures intending to imbue robots with some degree of autonomy. ... The development of autonomy has ... room for research by utilizing methods like simulation and modeling that consume less time and fewer monetary resources. A recently developed reactive ...
Interpreting the NLN Jeffries Framework in the context of Nurse Educator preparation.
Young, Patricia K; Shellenbarger, Teresa
2012-08-01
The NLN Jeffries Framework describing simulation in nursing education has been used widely to guide construction of human patient simulation scenarios and serve as a theoretical framework for research on the use of simulation. This framework was developed with a focus on prelicensure nursing education. However, use of human patient simulation scenarios is also a way of providing practice experiences for graduate students learning the educator role. High-fidelity human patient simulation offers nurse educator faculty a unique opportunity to cultivate the practical knowledge of teaching in an interactive and dynamic environment. This article describes how the components of the NLN Jeffries Framework can help to guide simulation design for nurse educator preparation. Adapting the components of the framework (teacher, student, educational practices, design characteristics, and outcomes) helps to ensure that future faculty gain hands-on experience with nurse educator core competencies. Copyright 2012, SLACK Incorporated.
CyberMedVPS: visual programming for development of simulators.
Morais, Aline M; Machado, Liliane S
2011-01-01
Computer applications based on Virtual Reality (VR) have been outstanding in training and teaching in the medical field due to their ability to simulate realistic situations in which users can practice skills and decision making. However, the frameworks used to develop such simulators demand programming knowledge, which makes them hard for non-programmers to use. To address this problem, we present CyberMedVPS, a graphical module for the CyberMed framework that implements Visual Programming concepts to allow the development of simulators by non-programmer professionals of the medical field.
Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank
2006-04-01
This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.
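A sketch of the interfacing idea from this abstract, with invented names (the abstract does not give the actual API): each dynamic model exposes separate geometries for visualization, simulation, and interfacing with other models.

```python
from abc import ABC, abstractmethod

class DynamicModel(ABC):
    """Hypothetical interface for a dynamic model defined over a spatial domain."""

    @abstractmethod
    def step(self, dt): ...                 # advance internal dynamics

    @abstractmethod
    def simulation_geometry(self): ...      # e.g., volumetric mesh nodes

    @abstractmethod
    def visualization_geometry(self): ...   # e.g., surface for rendering

    @abstractmethod
    def interface_geometry(self): ...       # e.g., boundary shared with neighbors

class MassSpringOrgan(DynamicModel):
    def __init__(self):
        self.nodes = [0.0, 0.1, 0.2, 0.3]

    def step(self, dt):
        self.nodes = [x + 0.01 * dt for x in self.nodes]  # toy dynamics

    def simulation_geometry(self):
        return self.nodes

    def visualization_geometry(self):
        return self.nodes[::2]   # coarser representation for rendering

    def interface_geometry(self):
        return self.nodes[:1]    # boundary coupled to a neighboring model

organ = MassSpringOrgan()
organ.step(0.01)
print(organ.visualization_geometry())
```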
Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin
2017-01-01
There are various fantastic biological phenomena in biological pattern formation. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanism of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized, and the image features are extracted as the system feedback. Then, the unknown model parameters are obtained by comparing the image features of the simulation image and the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to fulfill pattern formation simulations for vascular mesenchymal cells and lung development. In the simulation framework, the spot, stripe, labyrinthine patterns of vascular mesenchymal cells, the normal branching pattern and the branching pattern lacking side branching for lung branching are obtained in a finite number of iterations. The simulation results indicate that it is easy to achieve the simulation targets, especially when the simulation patterns are sensitive to the model parameters. Moreover, this simulation framework can expand to other types of biological pattern formation. PMID:28225811
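The feedback loop itself can be illustrated with a toy stand-in for the reaction-diffusion solve (the pattern generator and image feature below are assumptions, not the paper's model): simulate a pattern from a parameter, extract an image feature, and correct the parameter from the feature error.

```python
import numpy as np

def simulate_pattern(k, size=64):
    """Stand-in for a reaction-diffusion solve: stripes whose spatial
    frequency is set by the unknown model parameter k."""
    x = np.linspace(0.0, 2.0 * np.pi, size)
    return np.sin(k * x)[None, :] * np.ones((size, 1))

def extract_feature(image):
    """Image feature used as system feedback: the dominant stripe frequency."""
    spectrum = np.abs(np.fft.rfft(image[0]))
    return float(np.argmax(spectrum[1:]) + 1)

target = extract_feature(simulate_pattern(k=5.0))   # feature of target pattern
k, gain = 1.0, 0.5
for iteration in range(1, 51):
    error = target - extract_feature(simulate_pattern(k))
    if abs(error) < 0.5:
        break
    k += gain * error                               # feedback parameter update
print(f"recovered k = {k:.2f} after {iteration} iterations")
```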
Cognitive simulators for medical education and training.
Kahol, Kanav; Vankipuram, Mithra; Smith, Marshall L
2009-08-01
Simulators for honing procedural skills (such as surgical skills and central venous catheter placement) have proven to be valuable tools for medical educators and students. While such simulations represent an effective paradigm in surgical education, there is an opportunity to add a layer of cognitive exercises to these basic simulations that can facilitate robust skill learning in residents. This paper describes a controlled methodology, inspired by neuropsychological assessment tasks and embodied cognition, to develop cognitive simulators for laparoscopic surgery. These simulators provide psychomotor skill training and offer the additional challenge of accomplishing cognitive tasks in realistic environments. A generic framework for design, development and evaluation of such simulators is described. The presented framework is generalizable and can be applied to different task domains. It is independent of the types of sensors, simulation environment and feedback mechanisms that the simulators use. A proof of concept of the framework is provided through developing a simulator that includes cognitive variations to a basic psychomotor task. The results of two pilot studies are presented that show the validity of the methodology in providing an effective evaluation and learning environments for surgeons.
Hierarchical control and performance evaluation of multi-vehicle autonomous systems
NASA Astrophysics Data System (ADS)
Balakirsky, Stephen; Scrapper, Chris; Messina, Elena
2005-05-01
This paper will describe how the Mobility Open Architecture Tools and Simulation (MOAST) framework can facilitate performance evaluations of RCS compliant multi-vehicle autonomous systems. This framework provides an environment that allows for simulated and real architectural components to function seamlessly together. By providing repeatable environmental conditions, this framework allows for the development of individual components as well as component performance metrics. MOAST is composed of high-fidelity and low-fidelity simulation systems, a detailed model of real-world terrain, actual hardware components, a central knowledge repository, and architectural glue to tie all of the components together. This paper will describe the framework's components in detail and provide an example that illustrates how the framework can be utilized to develop and evaluate a single architectural component through the use of repeatable trials and experimentation that includes both virtual and real components functioning together.
Introducing FNCS: Framework for Network Co-Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2014-10-23
This video provides a basic overview of the PNNL Future Power Grid Initiative-developed Framework for Network Co-Simulation (FNCS). It discusses the increasing amounts of data coming from the power grid, and the need for a tool like FNCS that brings together data, transmission and distribution simulators. Included is a description of the FNCS architecture, and the advantages this new open source tool can bring to grid research and development efforts.
Integrating software architectures for distributed simulations and simulation analysis communities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael
2005-10-01
The one-year Software Architecture LDRD (No.79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.
ERIC Educational Resources Information Center
Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen
2013-01-01
This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…
Death of a Simulated Pediatric Patient: Toward a More Robust Theoretical Framework.
McBride, Mary E; Schinasi, Dana Aronson; Moga, Michael Alice; Tripathy, Shreepada; Calhoun, Aaron
2017-12-01
A theoretical framework was recently proposed that encapsulates learner responses to simulated death due to action or inaction in the pediatric context. This framework, however, was developed at an institution that allows simulated death and thus does not address the experience of those centers at which this technique is not used. To address this, we performed a parallel qualitative study with the intent of augmenting the initial framework. We conducted focus groups with physicians and nurses who had experienced a simulated cardiac arrest, using a constructivist grounded theory approach. The participants were recruited via e-mail. Transcripts were analyzed by coders blinded to the original framework to generate a list of provisional themes that were iteratively refined. These themes were then compared with the themes from the original article and used to derive a consensus model that incorporated the most relevant features of each. Focus group data yielded 7 themes. Six were similar to those developed in the original framework. One important exception was noted, however: learners not exposed to patient death due to action or inaction often felt that the mannequin's survival was artificial. This additional theme was incorporated into a revised framework. The original framework addresses most aspects of learner reactions to simulated death. Our work suggests that adding the theme pertaining to the lack of realism that can be perceived when the mannequin is unexpectedly saved results in a more robust theoretical framework transferable to centers that do not allow mannequin death.
A geostatistical extreme-value framework for fast simulation of natural hazard events
Stephenson, David B.
2016-01-01
We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
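A sketch of the simulation recipe under stated assumptions (all parameter values below are invented; the actual framework fits generalized additive models for the marginal parameters): draw a spatially correlated Student-t field, map it through the t CDF to uniforms, then through the inverse generalized Pareto CDF for the margins.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sites = np.linspace(0.0, 500.0, 40)                 # 1-D transect of sites, km
dist = np.abs(sites[:, None] - sites[None, :])
corr = np.exp(-dist / 100.0)                        # assumed spatial correlation
nu = 5.0                                            # t-process degrees of freedom

# Multivariate Student-t draw: correlated Gaussian over sqrt(chi-square/nu).
L = np.linalg.cholesky(corr + 1e-10 * np.eye(len(sites)))
z = L @ rng.standard_normal(len(sites))
t_field = z / np.sqrt(rng.chisquare(nu) / nu)

# Map to generalized Pareto margins (shape/scale/location assumed here).
u = stats.t.cdf(t_field, df=nu)
gusts = stats.genpareto.ppf(u, c=0.1, loc=20.0, scale=8.0)   # wind gusts, m/s
print(gusts.round(1))
```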
A Framework for the Design of Computer-Assisted Simulation Training for Complex Police Situations
ERIC Educational Resources Information Center
Söderström, Tor; Åström, Jan; Anderson, Greg; Bowles, Ron
2014-01-01
Purpose: The purpose of this paper is to report progress concerning the design of a computer-assisted simulation training (CAST) platform for developing decision-making skills in police students. The overarching aim is to outline a theoretical framework for the design of CAST to facilitate police students' development of search techniques in…
Generic framework for mining cellular automata models on protein-folding simulations.
Diaz, N; Tischer, I
2016-05-13
Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed by a methodology based on design patterns that allow an improved experience for new algorithms development. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined.
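A toy illustration of metaheuristic CA model identification (hill-climbing over elementary one-dimensional CA rules as a stand-in for the paper's algorithms and its protein contact-map representation):

```python
import numpy as np

rng = np.random.default_rng(1)

def run_ca(rule, state, steps):
    """Evolve an elementary CA; rule is the Wolfram rule number (0-255)."""
    table = [(rule >> i) & 1 for i in range(8)]
    history = [state]
    for _ in range(steps):
        left, right = np.roll(state, 1), np.roll(state, -1)
        state = np.array([table[4 * l + 2 * c + r]
                          for l, c, r in zip(left, state, right)])
        history.append(state)
    return np.array(history)

init = rng.integers(0, 2, 64)
target = run_ca(110, init, 30)          # "observed" trajectory to reproduce

def fitness(rule):
    return float(np.mean(run_ca(rule, init, 30) == target))

best = int(rng.integers(0, 256))
best_fit = fitness(best)
for _ in range(500):                    # hill-climb by flipping one rule bit
    candidate = best ^ (1 << int(rng.integers(0, 8)))
    cand_fit = fitness(candidate)
    if cand_fit >= best_fit:
        best, best_fit = candidate, cand_fit
print(f"identified rule {best} with fitness {best_fit:.3f}")
```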
Rapid Prototyping of an Aircraft Model in an Object-Oriented Simulation
NASA Technical Reports Server (NTRS)
Kenney, P. Sean
2003-01-01
A team was created to participate in the Mars Scout Opportunity. Trade studies determined that an aircraft provided the best opportunity to complete the science objectives of the team. A high fidelity six degree of freedom flight simulation was required to provide credible evidence that the aircraft design fulfilled mission objectives and to support the aircraft design process by providing performance evaluations. The team created the simulation using the Langley Standard Real-Time Simulation in C++ (LaSRS++) application framework. A rapid prototyping approach was necessary because the team had only three months to both develop the aircraft simulation model and evaluate aircraft performance as the design and mission parameters matured. The design of LaSRS++ enabled rapid-prototyping in several ways. First, the framework allowed component models to be designed, implemented, unit-tested, and integrated quickly. Next, the framework provides a highly reusable infrastructure that allowed developers to maximize code reuse while concentrating on aircraft and mission specific features. Finally, the framework reduces risk by providing reusable components that allow developers to build a quality product with a compressed testing cycle that relies heavily on unit testing of new components.
Unified Simulation and Analysis Framework for Deep Space Navigation Design
NASA Technical Reports Server (NTRS)
Anzalone, Evan; Chuang, Jason; Olsen, Carrie
2013-01-01
As the technology that enables advanced deep space autonomous navigation continues to develop and the requirements for such capability continues to grow, there is a clear need for a modular expandable simulation framework. This tool's purpose is to address multiple measurement and information sources in order to capture system capability. This is needed to analyze the capability of competing navigation systems as well as to develop system requirements, in order to determine its effect on the sizing of the integrated vehicle. The development for such a framework is built upon Model-Based Systems Engineering techniques to capture the architecture of the navigation system and possible state measurements and observations to feed into the simulation implementation structure. These models also allow a common environment for the capture of an increasingly complex operational architecture, involving multiple spacecraft, ground stations, and communication networks. In order to address these architectural developments, a framework of agent-based modules is implemented to capture the independent operations of individual spacecraft as well as the network interactions amongst spacecraft. This paper describes the development of this framework, and the modeling processes used to capture a deep space navigation system. Additionally, a sample implementation describing a concept of network-based navigation utilizing digitally transmitted data packets is described in detail. This developed package shows the capability of the modeling framework, including its modularity, analysis capabilities, and its unification back to the overall system requirements and definition.
Reusable Component Model Development Approach for Parallel and Distributed Simulation
Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng
2014-01-01
Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diversiform interfaces, couple tightly, and bind with simulation platforms closely. As a result, they are difficult to be reused across different simulation platforms and applications. To address the problem, this paper first proposed a reusable component model framework. Based on this framework, then our reusable model development approach is elaborated, which contains two phases: (1) domain experts create simulation computational modules observing three principles to achieve their independence; (2) model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. The case study of a radar model indicates that the model developed using our approach has good reusability and it is easy to be used in different simulation platforms and applications. PMID:24729751
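A hedged sketch of the encapsulation step; the six service-interface names below are hypothetical placeholders, since the abstract does not enumerate them.

```python
from abc import ABC, abstractmethod

class ReusableComponentModel(ABC):
    """Wraps an independent computational module behind standard services."""

    @abstractmethod
    def initialize(self, config): ...      # service 1: set up the module
    @abstractmethod
    def receive(self, message): ...        # service 2: accept input data
    @abstractmethod
    def step(self, time): ...              # service 3: advance one time step
    @abstractmethod
    def publish(self): ...                 # service 4: expose output data
    @abstractmethod
    def save_state(self): ...              # service 5: checkpoint support
    @abstractmethod
    def finalize(self): ...                # service 6: tear down

class RadarModel(ReusableComponentModel):
    def initialize(self, config): self.range_km = config["range_km"]
    def receive(self, message): self.target_km = message
    def step(self, time): self.detected = self.target_km <= self.range_km
    def publish(self): return {"detected": self.detected}
    def save_state(self): return {"range_km": self.range_km}
    def finalize(self): pass

radar = RadarModel()
radar.initialize({"range_km": 150})
radar.receive(120)       # target at 120 km
radar.step(time=0.0)
print(radar.publish())   # {'detected': True}
```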
NASA Technical Reports Server (NTRS)
Fortenbaugh, R. L.
1980-01-01
Instructions for using Vertical Attitude Takeoff and Landing Aircraft Simulation (VATLAS), the digital simulation program for application to vertical attitude takeoff and landing (VATOL) aircraft developed for installation on the NASA Ames CDC 7600 computer system are described. The framework for VATLAS is the Off-Line Simulation (OLSIM) routine. The OLSIM routine provides a flexible framework and standardized modules which facilitate the development of off-line aircraft simulations. OLSIM runs under the control of VTOLTH, the main program, which calls the proper modules for executing user specified options. These options include trim, stability derivative calculation, time history generation, and various input-output options.
Computer Simulation of the VASIMR Engine
NASA Technical Reports Server (NTRS)
Garrison, David
2005-01-01
The goal of this project is to develop a magneto-hydrodynamic (MHD) computer code for simulation of the VASIMR engine. This code is designed be easy to modify and use. We achieve this using the Cactus framework, a system originally developed for research in numerical relativity. Since its release, Cactus has become an extremely powerful and flexible open source framework. The development of the code will be done in stages, starting with a basic fluid dynamic simulation and working towards a more complex MHD code. Once developed, this code can be used by students and researchers in order to further test and improve the VASIMR engine.
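A first-stage "basic fluid dynamic simulation" in the spirit described above, as a hedged illustration (a 1-D upwind advection step; not the project's Cactus-based code):

```python
import numpy as np

nx = 200
dx = 1.0 / nx
dt, c = 0.002, 1.0                                 # time step and wave speed
x = np.linspace(0.0, 1.0, nx)
u = np.exp(-200.0 * (x - 0.3) ** 2)                # initial density pulse

for _ in range(100):
    u[1:] -= c * dt / dx * (u[1:] - u[:-1])        # first-order upwind step
    u[0] = u[-1]                                   # crude periodic boundary

print("pulse peak now near x =", round(float(x[np.argmax(u)]), 2))
```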
NASA Astrophysics Data System (ADS)
Konnik, Mikhail V.; Welsh, James
2012-09-01
Numerical simulators for adaptive optics systems have become an essential tool for the research and development of future advanced astronomical instruments. However, the growing software code of a numerical simulator makes it difficult to continue to support the code itself. The problem of adequate documentation of astronomical software for adaptive optics simulators may complicate development, since the documentation must contain up-to-date schemes and mathematical descriptions implemented in the software code. Although most modern programming environments like MATLAB or Octave have built-in documentation abilities, they are often insufficient for the description of a typical adaptive optics simulator code. This paper describes a general cross-platform framework for the documentation of scientific software using open-source tools such as LaTeX, Mercurial, Doxygen, and Perl. Using a Perl script that translates MATLAB M-file comments into C-like comments, one can use Doxygen to generate and update the documentation for the scientific source code. The documentation generated by this framework contains the current code description with mathematical formulas, images, and bibliographical references. A detailed description of the framework components is presented as well as guidelines for the framework deployment. Examples of the code documentation for the scripts and functions of a MATLAB-based adaptive optics simulator are provided.
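The comment-translation step can be illustrated as follows (the framework described above uses a Perl script; this Python sketch shows the same idea and is not that script): MATLAB "%" comment lines become C-style "///" lines that Doxygen can parse.

```python
import textwrap

def matlab_to_doxygen(m_source: str) -> str:
    out = []
    for line in m_source.splitlines():
        stripped = line.lstrip()
        if stripped.startswith("%"):
            out.append("/// " + stripped.lstrip("%").strip())  # doc comment
        elif stripped.startswith("function"):
            out.append("// " + stripped)  # keep the signature for context
        # executable MATLAB lines are dropped; only the documentation survives
    return "\n".join(out)

example = textwrap.dedent("""\
    % ZERNIKE  Compute Zernike modes for the wavefront sensor.
    %   Z = ZERNIKE(N) returns the first N modes.
    function Z = zernike(N)
    Z = rand(N);
    """)
print(matlab_to_doxygen(example))
```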
NEVESIM: event-driven neural simulation framework with a Python interface
Pecevski, Dejan; Kappel, David; Jonke, Zeno
2014-01-01
NEVESIM is a software package for event-driven simulation of networks of spiking neurons with a fast simulation core in C++, and a scripting user interface in the Python programming language. It supports simulation of heterogeneous networks with different types of neurons and synapses, and can be easily extended by the user with new neuron and synapse types. To enable heterogeneous networks and extensibility, NEVESIM is designed to decouple the simulation logic of communicating events (spikes) between the neurons at a network level from the implementation of the internal dynamics of individual neurons. In this paper we will present the simulation framework of NEVESIM, its concepts and features, as well as some aspects of the object-oriented design approaches and simulation strategies that were utilized to efficiently implement the concepts and functionalities of the framework. We will also give an overview of the Python user interface, its basic commands and constructs, and also discuss the benefits of integrating NEVESIM with Python. One of the valuable capabilities of the simulator is to simulate exactly and efficiently networks of stochastic spiking neurons from the recently developed theoretical framework of neural sampling. This functionality was implemented as an extension on top of the basic NEVESIM framework. Altogether, the intended purpose of the NEVESIM framework is to provide a basis for further extensions that support simulation of various neural network models incorporating different neuron and synapse types that can potentially also use different simulation strategies. PMID:25177291
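The event-driven paradigm itself can be shown with a generic didactic sketch (this is not NEVESIM's actual API): spikes are events in a priority queue, and neurons are updated only when an event reaches them.

```python
import heapq

class IFNeuron:
    """Integrate-and-fire neuron (no leak, for brevity)."""
    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.potential = 0.0

    def receive(self, weight):
        self.potential += weight
        if self.potential >= self.threshold:
            self.potential = 0.0
            return True          # neuron fires
        return False

neurons = {i: IFNeuron() for i in range(3)}
# Synapses: neuron id -> list of (target id, weight, delay in ms).
synapses = {0: [(1, 1.2, 1.0)], 1: [(2, 1.2, 1.0)], 2: []}

# Event queue of (time, target neuron, weight); two external input spikes.
events = [(0.0, 0, 0.7), (0.5, 0, 0.7)]
heapq.heapify(events)
while events:
    t, nid, w = heapq.heappop(events)
    if neurons[nid].receive(w):
        print(f"t={t:.1f} ms: neuron {nid} spikes")
        for target, weight, delay in synapses[nid]:
            heapq.heappush(events, (t + delay, target, weight))
```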
A Simulation and Modeling Framework for Space Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olivier, S S
This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
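One of the listed components, orbital propagation, can be illustrated with a minimal two-body integrator (a toy example under assumed initial conditions, not the framework's code):

```python
import numpy as np

MU = 398600.4418                       # Earth's GM, km^3/s^2

def two_body(state):
    r, v = state[:3], state[3:]
    a = -MU * r / np.linalg.norm(r) ** 3
    return np.concatenate([v, a])

def rk4_step(state, dt):
    k1 = two_body(state)
    k2 = two_body(state + 0.5 * dt * k1)
    k3 = two_body(state + 0.5 * dt * k2)
    k4 = two_body(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Debris-like object in a circular low Earth orbit: position km, velocity km/s.
state = np.array([6778.0, 0.0, 0.0, 0.0, 7.6687, 0.0])
for _ in range(5553):                  # roughly one orbital period at 1 s steps
    state = rk4_step(state, 1.0)
print("position after ~1 orbit (km):", state[:3].round(1))
```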
Sawyer, Taylor; White, Marjorie; Zaveri, Pavan; Chang, Todd; Ades, Anne; French, Heather; Anderson, JoDee; Auerbach, Marc; Johnston, Lindsay; Kessler, David
2015-08-01
Acquisition of competency in procedural skills is a fundamental goal of medical training. In this Perspective, the authors propose an evidence-based pedagogical framework for procedural skill training. The framework was developed based on a review of the literature using a critical synthesis approach and builds on earlier models of procedural skill training in medicine. The authors begin by describing the fundamentals of procedural skill development. Then, a six-step pedagogical framework for procedural skills training is presented: Learn, See, Practice, Prove, Do, and Maintain. In this framework, procedural skill training begins with the learner acquiring requisite cognitive knowledge through didactic education (Learn) and observation of the procedure (See). The learner then progresses to the stage of psychomotor skill acquisition and is allowed to deliberately practice the procedure on a simulator (Practice). Simulation-based mastery learning is employed to allow the trainee to prove competency prior to performing the procedure on a patient (Prove). Once competency is demonstrated on a simulator, the trainee is allowed to perform the procedure on patients with direct supervision, until he or she can be entrusted to perform the procedure independently (Do). Maintenance of the skill is ensured through continued clinical practice, supplemented by simulation-based training as needed (Maintain). Evidence in support of each component of the framework is presented. Implementation of the proposed framework presents a paradigm shift in procedural skill training. However, the authors believe that adoption of the framework will improve procedural skill training and patient safety.
Building occupancy simulation and data assimilation using a graph-based agent-oriented model
NASA Astrophysics Data System (ADS)
Rai, Sanish; Hu, Xiaolin
2018-07-01
Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and an algorithm for data assimilation that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. The agent-based models suffer high computation cost for simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo Methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are to provide an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimations of building occupancy from sensor data.
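A sketch of the Sequential Monte Carlo assimilation step under simple assumptions (a bootstrap particle filter over per-zone occupant counts with an assumed Poisson sensor model; not the paper's full graph-based agent-oriented model):

```python
import numpy as np

rng = np.random.default_rng(2)
n_particles, n_zones = 500, 4
true_counts = np.array([10, 3, 5, 2])

# Each particle is one hypothesis of the building's occupancy distribution.
particles = rng.integers(0, 15, size=(n_particles, n_zones))

for step in range(10):
    # Predict: occupants drift between zones (toy transition model).
    moves = rng.integers(-1, 2, size=particles.shape)
    particles = np.clip(particles + moves, 0, None)

    # Update: weight particles by noisy per-zone sensor counts.
    observation = rng.poisson(true_counts)
    log_w = np.sum(observation * np.log(particles + 0.5)
                   - (particles + 0.5), axis=1)     # Poisson log-likelihood
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    # Resample particles in proportion to their weights.
    idx = rng.choice(n_particles, size=n_particles, p=w)
    particles = particles[idx]

print("estimated occupancy per zone:", particles.mean(axis=0).round(1))
```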
A framework for service enterprise workflow simulation with multi-agents cooperation
NASA Astrophysics Data System (ADS)
Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun
2013-11-01
Process dynamic modelling for service business is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used to analyse service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor for decision-making strategies, a flexible scheduling for activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.
The OSSE Framework at the NASA Global Modeling and Assimilation Office (GMAO)
NASA Astrophysics Data System (ADS)
Moradi, I.; Prive, N.; McCarty, W.; Errico, R. M.; Gelaro, R.
2017-12-01
This abstract summarizes the OSSE framework developed at the Global Modeling and Assimilation Office at the National Aeronautics and Space Administration (NASA/GMAO). Some of the OSSE techniques developed at GMAO, including the simulation of realistic observations (e.g., adding errors to simulated observations), are now widely used by the community to evaluate the impact of new observations on weather forecasts. This talk presents some of the recent progress and challenges in simulating realistic observations, radiative transfer modeling support for the GMAO OSSE activities, assimilation of OSSE observations into data assimilation systems, and evaluation of the impact of simulated observations on forecast skill.
The OSSE Framework at the NASA Global Modeling and Assimilation Office (GMAO)
NASA Technical Reports Server (NTRS)
Moradi, Isaac; Prive, Nikki; McCarty, Will; Errico, Ronald M.; Gelaro, Ron
2017-01-01
This abstract summarizes the OSSE framework developed at the Global Modeling and Assimilation Office at the National Aeronautics and Space Administration (NASA/GMAO). Some of the OSSE techniques developed at GMAO, including the simulation of realistic observations (e.g., adding errors to simulated observations), are now widely used by the community to evaluate the impact of new observations on weather forecasts. This talk presents some of the recent progress and challenges in simulating realistic observations, radiative transfer modeling support for the GMAO OSSE activities, assimilation of OSSE observations into data assimilation systems, and evaluation of the impact of simulated observations on forecast skill.
XIMPOL: a new x-ray polarimetry observation-simulation and analysis framework
NASA Astrophysics Data System (ADS)
Omodei, Nicola; Baldini, Luca; Pesce-Rollins, Melissa; di Lalla, Niccolò
2017-08-01
We present a new simulation framework, XIMPOL, based on the Python programming language and the SciPy stack, specifically developed for X-ray polarimetric applications. XIMPOL is not tied to any specific mission or instrument design and is meant to produce fast and yet realistic observation-simulations, given as basic inputs: (i) an arbitrary source model including morphological, temporal, spectral and polarimetric information, and (ii) the response functions of the detector under study, i.e., the effective area, the energy dispersion, the point-spread function and the modulation factor. The format of the response files is OGIP compliant, and the framework has the capability of producing output files that can be directly fed into the standard visualization and analysis tools used by the X-ray community, including XSPEC, which makes it a useful tool not only for simulating physical systems, but also for developing and testing end-to-end analysis chains.
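To make the role of the modulation factor concrete, here is a small sketch, assuming a rejection-sampling approach, of how azimuthal photo-electron angles could be drawn from the standard polarimetric modulation curve; it is illustrative Python in the spirit of XIMPOL, not the package's actual API.

```python
import numpy as np

def sample_azimuth(n, mu, pol_deg, pol_angle, rng=None):
    """Draw photo-electron azimuthal angles from the standard modulation curve
    f(phi) ∝ 1 + mu * P * cos(2 (phi - phi0)) by rejection sampling."""
    rng = rng or np.random.default_rng(42)
    out = np.empty(0)
    fmax = 1.0 + mu * pol_deg  # upper bound of the (unnormalized) curve
    while out.size < n:
        phi = rng.uniform(0.0, 2.0 * np.pi, size=2 * n)
        f = 1.0 + mu * pol_deg * np.cos(2.0 * (phi - pol_angle))
        out = np.concatenate([out, phi[rng.uniform(0, fmax, size=phi.size) < f]])
    return out[:n]

# Example: modulation factor 0.3, 50% polarized source at 45 degrees.
phi = sample_azimuth(100_000, mu=0.3, pol_deg=0.5, pol_angle=np.pi / 4)
```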
Symphony: A Framework for Accurate and Holistic WSN Simulation
Riliskis, Laurynas; Osipov, Evgeny
2015-01-01
Research on wireless sensor networks has progressed rapidly over the last decade, and these technologies have been widely adopted for both industrial and domestic uses. Several operating systems have been developed, along with a multitude of network protocols for all layers of the communication stack. Industrial Wireless Sensor Network (WSN) systems must satisfy strict criteria and are typically more complex and larger in scale than domestic systems. Together with the non-deterministic behavior of network hardware in real settings, this greatly complicates the debugging and testing of WSN functionality. To facilitate the testing, validation, and debugging of large-scale WSN systems, we have developed a simulation framework that accurately reproduces the processes that occur inside real equipment, including both hardware- and software-induced delays. The core of the framework consists of a virtualized operating system and an emulated hardware platform that is integrated with the general purpose network simulator ns-3. Our framework enables the user to adjust the real code base as would be done in real deployments and also to test the boundary effects of different hardware components on the performance of distributed applications and protocols. Additionally we have developed a clock emulator with several different skew models and a component that handles sensory data feeds. The new framework should substantially shorten WSN application development cycles. PMID:25723144
An Object-Oriented Finite Element Framework for Multiphysics Phase Field Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael R Tonks; Derek R Gaston; Paul C Millett
2012-01-01
The phase field approach is a powerful and popular method for modeling microstructure evolution. In this work, advanced numerical tools are used to create a phase field framework that facilitates rapid model development. This framework, called MARMOT, is based on Idaho National Laboratory's finite element Multiphysics Object-Oriented Simulation Environment. In MARMOT, the system of phase field partial differential equations (PDEs) is solved simultaneously with PDEs describing additional physics, such as solid mechanics and heat conduction, using the Jacobian-Free Newton-Krylov method. An object-oriented architecture is created by taking advantage of commonalities in phase field models to facilitate development of new models with very little written code. In addition, MARMOT provides access to mesh and time step adaptivity, reducing the cost of performing simulations with large disparities in both spatial and temporal scales. In this work, phase separation simulations are used to show the numerical performance of MARMOT. Deformation-induced grain growth and void growth simulations are included to demonstrate the multiphysics capability.
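A minimal sketch of the Jacobian-Free Newton-Krylov idea mentioned above, using SciPy's GMRES and a finite-difference Jacobian-vector product, follows; the residual callable is a hypothetical stand-in for an assembled multiphysics residual, and this is not MARMOT/MOOSE code.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jfnk(residual, u0, tol=1e-8, max_newton=25, eps=1e-7):
    """Jacobian-free Newton-Krylov: each Newton step solves J(u) du = -F(u)
    with GMRES, approximating J(u) v by a finite difference of the residual,
    so the Jacobian matrix is never formed explicitly."""
    u = u0.copy()
    for _ in range(max_newton):
        F = residual(u)
        if np.linalg.norm(F) < tol:
            break
        def jv(v):  # matrix-free Jacobian-vector product
            return (residual(u + eps * v) - F) / eps
        J = LinearOperator((u.size, u.size), matvec=jv)
        du, _ = gmres(J, -F)   # inner Krylov solve
        u = u + du             # Newton update
    return u
```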
A Computational Framework for Efficient Low Temperature Plasma Simulations
NASA Astrophysics Data System (ADS)
Verma, Abhishek Kumar; Venkattraman, Ayyaswamy
2016-10-01
Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTP. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested with numerical results assessing the accuracy and efficiency of benchmark problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David; Agarwal, Deborah A.; Sun, Xin
2011-09-01
The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, D.; Agarwal, D.; Sun, X.
2011-01-01
The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.
Development, Validation, and Application of OSSEs at NASA/GMAO
NASA Technical Reports Server (NTRS)
Errico, Ronald; Prive, Nikki
2015-01-01
During the past several years, NASA Goddard's Global Modeling and Assimilation Office (GMAO) has been developing a framework for conducting Observing System Simulation Experiments (OSSEs). The motivation and design of that framework will be described and a sample of validation results presented. Fundamental issues will be highlighted, particularly the critical importance of appropriately simulating system errors. Some problems that have just arisen in the newest experimental system will also be mentioned.
Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...
2014-12-31
Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
LDRD project final report : hybrid AI/cognitive tactical behavior framework for LVC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Djordjevich, Donna D.; Xavier, Patrick Gordon; Brannon, Nathan Gregory
This Lab-Directed Research and Development (LDRD) project sought to develop technology that enhances scenario construction speed, entity behavior robustness, and scalability in Live-Virtual-Constructive (LVC) simulation. We investigated issues in both simulation architecture and behavior modeling. We developed path-planning technology that improves the ability to express intent in the planning task while still permitting an efficient search algorithm. An LVC simulation demonstrated how this enables 'one-click' layout of squad tactical paths, as well as dynamic re-planning for simulated squads and for real and simulated mobile robots. We identified human response latencies that can be exploited in parallel/distributed architectures. We did an experimental study to determine where parallelization would be productive in Umbra-based force-on-force (FOF) simulations. We developed and implemented a data-driven simulation composition approach that solves entity class hierarchy issues and supports assurance of simulation fairness. Finally, we proposed a flexible framework to enable integration of multiple behavior modeling components that model working memory phenomena with different degrees of sophistication.
iTOUGH2: A multiphysics simulation-optimization framework for analyzing subsurface systems
NASA Astrophysics Data System (ADS)
Finsterle, S.; Commer, M.; Edmiston, J. K.; Jung, Y.; Kowalsky, M. B.; Pau, G. S. H.; Wainwright, H. M.; Zhang, Y.
2017-11-01
iTOUGH2 is a simulation-optimization framework for the TOUGH suite of nonisothermal multiphase flow models and related simulators of geophysical, geochemical, and geomechanical processes. After appropriate parameterization of subsurface structures and their properties, iTOUGH2 runs simulations for multiple parameter sets and analyzes the resulting output for parameter estimation through automatic model calibration, local and global sensitivity analyses, data-worth analyses, and uncertainty propagation analyses. Development of iTOUGH2 is driven by scientific challenges and user needs, with new capabilities continually added to both the forward simulator and the optimization framework. This review article provides a summary description of methods and features implemented in iTOUGH2, and discusses the usefulness and limitations of an integrated simulation-optimization workflow in support of the characterization and analysis of complex multiphysics subsurface systems.
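As a sketch of the simulation-optimization loop, the following Python snippet pairs a placeholder forward model with least-squares calibration; the forward function, its parameters, and the synthetic data are hypothetical and merely stand in for a TOUGH forward run driven by iTOUGH2's inverse-modeling machinery.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical forward model standing in for a TOUGH forward run:
# two subsurface parameters map to a simulated pressure response over time.
def forward(params, times):
    log_k, phi = params
    return np.exp(log_k) * np.sqrt(times) / phi  # placeholder physics

# Synthetic "observed" data from a known parameter set plus noise.
t_obs = np.linspace(1.0, 10.0, 20)
p_obs = forward([-2.0, 0.15], t_obs) + np.random.default_rng(1).normal(0, 0.05, 20)

# Automatic model calibration: minimize residuals between the simulated
# and observed system response, as in an inverse-modeling workflow.
fit = least_squares(lambda p: forward(p, t_obs) - p_obs, x0=[-1.0, 0.30])
print(fit.x)  # estimated parameters
```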
Gas turbine system simulation: An object-oriented approach
NASA Technical Reports Server (NTRS)
Drummond, Colin K.; Follen, Gregory J.; Putt, Charles W.
1993-01-01
A prototype gas turbine engine simulation has been developed that offers a generalized framework for the simulation of engines subject to steady-state and transient operating conditions. The prototype is in preliminary form, but it successfully demonstrates the viability of an object-oriented approach for generalized simulation applications. Although object-oriented programming languages are, relative to FORTRAN, somewhat austere, it is proposed that gas turbine simulations of an interdisciplinary nature will benefit significantly in terms of code reliability, maintainability, and manageability. This report elucidates specific gas turbine simulation obstacles that an object-oriented framework can overcome and describes the opportunity for interdisciplinary simulation that the approach offers.
NASA Astrophysics Data System (ADS)
Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang
2010-11-01
This paper presents a configurable distributed high performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to adapt multiple algorithms, helping to decrease simulation time at low expense. Imaging simulation for a TDI-CCD mounted on a satellite contains four processes: 1) degradation due to the atmosphere, 2) degradation due to the optical system, 3) degradation and re-sampling due to the electronic system of the TDI-CCD, and 4) data integration. Processes 1) to 3) utilize diverse data-intensive algorithms such as FFT, convolution and Lagrange interpolation, which require powerful CPUs. Even using an Intel Xeon X5550 processor, the regular serial processing method takes more than 30 hours for a simulation whose result image size is 1500 * 1462. From a literature study, there is no mature distributed HPC framework in this field. Here we developed a distributed computing framework for TDI-CCD imaging simulation, based on WCF [1], which uses a client/server (C/S) architecture and invokes free CPU resources in the LAN. The server pushes the tasks of processes 1) to 3) to this free computing capacity, ultimately delivering HPC at low cost. In a computing experiment with 4 symmetric nodes and 1 server, this framework reduced simulation time by about 74%. Adding more asymmetric nodes to the computing network decreased the time accordingly. In conclusion, this framework can provide essentially unlimited computation capacity provided that the network and task management server are affordable, and it offers a brand new HPC solution for TDI-CCD imaging simulation and similar applications.
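A rough Python analogue of the task-distribution idea (the original framework used WCF on .NET) is sketched below: splitting a data-intensive convolution across worker processes; strip-boundary edge effects are ignored for brevity, and the function names are illustrative.

```python
import numpy as np
from multiprocessing import Pool
from scipy.signal import fftconvolve

def degrade_tile(args):
    """Apply one optics/atmosphere PSF to an image tile, the kind of
    data-intensive task the server farms out to idle LAN nodes."""
    tile, psf = args
    return fftconvolve(tile, psf, mode="same")

def simulate_parallel(image, psf, n_workers=4):
    """Split rows into one strip per worker, convolve in parallel, reassemble.
    (A production version would overlap strips to avoid seam artifacts.)"""
    strips = np.array_split(image, n_workers, axis=0)
    with Pool(n_workers) as pool:
        return np.vstack(pool.map(degrade_tile, [(s, psf) for s in strips]))
```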
NASA Astrophysics Data System (ADS)
Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.
2014-12-01
The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
A variety of artificial intelligence techniques that could be applied to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n-link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user-defined configuration.
Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments.
Rincon, J A; Poza-Lujan, Jose-Luis; Julian, V; Posadas-Yagüe, Juan-Luis; Carrascosa, C
2016-01-01
This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a real and a virtual world coupled, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system.
Extending MAM5 Meta-Model and JaCalIVE Framework to Integrate Smart Devices from Real Environments
2016-01-01
This paper presents the extension of a meta-model (MAM5) and a framework based on the model (JaCalIVE) for developing intelligent virtual environments. The goal of this extension is to develop augmented mirror worlds that represent a real and a virtual world coupled, so that the virtual world not only reflects the real one, but also complements it. A new component called a smart resource artifact, which enables modelling and developing devices to access the real physical world, and a human-in-the-loop agent to place a human in the system have been included in the meta-model and framework. The proposed extension of MAM5 has been tested by simulating a light control system where agents can access both virtual and real sensors/actuators through the smart resources developed. The results show that the use of real-environment interactive elements (smart resource artifacts) in agent-based simulations makes it possible to minimize the error between the simulated and the real system. PMID:26926691
A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care
Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis
2017-01-01
This article describes a framework that has been developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to the incorporation of the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan, and Immersion Medical. All 3 source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods and how they can be integrated, and presents a framework that integrates the previous methods. In addition, it describes the concept and the application of the developed framework. As a test of the applicability of the framework, a real case study has been used to demonstrate its application. This case study provides real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values, but on the other hand, there are several limitations that have been discussed and need to be taken into consideration. PMID:28133988
A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care.
Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis
2017-01-01
This article describes a framework that has been developed to monetize the real value of simulation-based training in health care. Significant consideration has been given to the incorporation of the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan, and Immersion Medical. All 3 source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods and how they can be integrated, and presents a framework that integrates the previous methods. In addition, it describes the concept and the application of the developed framework. As a test of the applicability of the framework, a real case study has been used to demonstrate its application. This case study provides real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values, but on the other hand, there are several limitations that have been discussed and need to be taken into consideration.
DOT National Transportation Integrated Search
2004-12-01
An integrated framework for addressing container transportation issues in the Northeast US is developed and illustrated. The framework involves the extension of a spatial-economic coastal container port and related multimodal demand simulation model ...
Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah
Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a scaling study that compares instrumented ROSS simulations with their noninstrumented counterparts in order to determine the amount of perturbation when running at different simulation scales.
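The following is a toy Python sketch, not ROSS code, of the rollback bookkeeping such instrumentation collects: a logical process advances its local virtual time optimistically and counts a rollback whenever a straggler event arrives in its past.

```python
from dataclasses import dataclass, field

@dataclass
class LogicalProcess:
    """Minimal optimistic LP: processes events speculatively and records the
    per-LP rollback counts that a tuning tool would visualize."""
    lvt: float = 0.0                      # local virtual time
    processed: list = field(default_factory=list)
    rollbacks: int = 0

    def handle(self, ev_time):
        if ev_time < self.lvt:            # straggler: event in the LP's past
            self.rollbacks += 1
            # Undo speculatively processed events newer than the straggler.
            self.processed = [t for t in self.processed if t <= ev_time]
            self.lvt = ev_time
        self.processed.append(ev_time)
        self.lvt = max(self.lvt, ev_time)

lp = LogicalProcess()
for t in [1.0, 3.0, 2.0, 5.0]:            # 2.0 arrives late -> one rollback
    lp.handle(t)
print(lp.rollbacks)                       # 1
```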
A framework to simulate small shallow inland water bodies in semi-arid regions
NASA Astrophysics Data System (ADS)
Abbasi, Ali; Ohene Annor, Frank; van de Giesen, Nick
2017-12-01
In this study, a framework for simulating the flow field and heat transfer processes in small shallow inland water bodies has been developed. As the dynamics and thermal structure of these water bodies are crucial both for studying the quality of stored water and for assessing the heat fluxes from their surfaces, heat transfer and temperature were simulated. The proposed model is able to simulate the full 3-D water flow and heat transfer in the water body by applying complex and time-varying boundary conditions. In this model, the continuity, momentum and temperature equations are solved together with the turbulence equations, which incorporate the buoyancy effect. The model is built on the Reynolds Averaged Navier-Stokes (RANS) equations with the widely used Boussinesq approach to resolve the turbulence of the flow field. Micrometeorological data were obtained from an Automatic Weather Station (AWS) installed on the site and combined with field bathymetric measurements for the model. In the framework developed, a simple, applicable and generalizable approach is proposed for preparing the geometry of small shallow water bodies from coarsely measured bathymetry. All parts of the framework are based on open-source tools, which is essential for developing countries.
Optimal Wastewater Loading under Conflicting Goals and Technology Limitations in a Riverine System.
Rafiee, Mojtaba; Lyon, Steve W; Zahraie, Banafsheh; Destouni, Georgia; Jaafarzadeh, Nemat
2017-03-01
This paper investigates a novel simulation-optimization (S-O) framework for identifying optimal treatment levels and treatment processes for multiple wastewater dischargers to rivers. A commonly used water quality simulation model, Qual2K, was linked to a Genetic Algorithm optimization model for exploration of relevant fuzzy objective-function formulations for addressing imprecision and conflicting goals of pollution control agencies and various dischargers. Results showed a dynamic flow dependence of optimal wastewater loading with good convergence to a near-global optimum. Explicit considerations of real-world technological limitations, which were developed here in a new S-O framework, led to better compromise solutions between conflicting goals than those identified within traditional S-O frameworks. The newly developed framework, in addition to being more technologically realistic, is also less complicated and converges on solutions more rapidly than traditional frameworks. This technique marks a significant step forward in the development of holistic, riverscape-based approaches that balance the conflicting needs of stakeholders.
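A minimal sketch of such an S-O loop follows, with a placeholder do_sag function standing in for a Qual2K run and a max-min fuzzy membership objective balancing the agency's water quality goal against the dischargers' treatment cost; the numbers and the simple mutation-only GA are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def do_sag(removals):
    """Stand-in for a water quality simulation run: returns minimum dissolved
    oxygen (mg/L) downstream given each discharger's fractional removal."""
    return 4.0 + 4.0 * removals.mean() - 0.5 * removals.std()

def fuzzy_fitness(removals):
    # Fuzzy goals: the agency wants high DO, dischargers want low treatment.
    do_ok = np.clip((do_sag(removals) - 4.0) / 4.0, 0, 1)  # water quality goal
    cheap = np.clip(1.0 - removals, 0, 1).min()            # treatment cost goal
    return min(do_ok, cheap)     # max-min compromise between conflicting goals

pop = rng.uniform(0.3, 0.9, size=(40, 5))    # 40 candidates, 5 dischargers
for _ in range(100):                          # simple (mu + lambda) GA
    kids = np.clip(pop + rng.normal(0, 0.05, pop.shape), 0, 1)
    both = np.vstack([pop, kids])
    pop = both[np.argsort([-fuzzy_fitness(x) for x in both])][:40]
print(pop[0], fuzzy_fitness(pop[0]))          # best treatment-level vector
```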
Numerical modeling of the fracture process in a three-unit all-ceramic fixed partial denture.
Kou, Wen; Kou, Shaoquan; Liu, Hongyuan; Sjögren, Göran
2007-08-01
The main objectives were to examine the fracture mechanism and process of a ceramic fixed partial denture (FPD) framework under simulated mechanical loading using a recently developed numerical modeling code, the R-T(2D) code, and to evaluate the suitability of the R-T(2D) code as a tool for this purpose. Using the recently developed R-T(2D) code, the fracture mechanism and process of a three-unit (3U) yttria-tetragonal zirconia polycrystal ceramic (Y-TZP) FPD framework were simulated under static loading. In addition, the fracture pattern obtained using the numerical simulation was compared with the fracture pattern obtained in a previous laboratory test. The results revealed that the framework fracture pattern obtained using the numerical simulation agreed with that observed in the previous laboratory test. Quasi-photoelastic stress fringe patterns and acoustic emission showed that the fracture mechanism was tensile failure and that the crack started at the lower boundary of the framework. The fracture process could be followed both step-by-step and step-in-step. Based on the findings in the current study, the R-T(2D) code seems suitable for use as a complement to other tests and clinical observations in studying stress distribution, fracture mechanisms and fracture processes in ceramic FPD frameworks.
Physically Based Modeling and Simulation with Dynamic Spherical Volumetric Simplex Splines
Tan, Yunhao; Hua, Jing; Qin, Hong
2009-01-01
In this paper, we present a novel computational modeling and simulation framework based on dynamic spherical volumetric simplex splines. The framework can handle the modeling and simulation of genus-zero objects with real physical properties. In this framework, we first develop an accurate and efficient algorithm to reconstruct the high-fidelity digital model of a real-world object with spherical volumetric simplex splines which can represent with accuracy geometric, material, and other properties of the object simultaneously. With the tight coupling of Lagrangian mechanics, the dynamic volumetric simplex splines representing the object can accurately simulate its physical behavior because it can unify the geometric and material properties in the simulation. The visualization can be directly computed from the object’s geometric or physical representation based on the dynamic spherical volumetric simplex splines during simulation without interpolation or resampling. We have applied the framework for biomechanic simulation of brain deformations, such as brain shifting during the surgery and brain injury under blunt impact. We have compared our simulation results with the ground truth obtained through intra-operative magnetic resonance imaging and the real biomechanic experiments. The evaluations demonstrate the excellent performance of our new technique. PMID:20161636
Clinical simulation practise framework.
Khalili, Hossein
2015-02-01
Historically, simulation has mainly been used to teach students hands-on skills in a relatively safe environment. With changes in the patient population, professional regulations and clinical environments, clinical simulation practise (CSP) must assist students to integrate and apply their theoretical knowledge and skills with their critical thinking, clinical judgement, prioritisation, problem solving, decision making, and teamwork skills to provide holistic care and treatment to their patients. CSP holds great potential to derive a positive transformation in students' transition into the workplace, by associating and consolidating learning from classrooms to clinical settings, and creating bridges between theory and practice. For CSP to be successful in filling the gap, the design and management of the simulation is crucial. In this article a new framework called 'Clinical simulation practise framework: A knowledge to action strategy in health professional education' is being introduced that aims to assist educators and curriculum developers in designing and managing their simulations. This CSP framework theorises that simulation as an experiential educational tool could improve students' competence, confidence and collaboration in performing professional practice in real settings if the CSP provides the following three dimensions: (1) a safe, positive, reflective and fun simulated learning environment; (2) challenging, but realistic, and integrated simulated scenarios; and (3) interactive, inclusive, interprofessional patient-centred simulated practise. © 2015 John Wiley & Sons Ltd.
Frameworks for Assessing the Quality of Modeling and Simulation Capabilities
NASA Astrophysics Data System (ADS)
Rider, W. J.
2012-12-01
The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provide an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high-consequence applications. Perhaps the first of these frameworks was known as CSAU (code scaling, applicability and uncertainty) [Boyack], used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law, which made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace underground testing. The emphasis on modeling and simulation necessitated attention to the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance. All existing frameworks are incomplete and need to be extended by incorporating elements from the others, as well as new elements related to how models are solved and how the model will be applied. I will describe this merger of approaches and how it should be applied. The problems in adoption are related to basic human nature, in that no one likes to be graded or told they are not sufficiently quality-oriented. Rather than engage in an adversarial role, I suggest that the frameworks be viewed as a collaborative tool: they should be used to structure collaborations that assist modeling and simulation efforts to be of high quality. The framework provides a comprehensive setting of modeling and simulation themes that should be explored in providing high quality. W. Oberkampf, M. Pilch, and T. Trucano, Predictive Capability Maturity Model for Computational Modeling and Simulation, SAND2007-5948, 2007. B. Boyack, Quantifying Reactor Safety Margins Part 1: An Overview of the Code Scaling, Applicability, and Uncertainty Evaluation Methodology, Nucl. Eng. Design, 119, pp. 1-15, 1990. National Aeronautics and Space Administration, Standard for Models and Simulations, NASA-STD-7009, 2008. Y. Ben-Haim and F. Hemez, Robustness, Fidelity and Prediction-Looseness of Models, Proc. R. Soc. A, 468, pp. 227-244, 2012.
Framework for Development of Object-Oriented Software
NASA Technical Reports Server (NTRS)
Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan
2004-01-01
The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.
Fluid-structure interaction simulations of deformable structures with non-linear thin shell elements
NASA Astrophysics Data System (ADS)
Asgharzadeh, Hafez; Hedayat, Mohammadali; Borazjani, Iman; Scientific Computing; Biofluids Laboratory Team
2017-11-01
Large deformation of structures in a fluid is simulated using a strongly coupled partitioned fluid-structure interaction (FSI) approach which is stabilized with under-relaxation and the Aitken acceleration technique. The fluid is simulated using a recently developed implicit Newton-Krylov method with a novel analytical Jacobian. Structures are simulated using a triangular thin-shell finite element formulation, which considers only translational degrees of freedom. The thin-shell method is developed on top of a previously implemented membrane finite element formulation. A sharp interface immersed boundary method is used to handle structures in the fluid domain. The developed FSI framework is validated against two three-dimensional experiments: (1) flexible aquatic vegetation in a fluid and (2) a heaving flexible panel in a fluid. Furthermore, the developed FSI framework is used to simulate tissue heart valves, which involve large deformations and non-linear material properties. This work was supported by American Heart Association (AHA) Grant 13SDG17220022 and the Center of Computational Research (CCR) of the University at Buffalo.
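A compact sketch of the strongly coupled partitioned iteration with dynamic Aitken under-relaxation, one common stabilizer of such FSI coupling, is given below; fluid_solve and solid_solve are hypothetical interface solvers, not the authors' implementation.

```python
import numpy as np

def coupled_fsi_step(fluid_solve, solid_solve, d0, tol=1e-8, max_iter=50):
    """Strongly coupled partitioned FSI iteration with dynamic Aitken
    under-relaxation of the interface displacement."""
    d = d0.copy()
    omega = 0.5                       # initial relaxation factor
    r_old = None
    for _ in range(max_iter):
        traction = fluid_solve(d)     # fluid solve with current interface shape
        d_new = solid_solve(traction) # structure solve under fluid loads
        r = d_new - d                 # interface residual
        if np.linalg.norm(r) < tol:
            break
        if r_old is not None:         # Aitken update of the relaxation factor
            dr = r - r_old
            omega = -omega * (r_old @ dr) / (dr @ dr)
        d = d + omega * r             # under-relaxed displacement update
        r_old = r
    return d
```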
Numerical Propulsion System Simulation Architecture
NASA Technical Reports Server (NTRS)
Naiman, Cynthia G.
2004-01-01
The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.
The LSST metrics analysis framework (MAF)
NASA Astrophysics Data System (ADS)
Jones, R. L.; Yoachim, Peter; Chandrasekharan, Srinivasan; Connolly, Andrew J.; Cook, Kem H.; Ivezic, Željko; Krughoff, K. S.; Petry, Catherine; Ridgway, Stephen T.
2014-07-01
We describe the Metrics Analysis Framework (MAF), an open-source Python framework developed to provide a user-friendly, customizable, easily-extensible set of tools for analyzing data sets. MAF is part of the Large Synoptic Survey Telescope (LSST) Simulations effort. Its initial goal is to provide a tool to evaluate LSST Operations Simulation (OpSim) simulated surveys to help understand the effects of telescope scheduling on survey performance; however, MAF can be applied to a much wider range of datasets. The building blocks of the framework are Metrics (algorithms to analyze a given quantity of data), Slicers (subdividing the overall data set into smaller data slices as relevant for each Metric), and Database classes (to access the dataset and read data into memory). We describe how these building blocks work together, and provide an example of using MAF to evaluate different dithering strategies. We also outline how users can write their own custom Metrics and use these within the framework.
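To convey the Metric/Slicer decomposition, here is a schematic Python sketch; the class and method names are illustrative of the pattern only and are not the exact MAF interface.

```python
import numpy as np

class MeanSeeingMetric:
    """Schematic Metric in the MAF spirit: reduce one data slice to a scalar.
    (Names here are illustrative, not the actual MAF class interface.)"""
    def __init__(self, seeing_col="seeing"):
        self.seeing_col = seeing_col

    def run(self, data_slice):
        return np.mean(data_slice[self.seeing_col])

class FieldSlicer:
    """Schematic Slicer: subdivide the survey pointings into per-field slices."""
    def __init__(self, region_col="fieldId"):
        self.region_col = region_col

    def slices(self, data):
        for region in np.unique(data[self.region_col]):
            yield data[data[self.region_col] == region]

# Framework core: apply every metric to every slice of the simulated survey,
# where `data` is a structured array read in by a Database-like class.
def run_metrics(data, slicer, metrics):
    return [[m.run(s) for m in metrics] for s in slicer.slices(data)]
```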
Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast
Pang, Wei; Coghill, George M.
2015-01-01
In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377
NASA Astrophysics Data System (ADS)
Esposti Ongaro, T.; Barsotti, S.; de'Michieli Vitturi, M.; Favalli, M.; Longo, A.; Nannipieri, L.; Neri, A.; Papale, P.; Saccorotti, G.
2009-12-01
Physical and numerical modelling is becoming of increasing importance in volcanology and volcanic hazard assessment. However, new interdisciplinary problems arise when dealing with complex mathematical formulations, numerical algorithms and their implementations on modern computer architectures. Therefore new frameworks are needed for sharing knowledge, software codes, and datasets among scientists. Here we present the Volcano Modelling and Simulation gateway (VMSg, accessible at http://vmsg.pi.ingv.it), a new electronic infrastructure for promoting knowledge growth and transfer in the field of volcanological modelling and numerical simulation. The new web portal, developed in the framework of former and ongoing national and European projects, is based on a dynamic Content Manager System (CMS) and was developed to host and present numerical models of the main volcanic processes and relationships including magma properties, magma chamber dynamics, conduit flow, plume dynamics, pyroclastic flows, lava flows, etc. Model applications, numerical code documentation, simulation datasets as well as model validation and calibration test-cases are also part of the gateway material.
NASA Astrophysics Data System (ADS)
Hamid, A. H. A.; Rozan, M. Z. A.; Deris, S.; Ibrahim, R.; Abdullah, W. S. W.; Rahman, A. A.; Yunus, M. N. M.
2016-01-01
The evolution of the current Radiation and Nuclear Emergency Planning Framework (RANEPF) simulator emphasizes the human factors to be analyzed and interpreted according to the stakeholders' tacit and explicit knowledge. These human factor criteria are analyzed and interpreted according to sense-making theory and the Disaster Emergency Response Management Information System (DERMIS) design premises, and are corroborated by statistical criteria. In recent findings, there were no differences in the distributions among stakeholders according to gender or organizational expertise. Stakeholders incrementally accepted and agreed with the research elements indicated in the respective emergency planning frameworks and simulator (i.e., 78.18 to 84.32, p-value < 0.05). This paper suggests that these human factor criteria, with the associated analyses and theoretical perspectives, be further accommodated in future simulator development, in conjunction with the proposed hypothesis building of the process factors and responses diagram. We propose that future work add functionality to the simulator so that it serves as strategized, condensed, concise and comprehensive public disaster preparedness and intervention guidance in a useful and efficient computer simulation.
Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L.
2014-01-01
The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems, which uses high-level abstractions to capture functional requirements in an executable manner, and which automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging very promising application area for embedded systems. However, there is a lack of tools in this area, which would allow an application developer to model a WSN application by using high level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083
Simulating human behavior for national security human interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, Michael Lewis; Hart, Dereck H.; Verzi, Stephen J.
2007-01-01
This 3-year research and development effort focused on what we believe is a significant technical gap in existing modeling and simulation capabilities: the representation of plausible human cognition and behaviors within a dynamic, simulated environment. Specifically, the intent of the ''Simulating Human Behavior for National Security Human Interactions'' project was to demonstrate an initial simulated human modeling capability that realistically represents intra- and inter-group interaction behaviors between simulated humans and human-controlled avatars as they respond to their environment. Significant progress was made towards simulating human behaviors through the development of a framework that produces realistic characteristics and movement. The simulated humans were created from models designed to be psychologically plausible by being based on robust psychological research and theory. Progress was also made towards enhancing Sandia National Laboratories' existing cognitive models to support culturally plausible behaviors that are important in representing group interactions. These models were implemented in the modular, interoperable, and commercially supported Umbra® simulation framework.
Moore, Jason H; Amos, Ryan; Kiralis, Jeff; Andrews, Peter C
2015-01-01
Simulation plays an essential role in the development of new computational and statistical methods for the genetic analysis of complex traits. Most simulations start with a statistical model using methods such as linear or logistic regression that specify the relationship between genotype and phenotype. This is appealing due to its simplicity and because these statistical methods are commonly used in genetic analysis. It is our working hypothesis that simulations need to move beyond simple statistical models to more realistically represent the biological complexity of genetic architecture. The goal of the present study was to develop a prototype genotype–phenotype simulation method and software that are capable of simulating complex genetic effects within the context of a hierarchical biology-based framework. Specifically, our goal is to simulate multilocus epistasis or gene–gene interaction where the genetic variants are organized within the framework of one or more genes, their regulatory regions and other regulatory loci. We introduce here the Heuristic Identification of Biological Architectures for simulating Complex Hierarchical Interactions (HIBACHI) method and prototype software for simulating data in this manner. This approach combines a biological hierarchy, a flexible mathematical framework, a liability threshold model for defining disease endpoints, and a heuristic search strategy for identifying high-order epistatic models of disease susceptibility. We provide several simulation examples using genetic models exhibiting independent main effects and three-way epistatic effects. PMID:25395175
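As a simple illustration of the liability threshold model with a higher-order epistatic term, the following Python snippet simulates case-control status from three SNPs; the effect sizes and the specific genotype pattern are arbitrary illustrations, not HIBACHI's models.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
g1, g2, g3 = (rng.binomial(2, 0.3, n) for _ in range(3))   # SNP genotypes (0/1/2)

# Liability = main effects + a three-way epistatic term + normal noise;
# disease status is assigned above a threshold chosen for 10% prevalence.
liability = 0.2 * g1 + 0.1 * g2 + 0.4 * (g1 * g2 * g3 == 4) + rng.normal(0, 1, n)
threshold = np.quantile(liability, 0.9)
case = (liability > threshold).astype(int)
print(case.mean())   # ~0.10 prevalence
```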
James B. McCarter; Sean Healey
2015-01-01
The Forest Carbon Management Framework (ForCaMF) integrates Forest Inventory and Analysis (FIA) plot inventory data, disturbance histories, and carbon response trajectories to develop estimates of disturbance and management effects on carbon pools for the National Forest System. All appropriate FIA inventory plots are simulated using the Forest Vegetation Simulator (...
Benjamin Wang; Robert E. Manning; Steven R. Lawson; William A. Valliere
2001-01-01
Recent research and management experience has led to several frameworks for defining and managing carrying capacity of national parks and related areas. These frameworks rely on monitoring indicator variables to ensure that standards of quality are maintained. The objective of this study was to develop a computer simulation model to estimate the relationships between...
Interventional radiology virtual simulator for liver biopsy.
Villard, P F; Vidal, F P; ap Cenydd, L; Holbrey, R; Pisharody, S; Johnson, S; Bulpitt, A; John, N W; Bello, F; Gould, D
2014-03-01
Training in Interventional Radiology currently uses the apprenticeship model, where clinical and technical skills of invasive procedures are learnt during practice in patients. This apprenticeship training method is increasingly limited by regulatory restrictions on working hours, concerns over patient risk through trainees' inexperience and the variable exposure to case mix and emergencies during training. To address this, we have developed a computer-based simulation of visceral needle puncture procedures. A real-time framework has been built that includes: segmentation, physically based modelling, haptics rendering, pseudo-ultrasound generation and the concept of a physical mannequin. It is the result of a close collaboration between different universities, involving computer scientists, clinicians, clinical engineers and occupational psychologists. The technical implementation of the framework is a robust and real-time simulation environment combining a physical platform and an immersive computerized virtual environment. The face, content and construct validation have been previously assessed, showing the reliability and effectiveness of this framework, as well as its potential for teaching visceral needle puncture. A simulator for ultrasound-guided liver biopsy has been developed. It includes functionalities and metrics extracted from cognitive task analysis. This framework can be useful during training, particularly given the known difficulties in gaining significant practice of core skills in patients.
Design of a framework for modeling, integration and simulation of physiological models.
Erson, E Zeynep; Cavuşoğlu, M Cenk
2012-09-01
Multiscale modeling and integration of physiological models carry challenges due to the complex nature of physiological processes. High coupling within and among scales presents a significant challenge in constructing and integrating multiscale physiological models. In order to deal with such challenges in a systematic way, there is a significant need for an information technology framework, together with related analytical and computational tools, that will facilitate integration of models and simulations of complex biological systems. The Physiological Model Simulation, Integration and Modeling Framework (Phy-SIM) is an information technology framework providing the tools to facilitate development, integration and simulation of integrated models of human physiology. Phy-SIM brings software-level solutions to the challenges raised by the complex nature of physiological systems. The aim of Phy-SIM, and of this paper, is to lay a foundation with new approaches such as information flow and modular representation of the physiological models. The ultimate goal is to enhance the development of both the models and the integration approaches of multiscale physiological processes, and thus this paper focuses on the design approaches that would achieve such a goal. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, Yaoze; Engel, Bernard A.; Flanagan, Dennis C.; Gitau, Margaret W.; McMillan, Sara K.; Chaubey, Indrajeet; Singh, Shweta
2018-05-01
Best management practices (BMPs) are popular approaches used to improve hydrology and water quality. Uncertainties in BMP effectiveness over time may result in overestimating long-term efficiency in watershed planning strategies. To represent varying long-term BMP effectiveness in hydrologic/water quality models, a high-level, forward-looking modeling framework was developed. The components in the framework consist of establishment period efficiency, starting efficiency, efficiency for each storm event, efficiency between maintenance, and efficiency over the life cycle. Combined, they represent long-term efficiency for a specific type of practice and specific environmental concern (runoff/pollutant). An approach for possible implementation of the framework was discussed. The long-term impacts of grass buffer strips (agricultural BMP) and bioretention systems (urban BMP) in reducing total phosphorus were simulated to demonstrate the framework. Data gaps were captured in estimating the long-term performance of the BMPs. A Bayesian method was used to match the simulated distribution of long-term BMP efficiencies with the observed distribution, with the assumption that the observed data represented long-term BMP efficiencies. The simulated distribution matched the observed distribution well, with only small total predictive uncertainties. With additional data, the same method can be used to further improve the simulation results. The modeling framework and results of this study, which can be adopted in hydrologic/water quality models to better represent long-term BMP effectiveness, can help improve decision support systems for creating long-term stormwater management strategies for watershed management projects.
NASA Astrophysics Data System (ADS)
Destyanto, A. R.; Putri, O. A.; Hidayatno, A.
2017-11-01
Given the advantages that serious simulation games offer, many areas of study, including energy, have used them as instruments. However, serious simulation games in the field of energy transition have received little attention. In this study, a serious simulation game is developed and tested as a public education activity about energy transition, namely a program of conversion from oil to natural gas. The aim of the game is to create understanding and awareness of the importance of energy transition for society, thereby accelerating the process of energy transition in Indonesia, where the program launched in 1987 has not yet achieved its conversion target owing to a lack of public education about energy transition. Developed as a digital serious simulation game following the integrated game design framework, the Transergy game was tested with 15 users and then analysed. The results of the game's verification and validation show that Transergy helps users understand, and raises their awareness of, the need for oil-to-natural-gas conversion.
Real time flight simulation methodology
NASA Technical Reports Server (NTRS)
Parrish, E. A.; Cook, G.; Mcvey, E. S.
1977-01-01
Substitutional methods for digitization, input signal-dependent integrator approximations, and digital autopilot design were developed. The software framework of a simulator design package is described. Included are subroutines for iterative designs of simulation models and a rudimentary graphics package.
A framework of knowledge creation processes in participatory simulation of hospital work systems.
Andersen, Simone Nyholm; Broberg, Ole
2017-04-01
Participatory simulation (PS) is a method to involve workers in simulating and designing their own future work system. Existing PS studies have focused on analysing the outcome, and minimal attention has been devoted to the process of creating this outcome. In order to study this process, we suggest applying a knowledge creation perspective. The aim of this study was to develop a framework describing the process of how ergonomics knowledge is created in PS. Video recordings from three projects applying PS of hospital work systems constituted the foundation of process mining analysis. The analysis resulted in a framework revealing the sources of ergonomics knowledge creation as sequential relationships between the activities of simulation participants sharing work experiences; experimenting with scenarios; and reflecting on ergonomics consequences. We argue that this framework reveals the hidden steps of PS that are essential when planning and facilitating PS that aims at designing work systems. Practitioner Summary: When facilitating participatory simulation (PS) in work system design, achieving an understanding of the PS process is essential. By applying a knowledge creation perspective and process mining, we investigated the knowledge-creating activities constituting the PS process. The analysis resulted in a framework of the knowledge-creating process in PS.
Peter, Silvia; Modregger, Peter; Fix, Michael K.; Volken, Werner; Frei, Daniel; Manser, Peter; Stampanoni, Marco
2014-01-01
Phase-sensitive X-ray imaging shows a high sensitivity towards electron density variations, making it well suited for imaging of soft tissue matter. However, there are still open questions about the details of the image formation process. Here, a framework for numerical simulations of phase-sensitive X-ray imaging is presented which takes both particle- and wave-like properties of X-rays into consideration: a split approach combines a Monte Carlo (MC)-based sample part with a wave-optics-based propagation part. The framework can be adapted to different phase-sensitive imaging methods and has been validated through comparisons with experiments for grating interferometry and propagation-based imaging. The validation shows that the combination of wave optics and MC has been successfully implemented and yields good agreement between measurements and simulations. This demonstrates that the physical processes relevant for developing a deeper understanding of scattering in the context of phase-sensitive imaging are modelled in a sufficiently accurate manner. The framework can be used for the simulation of phase-sensitive X-ray imaging, for instance grating interferometry or propagation-based imaging. PMID:24763652
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick
The framework created through the Open-Source Integrated Design-Analysis Environment (IDAE) for Nuclear Energy Advanced Modeling & Simulation grant has simplified and democratized advanced modeling and simulation in the nuclear energy industry across a range of nuclear engineering applications. It leverages millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and the Office of Nuclear Energy's research and development. The IDAE framework enhanced Kitware's Computational Model Builder (CMB) while leveraging existing open-source toolkits, creating a graphical end-to-end umbrella that guides end-users and developers through the nuclear energy advanced modeling and simulation lifecycle. In addition, the work delivered strategic advancements in meshing and visualization for ensembles.
NASA Technical Reports Server (NTRS)
Afjeh, Abdollah A.; Reed, John A.
2003-01-01
This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process, and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.
Abdelgaied, A; Fisher, J; Jennings, L M
2018-02-01
A more robust pre-clinical wear simulation framework is required in order to simulate the wider and higher ranges of activities observed in different patient populations, such as younger, more active patients. Such a framework will help to understand and address the reported higher failure rates for younger and more active patients (National Joint Registry, 2016). The current study has developed and validated a comprehensive combined experimental and computational framework for pre-clinical wear simulation of total knee replacements (TKR). The input mechanical (elastic modulus and Poisson's ratio) and wear parameters of the moderately cross-linked ultra-high molecular weight polyethylene (UHMWPE) bearing material were independently measured in experimental studies under realistic test conditions, similar to the loading conditions found in total knee replacements. The wear predictions from the computational wear simulation were validated against direct experimental wear measurements for size 3 Sigma curved total knee replacements (DePuy, UK) in an independent experimental wear simulation study under three different daily activities: walking, deep squat, and stair-ascending kinematic conditions. The measured compressive mechanical properties of the moderately cross-linked UHMWPE material were more than 20% lower than those reported in the literature under tensile test conditions. The pin-on-plate wear coefficient of moderately cross-linked UHMWPE was significantly dependent on the contact stress and the degree of cross-shear at the articulating surfaces. The computational wear predictions for the TKR from the current framework were consistent and in good agreement with the independent full TKR experimental wear simulation measurements, with a coefficient of determination of 0.94. In addition, the comprehensive combined experimental and computational framework was able to explain the complex experimental wear trends from the three different daily activities investigated. Therefore, such a framework can be adopted as a pre-clinical simulation approach to optimise different designs and materials, as well as patient-specific total knee replacements, for a range of activities. Copyright © 2017. Published by Elsevier Ltd.
Metascalable molecular dynamics simulation of nano-mechano-chemistry
NASA Astrophysics Data System (ADS)
Shimojo, F.; Kalia, R. K.; Nakano, A.; Nomura, K.; Vashishta, P.
2008-07-01
We have developed a metascalable (or 'design once, scale on new architectures') parallel application-development framework for first-principles based simulations of nano-mechano-chemical processes on emerging petaflops architectures based on spatiotemporal data locality principles. The framework consists of (1) an embedded divide-and-conquer (EDC) algorithmic framework based on spatial locality to design linear-scaling algorithms, (2) a space-time-ensemble parallel (STEP) approach based on temporal locality to predict long-time dynamics, and (3) a tunable hierarchical cellular decomposition (HCD) parallelization framework to map these scalable algorithms onto hardware. The EDC-STEP-HCD framework exposes and expresses maximal concurrency and data locality, thereby achieving parallel efficiency as high as 0.99 for 1.59-billion-atom reactive force field molecular dynamics (MD) and 17.7-million-atom (1.56 trillion electronic degrees of freedom) quantum mechanical (QM) MD in the framework of the density functional theory (DFT) on adaptive multigrids, in addition to 201-billion-atom nonreactive MD, on 196 608 IBM BlueGene/L processors. We have also used the framework for automated execution of adaptive hybrid DFT/MD simulation on a grid of six supercomputers in the US and Japan, in which the number of processors changed dynamically on demand and tasks were migrated according to unexpected faults. The paper presents the application of the framework to the study of nanoenergetic materials: (1) combustion of an Al/Fe2O3 thermite and (2) shock initiation and reactive nanojets at a void in an energetic crystal.
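As a rough illustration of the spatial-locality principle behind the EDC algorithmic framework, the sketch below builds a linked-cell decomposition, the standard device for linear-scaling neighbour searches in molecular dynamics. The function names, box size and cutoff are illustrative assumptions, not taken from the authors' code.

```python
import numpy as np

def build_cell_list(positions, box, rcut):
    """Assign particles to cells of edge >= rcut so a neighbour search
    only needs to visit the 27 surrounding cells (linear scaling in N)."""
    ncell = np.maximum((box // rcut).astype(int), 1)   # cells per dimension
    cell_size = box / ncell
    idx = np.floor(positions / cell_size).astype(int) % ncell
    flat = np.ravel_multi_index(idx.T, ncell)          # 3D cell index -> scalar
    order = np.argsort(flat)                           # group particles by cell
    return flat[order], order, ncell

rng = np.random.default_rng(0)
box = np.array([10.0, 10.0, 10.0])
pos = rng.uniform(0, 10, size=(1000, 3))
flat, order, ncell = build_cell_list(pos, box, rcut=2.5)
print(ncell, flat[:10])
```

Grouping particles cell by cell is also what makes the hierarchical decomposition map well onto hardware: each cell (or block of cells) becomes an independent unit of work.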
Prototype software model for designing intruder detection systems with simulation
NASA Astrophysics Data System (ADS)
Smith, Jeffrey S.; Peters, Brett A.; Curry, James C.; Gupta, Dinesh
1998-08-01
This article explores using discrete-event simulation for the design and control of defence-oriented, fixed-sensor-based detection systems in a facility housing items of significant interest to enemy forces. The key issues discussed include software development, simulation-based optimization within a modeling framework, and the expansion of the framework to create real-time control tools and training simulations. The software discussed in this article is a flexible simulation environment in which the data for the simulation are stored in an external database and the simulation logic is implemented using a commercial simulation package. The simulation assesses the overall security level of a building against various intruder scenarios. A series of simulation runs with different inputs can determine the change in security level with changes in the sensor configuration, building layout, and intruder/guard strategies. In addition, the simulation model developed for the design stage of the project can be modified to produce a control tool for the testing, training, and real-time control of systems with humans and sensor hardware in the loop.
Lance A. Vickers; David R. Larsen; John M. Kabrick; Daniel C. Dey; Benjamin O. Knapp
2016-01-01
Predicting the effects of silvicultural choices on tree regeneration has traditionally been difficult with the tools currently available to foresters. In an effort to improve this, we have developed a simulation framework based on hypotheses of stand dynamics for several species found in the Missouri Ozarks. This framework includes separate modules for establishment,...
Simulating Poverty and Inequality Dynamics in Developing Countries
ERIC Educational Resources Information Center
Ansoms, An; Geenen, Sara
2012-01-01
This article considers how the simulation game of DEVELOPMENT MONOPOLY provides insight into poverty and inequality dynamics in a development context. It first discusses how the game is rooted in theoretical and conceptual frameworks on poverty and inequality. Subsequently, it reflects on selected playing experiences, with special focus on the…
A multi-GPU real-time dose simulation software framework for lung radiotherapy.
Santhanam, A P; Min, Y; Neelakkantan, H; Papp, N; Meeks, S L; Kupelian, P A
2012-09-01
Medical simulation frameworks facilitate both the preoperative and postoperative analysis of the patient's pathophysical condition. Of particular importance is the simulation of radiation dose delivery for real-time radiotherapy monitoring and retrospective analyses of the patient's treatment. In this paper, a software framework tailored for the development of simulation-based real-time radiation dose monitoring medical applications is discussed. A multi-GPU-based computational framework coupled with inter-process communication methods is introduced for simulating the radiation dose delivery on a deformable 3D volumetric lung model and its real-time visualization. The model deformation and the corresponding dose calculation are allocated among the GPUs in a task-specific manner and executed as a pipeline. Radiation dose calculations are computed on two different GPU hardware architectures. The integration of this computational framework with a front-end software layer and back-end patient database repository is also discussed. Real-time simulation of the dose delivered is achieved once every 120 ms using the proposed framework. With a linear increase in the number of GPU cores, the computational time of the simulation decreased linearly. The inter-process communication time also improved with an increase in the hardware memory. Variations in the delivered dose and computational speedup for variations in the data dimensions were investigated using D70 and D90 as well as gEUD as metrics for a set of 14 patients. Computational speed-up increased with an increase in the beam dimensions when compared with CPU-based commercial software, while the error in the dose calculation was <1%. Our analyses show that the framework applied to deformable lung model-based radiotherapy is an effective tool for performing both real-time and retrospective analyses.
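The task-specific, pipelined allocation described above can be sketched in miniature with two communicating processes, one standing in for the deformation GPU and one for the dose GPU. This is a hedged illustration only: the kernels below are placeholders, and real GPU dispatch would replace the CPU worker bodies.

```python
import multiprocessing as mp

def deform_stage(frames, q):
    """Stage 1 (would run on GPU 1): deform the lung model per tracked frame."""
    for t in range(frames):
        model = {"t": t, "displacement": 0.1 * t}   # placeholder deformation
        q.put(model)
    q.put(None)                                     # sentinel: stream finished

def dose_stage(q, out):
    """Stage 2 (would run on GPU 2): accumulate dose on each deformed model."""
    total = 0.0
    while (model := q.get()) is not None:
        total += 1.0 - model["displacement"] * 0.01 # placeholder dose kernel
    out.put(total)

if __name__ == "__main__":
    q, out = mp.Queue(maxsize=4), mp.Queue()        # bounded queue = pipeline
    procs = [mp.Process(target=deform_stage, args=(100, q)),
             mp.Process(target=dose_stage, args=(q, out))]
    for p in procs:
        p.start()
    print("accumulated dose:", out.get())
    for p in procs:
        p.join()
```

The bounded queue is the essential pipelining device: the deformation stage can run at most a few frames ahead of the dose stage, so both stages stay busy concurrently.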
DOT National Transportation Integrated Search
2002-08-01
Building upon the conceptual framework developed during our year one research, a container port and multimodal transportation demand simulation model is applied. The model selects the least-cost (vessel-port-rail-truck) route from sources to markets,...
Efficient evaluation of wireless real-time control networks.
Horvath, Peter; Yampolskiy, Mark; Koutsoukos, Xenofon
2015-02-11
In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments that are required to simulate cyber-physical systems relying on multi-channel, multi-hop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology, which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated in the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitoring applications.
Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.
Pang, Wei; Coghill, George M
2015-05-01
In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling the dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
Developing a Problem-Based Learning Simulation: An Economics Unit on Trade
ERIC Educational Resources Information Center
Maxwell, Nan L.; Mergendoller, John R.; Bellisimo, Yolanda
2004-01-01
This article argues that the merger of simulations and problem-based learning (PBL) can enhance both active-learning strategies. Simulations benefit by using a PBL framework to promote student-directed learning and problem-solving skills to explain a simulated dilemma with multiple solutions. PBL benefits because simulations structure the…
Libin, Alexander; Lauderdale, Manon; Millo, Yuri; Shamloo, Christine; Spencer, Rachel; Green, Brad; Donnellan, Joyce; Wellesley, Christine; Groah, Suzanne
2010-04-01
Simulation- and video game-based role-playing techniques have been proven effective in changing behavior and enhancing positive decision making in a variety of professional settings, including education, the military, and health care. Although the need for developing assessment frameworks for learning outcomes has been clearly defined, there is a significant gap between the variety of existing multimedia-based instruction and technology-mediated learning systems and the number of reliable assessment algorithms. This study, based on a mixed methodology research design, aims to develop an embedded assessment algorithm, a Knowledge Assessment Module (NOTE), to capture both user interaction with the educational tool and knowledge gained from the training. The study is regarded as the first step in developing an assessment framework for a multimedia educational tool for health care professionals, Anatomy of Care (AOC), that utilizes Virtual Experience Immersive Learning Simulation (VEILS) technology. Ninety health care personnel of various backgrounds took part in online AOC training, choosing from five possible scenarios presenting difficult situations of everyday care. The results suggest that although the simulation-based training tool demonstrated partial effectiveness in improving learners' decision-making capacity, a differential learner-oriented approach might be more effective and capable of synchronizing educational efforts with identifiable relevant individual factors such as sociobehavioral profile and professional background.
A Solution Framework for Environmental Characterization Problems
This paper describes experiences developing a grid-enabled framework for solving environmental inverse problems. The solution approach taken here couples environmental simulation models with global search methods and requires readily available computational resources of the grid ...
An open-source job management framework for parameter-space exploration: OACIS
NASA Astrophysics Data System (ADS)
Murase, Y.; Uchitane, T.; Ito, N.
2017-11-01
We present an open-source software framework for parameter-space exploration, named OACIS, which is useful for managing vast numbers of simulation jobs and results in a systematic way. Recent development of high-performance computers has enabled us to explore parameter spaces comprehensively; in such cases, however, manual management of the workflow is practically impossible. OACIS was developed to reduce the cost of these repetitive tasks when conducting simulations by automating job submission and data management. In this article, an overview of OACIS as well as a getting-started guide are presented.
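The bookkeeping that such a job manager automates (run one job per parameter point, skip points already computed, archive results keyed by parameters) can be conveyed with a short, generic Python sketch. Note that this is not OACIS's own interface; the directory layout and the faked simulator output here are hypothetical stand-ins.

```python
import itertools, json, hashlib
from pathlib import Path

def run_point(params, workdir=Path("runs")):
    """Run (or skip) one simulation job, archiving results keyed by parameters."""
    key = hashlib.md5(json.dumps(params, sort_keys=True).encode()).hexdigest()
    rundir = workdir / key
    result_file = rundir / "result.json"
    if result_file.exists():                   # skip already-computed points
        return json.loads(result_file.read_text())
    rundir.mkdir(parents=True, exist_ok=True)
    # A real manager would submit a job here (e.g. a scheduler or
    # subprocess call); we fake an output so the sketch stays runnable.
    result = {"params": params, "output": sum(params.values())}
    result_file.write_text(json.dumps(result))
    return result

grid = {"alpha": [0.1, 0.2], "beta": [1, 2, 4]}
for values in itertools.product(*grid.values()):   # full factorial sweep
    print(run_point(dict(zip(grid.keys(), values))))
```

Hashing the parameter dictionary gives each point a stable identity, which is what makes restarts and incremental exploration cheap.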
A software framework for pipelined arithmetic algorithms in field programmable gate arrays
NASA Astrophysics Data System (ADS)
Kim, J. B.; Won, E.
2018-03-01
Pipelined algorithms implemented in field programmable gate arrays are extensively used for hardware triggers in the modern experimental high energy physics field and the complexity of such algorithms increases rapidly. For development of such hardware triggers, algorithms are developed in C++, ported to hardware description language for synthesizing firmware, and then ported back to C++ for simulating the firmware response down to the single bit level. We present a C++ software framework which automatically simulates and generates hardware description language code for pipelined arithmetic algorithms.
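To convey what "simulating the firmware response down to the single bit level" means in practice, here is a hedged Python sketch of a three-stage multiply-accumulate pipeline with explicit registers and a 16-bit mask. It illustrates the style of bit-accurate software model such a framework generates; it is not the framework's actual code, and the stage behaviour is invented for the example.

```python
MASK16 = (1 << 16) - 1   # emulate a 16-bit datapath

def pipelined_mac(xs, ys, stages=3):
    """Bit-accurate model of a 3-stage multiply-accumulate pipeline:
    stage 0 multiplies, stage 1 scales/truncates, stage 2 accumulates."""
    pipe = [0] * stages              # one register per pipeline stage
    acc, out = 0, []
    for x, y in list(zip(xs, ys)) + [(0, 0)] * stages:   # flush with zeros
        # Registers update back-to-front to mimic simultaneous clocking.
        acc = (acc + pipe[2]) & MASK16       # stage 2: accumulate
        pipe[2] = pipe[1] >> 1               # stage 1: scale/truncate
        pipe[1] = pipe[0] & MASK16           # register the product
        pipe[0] = (x & MASK16) * (y & MASK16)  # stage 0: multiply
        out.append(acc)
    return out

print(pipelined_mac([1, 2, 3], [4, 5, 6])[-1])   # result after pipeline flush
```

Updating the registers from the last stage backwards is the standard software trick for emulating the simultaneous register transfer of a clocked pipeline, and it is exactly the behaviour that must agree bit-for-bit between the C++ model and the synthesized HDL.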
NASA Technical Reports Server (NTRS)
Plitau, Denis; Prasad, Narasimha S.
2012-01-01
The Active Sensing of CO2 Emissions over Nights, Days and Seasons (ASCENDS) mission recommended by the NRC Decadal Survey has a desired accuracy of 0.3% in carbon dioxide mixing ratio (XCO2) retrievals, requiring careful selection and optimization of the instrument parameters. NASA Langley Research Center (LaRC) is investigating the 1.57 micron carbon dioxide band as well as the 1.26-1.27 micron oxygen bands for the proposed ASCENDS mission requirements investigation. Simulation studies are underway for these bands to select optimum instrument parameters. The simulations are based on a multi-wavelength lidar modeling framework being developed at NASA LaRC to predict the performance of CO2 and O2 sensing from space and airborne platforms. The modeling framework consists of a lidar simulation module and a line-by-line calculation component with interchangeable lineshape routines to test the performance of alternative lineshape models in the simulations. Optionally, the line-by-line radiative transfer model (LBLRTM) program may be used for line-by-line calculations. The modeling framework is being used to perform error analysis, establish optimum measurement wavelengths, and identify the best lineshape models to be used in CO2 and O2 retrievals. Several additional programs for HITRAN database management and related simulations are planned for inclusion in the framework. A description of the modeling framework with selected results of the simulation studies for CO2 and O2 sensing is presented in this paper.
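At the heart of such a lidar performance model is a line-by-line transmission calculation. The sketch below applies the two-way Beer-Lambert law at an on-line and an off-line wavelength to form a differential absorption optical depth, the basic observable in this kind of CO2 sounding. The cross-sections, column density and path length are illustrative numbers, not ASCENDS parameters.

```python
import numpy as np

def two_way_transmission(sigma_on, sigma_off, n_co2, path_m):
    """Beer-Lambert two-way transmission at on-line/off-line wavelengths."""
    tau_on = sigma_on * n_co2 * path_m        # on-line optical depth
    tau_off = sigma_off * n_co2 * path_m      # off-line optical depth
    return np.exp(-2 * tau_on), np.exp(-2 * tau_off)

# Illustrative values only: cross-sections in m^2, number density in m^-3.
T_on, T_off = two_way_transmission(5e-27, 5e-29, 1e22, 8000.0)
dAOD = 0.5 * np.log(T_off / T_on)             # differential optical depth
print(f"differential absorption optical depth: {dAOD:.3f}")
```

In a full framework, the cross-sections themselves would come from a HITRAN-based line-by-line computation with a chosen lineshape model, which is exactly the interchangeable component the abstract describes.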
Next Generation Simulation Framework for Robotic and Human Space Missions
NASA Technical Reports Server (NTRS)
Cameron, Jonathan M.; Balaram, J.; Jain, Abhinandan; Kuo, Calvin; Lim, Christopher; Myint, Steven
2012-01-01
The Dartslab team at NASA's Jet Propulsion Laboratory (JPL) has a long history of developing physics-based simulations based on the Darts/Dshell simulation framework that have been used to simulate many planetary robotic missions, such as the Cassini spacecraft and the rovers that are currently driving on Mars. Recent collaboration efforts between the Dartslab team at JPL and the Mission Operations Directorate (MOD) at NASA Johnson Space Center (JSC) have led to significant enhancements to the Dartslab DSENDS (Dynamics Simulator for Entry, Descent and Surface landing) software framework. The new version of DSENDS is now being used for new planetary mission simulations at JPL. JSC is using DSENDS as the foundation for a suite of software known as COMPASS (Core Operations, Mission Planning, and Analysis Spacecraft Simulation) that is the basis for their new human space mission simulations and analysis. In this paper, we will describe the collaborative process with the JPL Dartslab and the JSC MOD team that resulted in the redesign and enhancement of the DSENDS software. We will outline the improvements in DSENDS that simplify creation of new high-fidelity robotic/spacecraft simulations. We will illustrate how DSENDS simulations are assembled and show results from several mission simulations.
BioASF: a framework for automatically generating executable pathway models specified in BioPAX.
Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K Anton; Abeln, Sanne; Heringa, Jaap
2016-06-15
Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF. Contact: j.heringa@vu.nl. © The Author 2016. Published by Oxford University Press.
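The discrete-event, multi-agent principle underlying such a framework can be conveyed with a toy sketch: agents watch an event queue and, when their input species appears, schedule production of their output after a delay. The species names and delays below are invented for illustration and are unrelated to BioASF's generated agent hierarchy.

```python
import heapq

class Agent:
    """Minimal pathway agent: fires its rule when its input species appears."""
    def __init__(self, name, consumes, produces, delay):
        self.name, self.consumes = name, consumes
        self.produces, self.delay = produces, delay

def simulate(agents, initial, horizon=50.0):
    events = [(0.0, s) for s in initial]          # (time, species) event queue
    heapq.heapify(events)
    while events:
        t, species = heapq.heappop(events)
        if t > horizon:
            break
        print(f"t={t:4.1f}  {species} present")
        for a in agents:                          # agents react to the event
            if a.consumes == species:
                heapq.heappush(events, (t + a.delay, a.produces))

# Toy two-step cascade, loosely in the spirit of a signalling pathway.
simulate([Agent("kinase", "Wnt", "beta-catenin", 2.0),
          Agent("tf", "beta-catenin", "target-gene", 3.0)],
         initial=["Wnt"])
```

The separation the abstract emphasizes is visible even here: the Agent objects encode the execution model, while the wiring (who consumes and produces what) is exactly the structural information a BioPAX model would supply.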
NASA Astrophysics Data System (ADS)
Sundberg, R.; Moberg, A.; Hind, A.
2012-08-01
A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records has been developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance or greenhouse gas concentrations. Two statistical tests have been formulated. Firstly, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Secondly, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The goal is to objectively rank several competing climate model simulations (e.g. with alternative model parameterizations or alternative forcing histories) by means of their goodness of fit to the unobservable true past climate variations, as estimated from noisy proxy data and instrumental observations.
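A minimal numerical sketch of the ranking idea, assuming a shared forced component of variability: each ensemble member gets a preliminary correlation screen and a quadratic distance to the proxy series. The real framework additionally models proxy noise, seasons and record lengths, and optimizes the distance measure; this toy omits all of that, and the synthetic "forcing" series is an invented placeholder.

```python
import numpy as np

def rank_simulations(proxy, sims):
    """Rank ensemble members by a quadratic distance to the proxy series,
    after a preliminary correlation screen for a shared forced signal."""
    scores = []
    for name, sim in sims.items():
        r = np.corrcoef(proxy, sim)[0, 1]      # preliminary correlation test
        d = np.mean((proxy - sim) ** 2)        # quadratic distance measure
        scores.append((d, r, name))
    return sorted(scores)                      # smallest distance first

rng = np.random.default_rng(1)
forcing = np.cumsum(rng.normal(size=200))          # stand-in forced signal
proxy = forcing + rng.normal(scale=2.0, size=200)  # noisy proxy of the truth
sims = {"forced": forcing + rng.normal(scale=1.0, size=200),
        "unforced": rng.normal(scale=2.0, size=200)}
for d, r, name in rank_simulations(proxy, sims):
    print(f"{name:9s} distance={d:6.2f} corr={r:+.2f}")
```

Even in this reduced form, the forced member both correlates with the proxy and sits closer in quadratic distance, which is the pattern the two statistical tests are designed to detect formally.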
A general CFD framework for fault-resilient simulations based on multi-resolution information fusion
NASA Astrophysics Data System (ADS)
Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em
2017-10-01
We develop a general CFD framework for multi-resolution simulations to target multiscale problems but also resilience in exascale simulations, where faulty processors may lead to gappy, in space-time, simulated fields. We combine approximation theory and domain decomposition together with statistical learning techniques, e.g. coKriging, to estimate boundary conditions and minimize communications by performing independent parallel runs. To demonstrate this new simulation approach, we consider two benchmark problems. First, we solve the heat equation (a) on a small number of spatial "patches" distributed across the domain, simulated by finite differences at fine resolution and (b) on the entire domain simulated at very low resolution, thus fusing multi-resolution models to obtain the final answer. Second, we simulate the flow in a lid-driven cavity in an analogous fashion, by fusing finite difference solutions obtained with fine and low resolution assuming gappy data sets. We investigate the influence of various parameters for this framework, including the correlation kernel, the size of a buffer employed in estimating boundary conditions, the coarseness of the resolution of auxiliary data, and the communication frequency across different patches in fusing the information at different resolution levels. In addition to its robustness and resilience, the new framework can be employed to generalize previous multiscale approaches involving heterogeneous discretizations or even fundamentally different flow descriptions, e.g. in continuum-atomistic simulations.
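The fusion step can be approximated with an off-the-shelf Gaussian process by stacking coarse whole-domain data with fine patch data into one training set. This is a deliberate simplification: true coKriging would model the correlation between the two fidelities explicitly, and the 1-D "temperature profile" below is synthetic, so every quantity in the sketch is an assumption.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Truth: a 1-D steady profile; fine data exist only on small "patches".
x_true = np.linspace(0, 1, 200)[:, None]
f = lambda x: np.sin(2 * np.pi * x).ravel()

x_coarse = np.linspace(0, 1, 8)[:, None]      # whole domain, low resolution
x_fine = np.r_[np.linspace(.1, .2, 10),       # two fine-resolution patches
               np.linspace(.7, .8, 10)][:, None]

# Fuse both resolutions by stacking them into one training set; the
# coarse data carry a small bias, mimicking a low-resolution solver.
X = np.vstack([x_coarse, x_fine])
y = np.concatenate([f(x_coarse) + 0.05, f(x_fine)])
gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-3).fit(X, y)
mean, std = gp.predict(x_true, return_std=True)
print("max predictive std (gaps are most uncertain):", std.max())
```

The predictive standard deviation plays the role the paper assigns to the statistical layer: it flags where gappy or faulty regions leave the reconstructed field least trustworthy, guiding where boundary-condition estimates need a larger buffer.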
NASA Technical Reports Server (NTRS)
Venkatachari, Balaji Shankar; Streett, Craig L.; Chang, Chau-Lyan; Friedlander, David J.; Wang, Xiao-Yen; Chang, Sin-Chung
2016-01-01
Despite decades of development of unstructured mesh methods, high-fidelity time-accurate simulations are still predominantly carried out on structured, or unstructured hexahedral meshes by using high-order finite-difference, weighted essentially non-oscillatory (WENO), or hybrid schemes formed by their combinations. In this work, the space-time conservation element solution element (CESE) method is used to simulate several flow problems including supersonic jet/shock interaction and its impact on launch vehicle acoustics, and direct numerical simulations of turbulent flows using tetrahedral meshes. This paper provides a status report for the continuing development of the space-time conservation element solution element (CESE) numerical and software framework under the Revolutionary Computational Aerosciences (RCA) project. Solution accuracy and large-scale parallel performance of the numerical framework is assessed with the goal of providing a viable paradigm for future high-fidelity flow physics simulations.
LIPID11: A Modular Framework for Lipid Simulations using Amber
Skjevik, Åge A.; Madej, Benjamin D.; Walker, Ross C.; Teigen, Knut
2013-01-01
Accurate simulation of complex lipid bilayers has long been a goal in condensed phase molecular dynamics (MD). Structure and function of membrane-bound proteins are highly dependent on the lipid bilayer environment and are challenging to study through experimental methods. Within Amber, there has been limited focus on lipid simulations, although some success has been seen with the use of the General Amber Force Field (GAFF). However, to date there are no dedicated Amber lipid force fields. In this paper we describe a new charge derivation strategy for lipids consistent with the Amber RESP approach, and a new atom and residue naming and type convention. In the first instance, we have combined this approach with GAFF parameters. The result is LIPID11, a flexible, modular framework for the simulation of lipids that is fully compatible with the existing Amber force fields. The charge derivation procedure, capping strategy and nomenclature for LIPID11, along with preliminary simulation results and a discussion of the planned long-term parameter development are presented here. Our findings suggest that LIPID11 is a modular framework feasible for phospholipids and a flexible starting point for the development of a comprehensive, Amber-compatible lipid force field. PMID:22916730
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamid, A. H. A.; Universiti Malaysia Kelantan; Rozan, M. Z. A.
The evolution of the current Radiation and Nuclear Emergency Planning Framework (RANEPF) simulator emphasizes human factors, to be analyzed and interpreted according to the stakeholders' tacit and explicit knowledge. These human factor criteria are analyzed and interpreted according to sense-making theory and the Disaster Emergency Response Management Information System (DERMIS) design premises, and are corroborated by statistical criteria. In recent findings, there were no differences in distributions among the stakeholders according to gender and organizational expertise. The criteria were incrementally accepted and agreed with the research elements indicated in the respective emergency planning frameworks and simulator (i.e. 78.18 to 84.32, p-value <0.05). This paper suggests that these human factor criteria, together with the associated analyses and theoretical perspectives, be accommodated in future simulator development, in conjunction with the proposed hypothesis building of the process factors and responses diagram. We propose future work on additional simulator functionality offering strategized, condensed, concise and comprehensive public disaster preparedness and intervention guidelines, toward a useful and efficient computer simulation.
NASA Astrophysics Data System (ADS)
Sewell, Stephen
This thesis introduces a software framework that effectively utilizes low-cost commercially available Graphic Processing Units (GPUs) to simulate complex scientific plasma phenomena that are modeled using the Particle-In-Cell (PIC) paradigm. The software framework that was developed conforms to the Compute Unified Device Architecture (CUDA), a standard for general purpose graphic processing that was introduced by NVIDIA Corporation. This framework has been verified for correctness and applied to advance the state of understanding of the electromagnetic aspects of the development of the Aurora Borealis and Aurora Australis. For each phase of the PIC methodology, this research has identified one or more methods to exploit the problem's natural parallelism and effectively map it for execution on the graphic processing unit and its host processor. The sources of overhead that can reduce the effectiveness of parallelization for each of these methods have also been identified. One of the novel aspects of this research was the utilization of particle sorting during the grid interpolation phase. The final representation resulted in simulations that executed about 38 times faster than simulations that were run on a single-core general-purpose processing system. The scalability of this framework to larger problem sizes and future generation systems has also been investigated.
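The particle-sorting idea from the grid interpolation phase translates directly into a few lines of numpy: sort particles by cell index so the subsequent scatter to the grid touches memory contiguously. The sketch below is a CPU analogue under the simplest deposition scheme; on a GPU the same ordering is what lets thread blocks work on coalesced particle batches.

```python
import numpy as np

def deposit_sorted(positions, charges, ngrid):
    """Sort particles by cell before deposition so memory access during
    grid interpolation is contiguous. Nearest-grid-point deposition is
    used here for brevity."""
    cells = np.clip((positions * ngrid).astype(int), 0, ngrid - 1)
    order = np.argsort(cells, kind="stable")   # the sorting step
    cells, charges = cells[order], charges[order]
    rho = np.zeros(ngrid)
    np.add.at(rho, cells, charges)             # scatter to the grid
    return rho, order

rng = np.random.default_rng(2)
pos = rng.random(100_000)                      # particle positions in [0, 1)
rho, order = deposit_sorted(pos, np.ones_like(pos), ngrid=256)
print("total deposited charge:", rho.sum())
```

The returned permutation can also be applied to velocities and other per-particle arrays, so the whole particle store stays cell-ordered between steps and the sort amortizes across iterations.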
Development Issues for Lunar Regolith Simulants
NASA Technical Reports Server (NTRS)
Rickman, Doug; Carpenter, Paul; Sibille, Laurent; Owens, Charles; French, Raymond; McLemore, Carole
2006-01-01
Significant challenges and logistical issues exist for the development of standardized lunar regolith simulant (SLRS) materials for use in the development and testing of flight hardware for upcoming NASA lunar missions. A production program at Marshall Space Flight Center (MSFC) for the deployment of lunar mare basalt simulant JSC-1A is underway. Root simulants have been proposed for the development of a low-Ti mare basalt simulant and a high-Ca highland anorthosite simulant, as part of a framework of simulant development outlined in the 2005 Lunar Regolith Simulant Materials Workshop held at MSFC. Many of the recommendations for production and standardization of simulants have already been documented by the MSFC team. But there are a number of unanswered questions related to geology which need to be addressed prior to the creation of the simulants.
Using Simulation in a Psychiatric Mental Health Nurse Practitioner Doctoral Program.
Calohan, Jess; Pauli, Eric; Combs, Teresa; Creel, Andrea; Convoy, Sean; Owen, Regina
The use and effectiveness of simulation with standardized patients in undergraduate and graduate nursing education programs is well documented. Simulation has been primarily used to develop health assessment skills. Evidence indicates that using simulation and standardized patients in psychiatric-mental health nurse practitioner (PMHNP) programs is useful in developing psychosocial assessment skills. These interactions provide individualized and instantaneous clinical feedback to the student from faculty, peers, and standardized patients. Incorporating simulation into advanced practice psychiatric-mental health nursing curricula allows students to develop the requisite skills and principles needed to safely and effectively provide care to patients. There are no documented standardized processes for using simulation throughout a doctor of nursing practice PMHNP curriculum. The purpose of this article is to describe a framework for using simulation with standardized patients in a PMHNP curriculum. Students report high levels of satisfaction with the simulation experience and believe that they are more prepared for clinical rotations. Faculty feedback indicates that simulated clinical scenarios are a method to ensure that each student demonstrates a minimum standard of competency ahead of clinical rotations with live patients. Initial preceptor feedback indicates that students are more prepared for clinical practice and function more independently than students who did not experience this standardized clinical simulation framework. Published by Elsevier Inc.
NASA Technical Reports Server (NTRS)
Otto, John C.; Paraschivoiu, Marius; Yesilyurt, Serhat; Patera, Anthony T.
1995-01-01
Engineering design and optimization efforts using computational systems rapidly become resource intensive. The goal of the surrogate-based approach is to perform a complete optimization with limited resources. In this paper we present a Bayesian-validated approach that informs the designer as to how well the surrogate performs; in particular, our surrogate framework provides precise (albeit probabilistic) bounds on the errors incurred in the surrogate-for-simulation substitution. The theory and algorithms of our computer-simulation surrogate framework are first described. The utility of the framework is then demonstrated through two illustrative examples: maximization of the flowrate of fully developed flow in trapezoidal ducts; and design of an axisymmetric body that achieves a target Stokes drag.
NASA Astrophysics Data System (ADS)
van der Plas, Peter; Guerriero, Suzanne; Cristiano, Leorato; Rugina, Ana
2012-08-01
Modelling and simulation can support a number of use cases across the spacecraft development life-cycle. Given the increasing complexity of space missions, the observed general trend is for a more extensive usage of simulation already in the early phases. A major perceived advantage is that modelling and simulation can enable the validation of critical aspects of the spacecraft design before the actual development is started, as such reducing the risk in later phases.Failure Detection, Isolation, and Recovery (FDIR) is one of the areas with a high potential to benefit from early modelling and simulation. With the increasing level of required spacecraft autonomy, FDIR specifications can grow in such a way that the traditional document-based review process soon becomes inadequate.This paper shows that FDIR modelling and simulation in a system context can provide a powerful tool to support the FDIR verification process. It is highlighted that FDIR modelling at this early stage requires heterogeneous modelling tools and languages, in order to provide an adequate functional description of the different components (i.e. FDIR functions, environment, equipment, etc.) to be modelled.For this reason, an FDIR simulation framework is proposed in this paper. This framework is based on a number of tools already available in the Avionics Systems Laboratory at ESTEC, which are the Avionics Test Bench Functional Engineering Simulator (ATB FES), Matlab/Simulink, TASTE, and Real Time Developer Studio (RTDS).The paper then discusses the application of the proposed simulation framework to a real case study, i.e. the FDIR modelling of a satellite in support of an actual ESA mission. Challenges and benefits of the approach are described. Finally, lessons learned and the generality of the proposed approach are discussed.
NASA Astrophysics Data System (ADS)
Noh, S. J.; Tachikawa, Y.; Shiiba, M.; Yorozu, K.; Kim, S.
2012-04-01
Data assimilation methods have received increased attention as means of uncertainty assessment and of enhancing forecasting capability in various areas. Despite their potential, software frameworks applicable to probabilistic approaches and data assimilation are still limited, because most hydrologic modeling software is based on a deterministic approach. In this study, we developed a hydrological modeling framework for sequential data assimilation, called MPI-OHyMoS. MPI-OHyMoS allows users to develop their own element models and to easily build a total simulation system model for hydrological simulations. Unlike a process-based modeling framework, this software framework benefits from its object-oriented design to flexibly represent hydrological processes without any change to the main library. Sequential data assimilation based on particle filters is available for any hydrologic model built on MPI-OHyMoS, considering various sources of uncertainty originating from input forcing, parameters and observations. The particle filters are a Bayesian learning process in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles without any assumptions about the nature of the distributions. In MPI-OHyMoS, ensemble simulations are parallelized, which can take advantage of a high-performance computing (HPC) system. We applied this software framework to short-term streamflow forecasting of several catchments in Japan using a distributed hydrologic model. Uncertainty in model parameters and in remotely sensed rainfall data such as X-band or C-band radar is estimated and mitigated in the sequential data assimilation.
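A bootstrap particle filter of the kind such a framework parallelizes fits in a dozen lines for a scalar toy state. The propagation model, noise levels and observation operator below are invented placeholders, far simpler than a distributed hydrologic model, but the propagate/weight/resample cycle is the same one each ensemble member would execute.

```python
import numpy as np

def particle_filter(obs, n_particles=1000, q=0.5, r=1.0):
    """Bootstrap particle filter for a toy scalar state: the state
    propagates with noise q, observations carry noise r."""
    rng = np.random.default_rng(3)
    x = rng.normal(0, 1, n_particles)          # initial particle ensemble
    estimates = []
    for y in obs:
        x = 0.9 * x + rng.normal(0, q, n_particles)        # propagate
        w = np.exp(-0.5 * ((y - x) / r) ** 2)              # likelihood weights
        w /= w.sum()
        x = x[rng.choice(n_particles, n_particles, p=w)]   # resample
        estimates.append(x.mean())
    return np.array(estimates)

truth = np.sin(np.linspace(0, 6, 40))
obs = truth + np.random.default_rng(4).normal(0, 1.0, 40)
print("RMSE:", np.sqrt(np.mean((particle_filter(obs) - truth) ** 2)))
```

Because each particle's propagation is independent, the ensemble maps naturally onto MPI ranks; only the weighting and resampling steps require communication, which is what makes the parallelization worthwhile on an HPC system.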
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auld, Joshua; Hope, Michael; Ley, Hubert
This paper discusses the development of an agent-based modelling software development kit, and the implementation and validation of a model using it that integrates dynamic simulation of travel demand, network supply and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, an interprocess exchange engine, and a memory allocator, as well as a number of ancillary utilities: a visualization library, a database IO library, and a scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. This framework allows several aspects of a transportation system that are typically handled by separate stand-alone software applications to be modeled in a high-performance and extensible manner. Integrating models such as dynamic traffic assignment and disaggregate demand models has been a long-standing difficulty for transportation modelers, and the integrated approach shows a possible way to resolve it. The simulation model built from the POLARIS framework is a single, shared-memory process handling all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events such as accidents, congestion and weather events show the potential of the system.
NASA Technical Reports Server (NTRS)
Pisaich, Gregory; Flueckiger, Lorenzo; Neukom, Christian; Wagner, Mike; Buchanan, Eric; Plice, Laura
2007-01-01
The Mission Simulation Toolkit (MST) is a flexible software system for autonomy research. It was developed as part of the Mission Simulation Facility (MSF) project that was started in 2001 to facilitate the development of autonomous planetary robotic missions. Autonomy is a key enabling factor for robotic exploration. There has been a large gap between autonomy software (at the research level), and software that is ready for insertion into near-term space missions. The MST bridges this gap by providing a simulation framework and a suite of tools for supporting research and maturation of autonomy. MST uses a distributed framework based on the High Level Architecture (HLA) standard. A key feature of the MST framework is the ability to plug in new models to replace existing ones with the same services. This enables significant simulation flexibility, particularly the mixing and control of fidelity level. In addition, the MST provides automatic code generation from robot interfaces defined with the Unified Modeling Language (UML), methods for maintaining synchronization across distributed simulation systems, XML-based robot description, and an environment server. Finally, the MSF supports a number of third-party products including dynamic models and terrain databases. Although the communication objects and some of the simulation components that are provided with this toolkit are specifically designed for terrestrial surface rovers, the MST can be applied to any other domain, such as aerial, aquatic, or space.
A framework for porting the NeuroBayes machine learning algorithm to FPGAs
NASA Astrophysics Data System (ADS)
Baehr, S.; Sander, O.; Heck, M.; Feindt, M.; Becker, J.
2016-01-01
The NeuroBayes machine learning algorithm is deployed for online data reduction at the pixel detector of Belle II. In order to test, characterize and easily adapt its implementation on FPGAs, a framework was developed. Within the framework, an HDL model, written in Python using MyHDL, is used for fast exploration of possible configurations. Using input data from physics simulations, figures of merit such as throughput, accuracy and resource demand of the implementation are evaluated in a fast and flexible way. Functional validation is supported by unit tests and HDL simulation of chosen configurations.
Event-based simulation of networks with pulse delayed coupling
NASA Astrophysics Data System (ADS)
Klinshov, Vladimir; Nekorkin, Vladimir
2017-10-01
Pulse-mediated interactions are common in networks of different nature. Here we develop a general framework for simulation of networks with pulse delayed coupling. We introduce the discrete map governing the dynamics of such networks and describe the computation algorithm for its numerical simulation.
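A minimal event-based sketch of pulse delayed coupling, assuming unit phase growth rate, a firing threshold of 1 and excitatory pulses of size eps: firings are events, and each firing schedules delayed pulse deliveries to all other units. This illustrates the general simulation principle only; it is not the discrete map derived in the paper, and all parameter values are invented.

```python
import heapq

def simulate_pulse_network(n=3, delay=0.3, eps=0.05, t_end=15.0):
    """Event-based simulation of pulse delayed coupling: each unit's phase
    grows at unit rate and fires at 1; a firing sends pulses that arrive
    after `delay` and advance every other unit's phase by `eps`."""
    next_fire = [(n - i) / n for i in range(n)]    # staggered first firings
    pulses = []                                    # heap of (arrival, target)
    t, fired = 0.0, []
    while t < t_end:
        i = min(range(n), key=lambda k: next_fire[k])
        t_fire = next_fire[i]
        # Deliver any pulse that arrives before the next natural firing.
        if pulses and pulses[0][0] <= t_fire:
            t_arr, j = heapq.heappop(pulses)
            next_fire[j] = max(next_fire[j] - eps, t_arr)  # phase jump by eps
            t = t_arr
            continue
        t = t_fire
        fired.append((round(t, 3), i))
        next_fire[i] = t + 1.0                     # reset: next firing in 1
        for j in range(n):                         # broadcast delayed pulses
            if j != i:
                heapq.heappush(pulses, (t + delay, j))
    return fired

print(simulate_pulse_network()[:10])
```

Between events nothing needs to be computed, which is the core advantage of event-based simulation over fixed-step integration for pulse-coupled networks: cost scales with the number of spikes and pulse arrivals rather than with simulated time.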
The Water Quality Analysis Simulation Program (WASP) is a dynamic, spatially-resolved, differential mass balance fate and transport modeling framework. WASP is used to develop models to simulate concentrations of environmental contaminants in surface waters and sediments. As a mo...
NASA Astrophysics Data System (ADS)
De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael
2017-04-01
Reactive transport simulations, in which geochemical reactions are coupled with hydrodynamic transport of reactants, are extremely time consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such requirements may seem inappropriate and probably constitute the main limitation for their wide application. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models trained on a set of pre-calculated "full-physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a precision loss; however, this appears justified in the presence of large uncertainties regarding the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and dealing with surrogate models is R, which profits from state-of-the-art methods for statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add such capability to any multiphase finite volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates and "full-physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453. [2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M.: "Flexible Simulation Framework to Couple Processes in Complex 3D Models for Subsurface Utilization Assessment.", Energy Procedia, 97, 2016, p. 494-501.
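The surrogate idea reduces to: pre-compute an ensemble of full-physics results, fit a fast regressor, then call the regressor inside the coupled transport loop. The authors work in R; the Python sketch below uses an invented stand-in for the geochemical solver, so every function and parameter in it is hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Pretend "full physics": equilibrium concentration as a slow function of
# temperature and initial concentration (stands in for a geochemical solver).
def full_physics(T, c0):
    return c0 * np.exp(-2.0 / (T + 0.1)) + 0.01 * T

rng = np.random.default_rng(5)
T, c0 = rng.uniform(0, 1, 5000), rng.uniform(0, 1, 5000)
X, y = np.c_[T, c0], full_physics(T, c0)       # pre-calculated ensemble

surrogate = RandomForestRegressor(n_estimators=50).fit(X[:4000], y[:4000])
err = np.abs(surrogate.predict(X[4000:]) - y[4000:])
print(f"surrogate mean abs error: {err.mean():.4f}")
# In a coupled run, the transport step would now call surrogate.predict()
# per grid cell instead of the expensive geochemical solver.
```

The held-out error estimate is the crucial diagnostic: it quantifies the precision loss that model reduction trades against speed, which is exactly the trade-off the contribution argues is acceptable under large parametrization uncertainty.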
The Trick Simulation Toolkit: A NASA/Open source Framework for Running Time Based Physics Models
NASA Technical Reports Server (NTRS)
Penn, John M.; Lin, Alexander S.
2016-01-01
This paper describes the design and use of the Trick Simulation Toolkit, a simulation development environment for creating high-fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes Trick's design goals and how the development environment attempts to achieve those goals. It describes how Trick is used in some of the many training and engineering simulations at NASA. Finally, it describes the Trick NASA/Open source project on GitHub.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bragg-Sitton, Shannon Michelle; Rabiti, Cristian; Kinoshita, Robert Arthur
An effort to design and build a modeling and simulation framework to assess the economic viability of Nuclear Hybrid Energy Systems (NHES) was undertaken in fiscal year 2015 (FY15). The purpose of this report is to document the various tasks associated with the development of such a framework and to provide a status on its progress. Several tasks have been accomplished. First, starting from a simulation strategy, a rigorous mathematical formulation has been achieved in which the economic optimization of a Nuclear Hybrid Energy System is presented as a constrained robust (under uncertainty) optimization problem. Some possible algorithms for the solution of the optimization problem are presented. A variation of the Simultaneous Perturbation Stochastic Approximation algorithm has been implemented in RAVEN and preliminary tests have been performed. The development of the software infrastructure to support the simulation of the whole NHES has also moved forward. The coupling between RAVEN and an implementation of the Modelica language (OpenModelica) has been implemented, migrated under several operating systems and tested using an adapted model of a desalination plant. In particular, this exercise was focused on testing the coupling of the different code systems; testing parallel, computationally expensive simulations on the INL cluster; and providing a proof of concept for the possibility of using surrogate models to represent the different NHES subsystems. Another important step was the porting of the RAVEN code under the Windows™ operating system. This accomplishment makes RAVEN compatible with the development environment that is being used for dynamic simulation of NHES components. A very simplified model of an NHES on the electric market has been built in RAVEN to confirm expectations on the analysis capability of RAVEN to provide insight into system economics and to test the capability of RAVEN to identify limit surfaces even for stochastic constraints. This capability will be needed in the future to enforce the stochastic constraints on the electric demand coverage from the NHES. The development team gained experience with many of the tools that are currently envisioned for use in the economic analysis of NHES and completed several important steps. Given the complexity of the project, preference has been given to a structural approach in which several independent efforts have been used to build the cornerstone of the simulation framework. While this is a good approach in establishing such a complex framework, it may delay reaching more complete results on the performance of analyzed system configurations. The integration of the previously reported exergy analysis approach was initially proposed as part of this milestone. However, in reality, the exergy-based apportioning of cost will take place only in a second stage of the implementation, since it will be used to properly allocate cost among the different NHES subsystems. Therefore, exergy does not appear at the level of the main drivers in the analysis framework; rather, the development of the base framework is the focus of this report.
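For reference, the SPSA algorithm mentioned above estimates a gradient from only two objective evaluations per iteration regardless of dimension, which is why it suits problems where each evaluation is a full system simulation. A generic sketch with the standard gain schedules follows; the quadratic objective is a toy stand-in for an NHES economic figure of merit, not RAVEN's implementation.

```python
import numpy as np

def spsa(loss, x0, a=0.1, c=0.1, n_iter=200, seed=6):
    """Simultaneous Perturbation Stochastic Approximation: estimates the
    gradient from just two loss evaluations per iteration, whatever the
    dimension of x."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        ak, ck = a / k**0.602, c / k**0.101          # standard gain decay
        delta = rng.choice([-1.0, 1.0], size=x.size) # Rademacher perturbation
        ghat = (loss(x + ck * delta) - loss(x - ck * delta)) / (2 * ck) / delta
        x -= ak * ghat                               # gradient descent step
    return x

# Toy objective standing in for the NHES economic figure of merit.
loss = lambda v: (v[0] - 3.0) ** 2 + 2 * (v[1] + 1.0) ** 2
print("optimum estimate:", spsa(loss, [0.0, 0.0]))
```

Because the two perturbed evaluations are independent, they can also be run in parallel, halving the wall-clock cost per iteration when each evaluation is itself an expensive simulation.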
Towards an interactive electromechanical model of the heart
Talbot, Hugo; Marchesseau, Stéphanie; Duriez, Christian; Sermesant, Maxime; Cotin, Stéphane; Delingette, Hervé
2013-01-01
In this work, we develop an interactive framework for rehearsal of and training in cardiac catheter ablation, and for planning cardiac resynchronization therapy. To this end, an interactive and real-time electrophysiology model of the heart is developed to fit patient-specific data. The proposed interactive framework relies on two main contributions. First, an efficient implementation of cardiac electrophysiology is proposed, using the latest graphics processing unit computing techniques. Second, a mechanical simulation is then coupled to the electrophysiological signals to produce realistic motion of the heart. We demonstrate that pathological mechanical and electrophysiological behaviour can be simulated. PMID:24427533
Burton, Brett M; Aras, Kedar K; Good, Wilson W; Tate, Jess D; Zenger, Brian; MacLeod, Rob S
2018-05-21
The biophysical basis for electrocardiographic evaluation of myocardial ischemia stems from the notion that ischemic tissues develop, with relative uniformity, along the endocardial aspects of the heart. These injured regions of subendocardial tissue give rise to intramural currents that lead to ST segment deflections within electrocardiogram (ECG) recordings. The concept of subendocardial ischemic regions is often used in clinical practice, providing a simple and intuitive description of ischemic injury; however, such a model grossly oversimplifies the presentation of ischemic disease, inadvertently leading to errors in ECG-based diagnoses. Furthermore, recent experimental studies have brought into question the subendocardial ischemia paradigm, suggesting instead a more distributed pattern of tissue injury. These findings come from experiments and so have both the impact and the limitations of measurements from living organisms. Computer models have often been employed to overcome the constraints of experimental approaches and have a robust history in cardiac simulation. To this end, we have developed a computational simulation framework aimed at elucidating the effects of ischemia on measurable cardiac potentials. To validate our framework, we simulated, visualized, and analyzed 226 experimentally derived acute myocardial ischemic events. Simulation outcomes agreed both qualitatively (feature comparison) and quantitatively (correlation, average error, and significance) with experimentally obtained epicardial measurements, particularly under conditions of elevated ischemic stress. Our simulation framework introduces a novel approach to incorporating subject-specific, geometric models and experimental results that are highly resolved in space and time into computational models. We propose this framework as a means to advance the understanding of the underlying mechanisms of ischemic disease while simultaneously putting in place the computational infrastructure necessary to study and improve ischemia models aimed at reducing diagnostic errors in the clinic.
An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation
NASA Technical Reports Server (NTRS)
Reid, Michael R.; Powers, Edward I. (Technical Monitor)
2000-01-01
The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.
A Unified Framework for Analyzing and Designing for Stationary Arterial Networks
DOT National Transportation Integrated Search
2017-05-17
This research aims to develop a unified theoretical and simulation framework for analyzing and designing signals for stationary arterial networks. Existing traffic flow models used in design and analysis of signal control strategies are either too si...
Building energy simulation in real time through an open standard interface
Pang, Xiufeng; Nouidui, Thierry S.; Wetter, Michael; ...
2015-10-20
Building energy models (BEMs) are typically used for design and code compliance for new buildings and in the renovation of existing buildings to predict energy use. The increasing adoption of BEM as standard practice in the building industry presents an opportunity to extend the use of BEMs into construction, commissioning and operation. In 2009, the authors developed a real-time simulation framework to execute an EnergyPlus model in real time to improve building operation. This paper reports an enhancement of that real-time energy simulation framework. The previous version only works with software tools that implement the custom co-simulation interface of the Building Controls Virtual Test Bed (BCVTB), such as EnergyPlus, Dymola and TRNSYS. The new version uses an open standard interface, the Functional Mockup Interface (FMI), to provide a generic interface to any application that supports the FMI protocol. In addition, the new version utilizes the Simple Measurement and Actuation Profile (sMAP) tool as the data acquisition system to acquire, store and present data. Lastly, this paper introduces the updated architecture of the real-time simulation framework using FMI and presents proof-of-concept demonstration results which validate the new framework.
Simulation of investment returns of toll projects.
DOT National Transportation Integrated Search
2013-08-01
This research develops a methodological framework to illustrate key stages in applying the simulation of investment returns of toll projects, acting as an example process of helping agencies conduct numerical risk analysis by taking certain uncertain...
High Performance Structures and Materials
Development of Simulation Model Validation Framework for RBDO (sponsored by U.S. Army TARDEC): advanced simulation and optimization methods that can be used during the early design stages of innovative ...
The role of simulation in mixed-methods research: a framework & application to patient safety.
Guise, Jeanne-Marie; Hansen, Matthew; Lambert, William; O'Brien, Kerth
2017-05-04
Research in patient safety is an important area of health services research and is a national priority. It is challenging to investigate rare occurrences, explore potential causes, and account for the complex, dynamic context of healthcare, yet all are required in patient safety research. Simulation technologies have become widely accepted as education and clinical tools, but have yet to become a standard tool for research. We developed a framework for research that integrates accepted patient safety models with mixed-methods research approaches and describe the performance of the framework in a working example of a large National Institutes of Health (NIH)-funded R01 investigation. This worked example of the framework in action identifies the strengths and limitations of qualitative and quantitative research approaches commonly used in health services research. Each approach builds essential layers of knowledge. We describe how the use of simulation ties these layers of knowledge together and adds new and unique dimensions of knowledge. A mixed-methods research approach that includes simulation provides a broad multi-dimensional approach to health services and patient safety research.
Framework for modeling urban restoration resilience time in the aftermath of an extreme event
Ramachandran, Varun; Long, Suzanna K.; Shoberg, Thomas G.; Corns, Steven; Carlo, Héctor
2015-01-01
The impacts of extreme events continue long after the emergency response has terminated. Effective reconstruction of supply-chain strategic infrastructure (SCSI) elements is essential for postevent recovery and the reconnectivity of a region with the outside. This study uses an interdisciplinary approach to develop a comprehensive framework to model resilience time. The framework is tested by comparing resilience time results for a simulated EF-5 tornado with ground truth data from the tornado that devastated Joplin, Missouri, on May 22, 2011. Data for the simulated tornado were derived for Overland Park, Johnson County, Kansas, in the greater Kansas City, Missouri, area. Given the simulated tornado, a combinatorial graph considering the damages in terms of interconnectivity between different SCSI elements is derived. Reconstruction in the aftermath of the simulated tornado is optimized using the proposed framework to promote a rapid recovery of the SCSI. This research shows promising results when compared with the independent quantifiable data obtained from Joplin, Missouri, returning a resilience time of 22 days compared with 25 days reported by city and state officials.
Representing Water Scarcity in Future Agricultural Assessments
NASA Technical Reports Server (NTRS)
Winter, Jonathan M.; Lopez, Jose R.; Ruane, Alexander C.; Young, Charles A.; Scanlon, Bridget R.; Rosenzweig, Cynthia
2017-01-01
Globally, irrigated agriculture is both essential for food production and the largest user of water. A major challenge for hydrologic and agricultural research communities is assessing the sustainability of irrigated croplands under climate variability and change. Simulations of irrigated croplands generally lack key interactions between water supply, water distribution, and agricultural water demand. In this article, we explore the critical interface between water resources and agriculture by motivating, developing, and illustrating the application of an integrated modeling framework to advance simulations of irrigated croplands. We motivate the framework by examining historical dynamics of irrigation water withdrawals in the United States and quantitatively reviewing previous modeling studies of irrigated croplands with a focus on representations of water supply, agricultural water demand, and impacts on crop yields when water demand exceeds water supply. We then describe the integrated modeling framework for simulating irrigated croplands, which links trends and scenarios with water supply, water allocation, and agricultural water demand. Finally, we provide examples of efforts that leverage the framework to improve simulations of irrigated croplands as well as identify opportunities for interventions that increase agricultural productivity, resiliency, and sustainability.
GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments.
Monroy, Javier; Hernandez-Bennets, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier
2017-06-23
This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment.
GADEN: A 3D Gas Dispersion Simulator for Mobile Robot Olfaction in Realistic Environments
Hernandez-Bennetts, Victor; Fan, Han; Lilienthal, Achim; Gonzalez-Jimenez, Javier
2017-01-01
This work presents a simulation framework developed under the widely used Robot Operating System (ROS) to enable the validation of robotics systems and gas sensing algorithms under realistic environments. The framework is rooted in the principles of computational fluid dynamics and filament dispersion theory, modeling wind flow and gas dispersion in 3D real-world scenarios (i.e., accounting for walls, furniture, etc.). Moreover, it integrates the simulation of different environmental sensors, such as metal oxide gas sensors, photo ionization detectors, or anemometers. We illustrate the potential and applicability of the proposed tool by presenting a simulation case in a complex and realistic office-like environment where gas leaks of different chemicals occur simultaneously. Furthermore, we accomplish quantitative and qualitative validation by comparing our simulated results against real-world data recorded inside a wind tunnel where methane was released under different wind flow profiles. Based on these results, we conclude that our simulation framework can provide a good approximation to real world measurements when advective airflows are present in the environment. PMID:28644375
ERIC Educational Resources Information Center
Monaghan, James M.; Clement, John
1999-01-01
Presents evidence for students' qualitative and quantitative difficulties with apparently simple one-dimensional relative-motion problems, students' spontaneous visualization of relative-motion problems, the visualizations facilitating solution of these problems, and students' memories of the online computer simulation used as a framework for…
2014-04-30
... cost to acquire systems, as design maturity could be verified incrementally as the system was developed vice waiting for specific large “big bang” ... Framework (MBAF) be applied to simulate or optimize process variations on programs? LSI Roles and Responsibilities: A review of the roles and ... the model/process optimization process. It is the current intent that NAVAIR will use the model to run simulations on process changes in an attempt to ...
Kilinc, Deniz; Demir, Alper
2017-08-01
The brain is extremely energy efficient and remarkably robust in what it does despite the considerable variability and noise caused by the stochastic mechanisms in neurons and synapses. Computational modeling is a powerful tool that can help us gain insight into this important aspect of brain function. A deep understanding, together with computational design tools, can help develop robust neuromorphic electronic circuits and hybrid neuroelectronic systems. In this paper, we present a general modeling framework for biological neuronal circuits that systematically captures the nonstationary stochastic behavior of ion channels and synaptic processes. In this framework, fine-grained, discrete-state, continuous-time Markov chain models of both ion channels and synaptic processes are treated in a unified manner. Our modeling framework features a mechanism for the automatic generation of the corresponding coarse-grained, continuous-state, continuous-time stochastic differential equation models for neuronal variability and noise. Furthermore, we repurpose non-Monte Carlo noise analysis techniques, which were previously developed for analog electronic circuits, for the stochastic characterization of neuronal circuits in both the time and frequency domains. We verify that the fast non-Monte Carlo analysis methods produce results with the same accuracy as computationally expensive Monte Carlo simulations. We have implemented the proposed techniques in a prototype simulator, where both biological neuronal and analog electronic circuits can be simulated together in a coupled manner.
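As a rough sketch of the kind of coarse-graining the authors describe (a generic diffusion approximation, not their exact derivation): a discrete-state Markov chain with state-change vectors $\nu_j$ and propensities $a_j$ admits a chemical-Langevin-type stochastic differential equation

$$ dX_t \;=\; \sum_j \nu_j\, a_j(X_t)\, dt \;+\; \sum_j \nu_j \sqrt{a_j(X_t)}\; dW_t^{(j)}, $$

where the $W^{(j)}$ are independent Wiener processes; the drift term captures the mean channel and synapse kinetics, while the noise terms capture their stochastic fluctuations.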
2013 strategic petroleum reserve big hill well integrity grading report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lord, David L.; Roberts, Barry L.; Lord, Anna C. Snider
2014-02-01
This report summarizes the work performed in developing a framework for the prioritization of cavern access wells for remediation and monitoring at the Big Hill Strategic Petroleum Reserve site. This framework was then applied to all 28 wells at the Big Hill site with each well receiving a grade for remediation and monitoring. Numerous factors affecting well integrity were incorporated into the grading framework including casing survey results, cavern pressure history, results from geomechanical simulations, and site geologic factors. The framework was developed in a way as to be applicable to all four of the Strategic Petroleum Reserve sites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chatzidakis, Stylianos; Greulich, Christopher
A cosmic ray Muon Flexible Framework for Spectral GENeration for Monte Carlo Applications (MUFFSgenMC) has been developed to support state-of-the-art cosmic ray muon tomographic applications. The flexible framework allows for easy and fast creation of source terms for popular Monte Carlo applications like GEANT4 and MCNP. This code framework simplifies the process of simulations used for cosmic ray muon tomography.
NASA Astrophysics Data System (ADS)
Jin, D.; Hoagland, P.; Dalton, T. M.; Thunberg, E. M.
2012-09-01
We present an integrated economic-ecological framework designed to help assess the implementation of ecosystem-based fisheries management (EBFM) in New England. We develop the framework by linking a computable general equilibrium (CGE) model of a coastal economy to an end-to-end (E2E) model of a marine food web for Georges Bank. We focus on the New England region using coastal county economic data for a restricted set of industry sectors and marine ecological data for three top level trophic feeding guilds: planktivores, benthivores, and piscivores. We undertake numerical simulations to model the welfare effects of changes in alternative combinations of yields from feeding guilds and alternative manifestations of biological productivity. We estimate the economic and distributional effects of these alternative simulations across a range of consumer income levels. This framework could be used to extend existing methodologies for assessing the impacts on human communities of groundfish stock rebuilding strategies, such as those expected through the implementation of the sector management program in the US northeast fishery. We discuss other possible applications of and modifications and limitations to the framework.
Smoothed Particle Hydrodynamic Simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-10-05
This code is a highly modular framework for developing smoothed particle hydrodynamic (SPH) simulations running on parallel platforms. The compartmentalization of the code allows for rapid development of new SPH applications and modifications of existing algorithms. The compartmentalization also allows changes in one part of the code used by many applications to instantly be made available to all applications.
ERIC Educational Resources Information Center
Chung, Gregory K. W. K.; Nagashima, Sam O.; Espinosa, Paul D.; Berka, Chris; Baker, Eva L.
2009-01-01
In this report, researchers examined rifle marksmanship development within a skill development framework outlined by Chung, Delacruz, de Vries, Bewley, and Baker (2006). Thirty-three novice shooters used an M4 rifle training simulator system to learn to shoot an 8-inch target at a simulated distance of 200 yards. Cognitive, psychomotor, and…
ERIC Educational Resources Information Center
Peacock, Christopher
2012-01-01
The purpose of this research effort was to develop a model that provides repeatable Location Management (LM) testing using a network simulation tool, QualNet version 5.1 (2011). The model will provide current and future protocol developers a framework to simulate stable protocol environments for development. This study used the Design Science…
NASA Technical Reports Server (NTRS)
Connolly, Joseph W.; Kopasakis, George; Lemon, Kimberly A.
2010-01-01
A turbofan simulation has been developed for use in aero-propulso-servo-elastic coupling studies on supersonic vehicles. A one-dimensional lumped volume approach is used whereby each component (fan, high-pressure compressor, combustor, etc.) is represented as a single volume using characteristic performance maps and conservation equations for continuity, momentum and energy. The simulation is developed in the MATLAB/SIMULINK (The MathWorks, Inc.) environment in order to facilitate controls development and ease of integration with a future aero-servo-elastic vehicle model being developed at NASA Langley. The complete simulation demonstrated steady-state results that closely match a proposed engine suitable for a supersonic business jet at the cruise condition. Preliminary investigation of the transient simulation revealed expected trends for fuel flow disturbances as well as upstream pressure disturbances. A framework for system identification enables development of linear models for controller design. Utilizing this framework, a transfer function modeling the impact of an upstream pressure disturbance on engine speed is developed as an illustrative case of the system identification. This work will eventually enable an overall vehicle aero-propulso-servo-elastic model.
A Metascalable Computing Framework for Large Spatiotemporal-Scale Atomistic Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nomura, K; Seymour, R; Wang, W
2009-02-17
A metascalable (or 'design once, scale on new architectures') parallel computing framework has been developed for large spatiotemporal-scale atomistic simulations of materials based on spatiotemporal data locality principles, which is expected to scale on emerging multipetaflops architectures. The framework consists of: (1) an embedded divide-and-conquer (EDC) algorithmic framework based on spatial locality to design linear-scaling algorithms for high-complexity problems; (2) a space-time-ensemble parallel (STEP) approach based on temporal locality to predict long-time dynamics, while introducing multiple parallelization axes; and (3) a tunable hierarchical cellular decomposition (HCD) parallelization framework to map these O(N) algorithms onto a multicore cluster based on a hybrid implementation combining message passing and critical section-free multithreading. The EDC-STEP-HCD framework exposes maximal concurrency and data locality, thereby achieving: (1) inter-node parallel efficiency well over 0.95 for 218 billion-atom molecular-dynamics and 1.68 trillion electronic-degrees-of-freedom quantum-mechanical simulations on 212,992 IBM BlueGene/L processors (superscalability); (2) high intra-node, multithreading parallel efficiency (nanoscalability); and (3) nearly perfect time/ensemble parallel efficiency (eon-scalability). The spatiotemporal scale covered by MD simulation on a sustained petaflops computer per day (i.e., petaflops·day of computing) is estimated as NT = 2.14 (e.g., N = 2.14 million atoms for T = 1 microsecond).
David C. Calkin; Mark A. Finney; Alan A. Ager; Matthew P. Thompson; Krista M. Gebert
2011-01-01
In this paper we review progress towards the implementation of a risk management framework for US federal wildland fire policy and operations. We first describe new developments in wildfire simulation technology that catalyzed the development of risk-based decision support systems for strategic wildfire management. These systems include new analytical methods to measure...
NASA Technical Reports Server (NTRS)
Penn, John M.
2013-01-01
This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high-fidelity training and engineering simulations at the NASA/Johnson Space Center and many other NASA facilities. It describes what was learned and the significant benefits seen, such as fast, thorough, and clear test feedback every time code is checked in to the code repository. It also describes a system that encourages development of code that is much more flexible, maintainable, and reliable. The Trick Simulation Toolkit development environment provides a common architecture for user-defined simulations. Trick builds executable simulations using user-supplied simulation-definition files (S_define) and user-supplied "model code". For each Trick-based simulation, Trick automatically provides job scheduling, checkpoint/restore, data recording, interactive variable manipulation (variable server), and an input processor. Also included are tools for plotting recorded data and various other supporting tools and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX. Prior to adopting this new development approach, Trick testing consisted primarily of running a few large simulations, with the hope that their complexity and scale would exercise most of Trick's code and expose any recently introduced bugs. Unsurprisingly, this approach yielded inconsistent results. It was obvious that a more systematic, thorough approach was required. After seeing examples of some Java-based projects that used the JUnit test framework, similar test frameworks for C and C++ were sought. Several were found, all clearly inspired by JUnit. Googletest, a freely available open-source testing framework, was selected as the most appropriate and capable. The new approach was implemented while rewriting the Trick memory management component, to eliminate a fundamental design flaw. The benefits became obvious almost immediately, not just in the correctness of the individual functions and classes but also in the correctness and flexibility being added to the overall design. Creating code to be testable, and testing as it was created, resulted not only in better working code, but also in better-organized, flexible, and readable (i.e., articulate) code. This was, in essence, the test-driven development (TDD) methodology created by Kent Beck. Seeing the benefits of Test Driven Development, other Trick components were refactored to make them more testable, and tests were designed and implemented for them.
NASA Astrophysics Data System (ADS)
Vienhage, Paul; Barcomb, Heather; Marshall, Karel; Black, William A.; Coons, Amanda; Tran, Hien T.; Nguyen, Tien M.; Guillen, Andy T.; Yoh, James; Kizer, Justin; Rogers, Blake A.
2017-05-01
The paper describes the MATLAB (MathWorks) programs that were developed during the REU workshop to implement The Aerospace Corporation's Unified Game-based Acquisition Framework and Advanced Game-based Mathematical Framework (UGAF-AGMF) and its associated War-Gaming Engine (WGE) models. Each game can be played from the perspective of the Department of Defense Acquisition Authority (DAA) or of an individual contractor (KTR). The programs also implement Aerospace's optimum "Program and Technical Baseline (PTB) and associated acquisition" strategy that combines low Total Ownership Cost (TOC) with innovative designs while still meeting warfighter needs. The paper also describes the Bayesian Acquisition War-Gaming approach using Monte Carlo simulations, a numerical analysis technique to account for uncertainty in decision making, which simulates the PTB development and acquisition processes, and details the procedure of the implementation and the interactions between the games.
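To illustrate the Monte Carlo idea underlying the war-gaming engine, here is a minimal Python sketch; the cost model, distributions, and parameters are hypothetical, not taken from the Aerospace programs.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of Monte Carlo trials

# Hypothetical uncertain inputs to a total ownership cost (TOC) estimate.
unit_cost = rng.lognormal(mean=np.log(50.0), sigma=0.2, size=N)  # $M per unit
quantity  = rng.integers(80, 121, size=N)                        # units procured
support   = rng.normal(loc=0.4, scale=0.05, size=N)              # sustainment factor

toc = unit_cost * quantity * (1.0 + support)
print(f"mean TOC: {toc.mean():,.0f} $M")
print(f"90th-percentile TOC: {np.percentile(toc, 90):,.0f} $M")
```

Decision makers can then compare acquisition strategies by the full distribution of outcomes rather than a single point estimate.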
Some theoretical issues on computer simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, C.L.; Reidys, C.M.
1998-02-01
The subject of this paper is the development of mathematical foundations for a theory of simulation. Sequentially updated cellular automata (sCA) over arbitrary graphs are employed as a paradigmatic framework. In the development of the theory, the authors focus on the properties of causal dependencies among local mappings in a simulation. The main object of study is the mapping between a graph representing the dependencies among entities of a simulation and a graph representing the equivalence classes of systems obtained by all possible updates.
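A toy Python sketch of a sequentially updated CA on a graph illustrates the central point that the update order, a causal dependency, changes the resulting system; the 4-cycle adjacency, NOR rule, and orders below are illustrative choices, not the paper's examples.

```python
def sca_sweep(adj, state, order, rule):
    """One sweep of a sequentially updated cellular automaton (sCA).

    Vertices are updated one at a time in the given order, so later
    updates already see the new values of earlier ones (unlike a
    synchronously updated CA).
    """
    s = dict(state)
    for v in order:
        s[v] = rule([s[v]] + [s[u] for u in adj[v]])
    return s

adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # a 4-cycle
nor = lambda vals: int(not any(vals))                # local NOR rule
init = {v: 0 for v in adj}
print(sca_sweep(adj, init, order=[0, 1, 2, 3], rule=nor))  # {0: 1, 1: 0, 2: 1, 3: 0}
print(sca_sweep(adj, init, order=[3, 2, 1, 0], rule=nor))  # {0: 0, 1: 1, 2: 0, 3: 1}
```

The two orders produce different global states from the same initial condition, which is exactly the kind of dependence the theory classifies.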
Fortran interface layer of the framework for developing particle simulator FDPS
NASA Astrophysics Data System (ADS)
Namekata, Daisuke; Iwasawa, Masaki; Nitadori, Keigo; Tanikawa, Ataru; Muranushi, Takayuki; Wang, Long; Hosono, Natsuki; Nomura, Kentaro; Makino, Junichiro
2018-06-01
Numerical simulations based on particle methods have been widely used in various fields including astrophysics. To date, various versions of simulation software have been developed by individual researchers or research groups in each field, through a huge amount of time and effort, even though the numerical algorithms used are very similar. To improve the situation, we have developed a framework, called FDPS (Framework for Developing Particle Simulators), which enables researchers to develop massively parallel particle simulation codes for arbitrary particle methods easily. Until version 3.0, FDPS provided an API (application programming interface) for the C++ programming language only. This limitation comes from the fact that FDPS is developed using the template feature in C++, which is essential to support arbitrary particle data types. However, many researchers use Fortran to develop their codes, so the previous versions of FDPS required them to invest much time in learning C++. This is inefficient. To cope with this problem, we developed a Fortran interface layer in FDPS, which provides an API for Fortran. In order to support arbitrary particle data types in Fortran, we designed the Fortran interface layer as follows. Based on a given Fortran derived data type representing a particle, a Python script provided by us automatically generates a library that manipulates the C++ core part of FDPS. This library is seen from the Fortran side as a Fortran module providing the FDPS API, and it uses C programs internally to interoperate Fortran with C++. In this way, we have overcome several technical issues in emulating a `template' in Fortran. Using the Fortran interface, users can develop all parts of their codes in Fortran. We show that the overhead of the Fortran interface layer is sufficiently small and that a code written in Fortran achieves performance practically identical to one written in C++.
Evaluation of a performance appraisal framework for radiation therapists in planning and simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, Jillian, E-mail: jillian.becker@health.qld.gov.au; Bridge, Pete; Brown, Elizabeth
2015-06-15
Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time, was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.
Faculty Descriptions of Simulation Debriefing in Traditional Baccalaureate Nursing Programs.
Waznonis, Annette R
A study was conducted to describe simulation debriefing practices of faculty in accredited, traditional, baccalaureate nursing programs in the United States. Best debriefing practices include debriefing by a competent facilitator in a safe environment using a structured framework. Yet, structured frameworks and evaluation of debriefing are lacking in nursing education. This article reports the interview findings from the qualitative component of a large-scale mixed-methods study. Twenty-three full-time faculty members with an average of 6 years of simulation debriefing experience participated in interviews. Three themes emerged with subthemes: a) having the student's best interest at heart, b) getting over the emotional hurdle, and c) intentional debriefing evolves into learning. Gaps were found in faculty development, use of a structured framework, and evaluation. Research is warranted on use of video, postdebriefing assignments, cofacilitation, and debriefing effectiveness.
We have developed a modeling framework to support grid-based simulation of ecosystems at multiple spatial scales, the Ecological Component Library for Parallel Spatial Simulation (ECLPSS). ECLPSS helps ecologists to build robust spatially explicit simulations of ...
Self-organizing network services with evolutionary adaptation.
Nakano, Tadashi; Suda, Tatsuya
2005-09-01
This paper proposes a novel framework for developing adaptive and scalable network services. In the proposed framework, a network service is implemented as a group of autonomous agents that interact in the network environment. Agents in the proposed framework are autonomous and capable of simple behaviors (e.g., replication, migration, and death). In this paper, an evolutionary adaptation mechanism is designed using genetic algorithms (GAs) for agents to evolve their behaviors and improve their fitness values (e.g., response time to a service request) to the environment. The proposed framework is evaluated through simulations, and the simulation results demonstrate the ability of autonomous agents to adapt to the network environment. The proposed framework may be suitable for disseminating network services in dynamic and large-scale networks where a large number of data and services need to be replicated, moved, and deleted in a decentralized manner.
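A toy Python sketch of the evolutionary mechanism described: each agent's genome encodes behavior rates (here replication and migration), and a GA evolves them against a fitness proxy. The genome layout, fitness function, and operators are illustrative stand-ins, not the authors' design.

```python
import random

random.seed(1)

def fitness(genome):
    """Hypothetical fitness: reward fast service response, penalize resource use."""
    replication, migration = genome
    response_time = 1.0 / (0.1 + replication)      # more replicas answer sooner
    resource_cost = replication + 0.5 * migration
    return -(response_time + 0.2 * resource_cost)  # higher (less negative) is better

def evolve(pop, generations=50, sigma=0.1):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: len(pop) // 2]           # truncation selection
        children = []
        while len(survivors) + len(children) < len(pop):
            a, b = random.sample(survivors, 2)     # crossover by averaging,
            child = [max(0.0, (x + y) / 2 + random.gauss(0, sigma))  # plus mutation
                     for x, y in zip(a, b)]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

pop = [[random.random(), random.random()] for _ in range(20)]
print(evolve(pop))  # evolved (replication, migration) behavior rates
```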
A modelling framework to simulate foliar fungal epidemics using functional–structural plant models
Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe
2014-01-01
Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. Conclusions This study provides a framework for modelling a large number of pathosystems using FSPMs. This structure can accommodate both previously developed models for individual aspects of pathosystems and new ones. Complex models are deconstructed into separate ‘knowledge sources’ originating from different specialist areas of expertise and these can be shared and reassembled into multidisciplinary models. The framework thus provides a beneficial tool for a potential diverse and dynamic research community. PMID:24925323
Developing Cognitive Models for Social Simulation from Survey Data
NASA Astrophysics Data System (ADS)
Alt, Jonathan K.; Lieberman, Stephen
The representation of human behavior and cognition continues to challenge the modeling and simulation community. The use of survey and polling instruments to inform belief states, issue stances and action choice models provides a compelling means of developing models and simulations with empirical data. Using these types of data to populate social simulations can greatly enhance the feasibility of validation efforts, the reusability of social and behavioral modeling frameworks, and the testable reliability of simulations. We provide a case study demonstrating these effects, document the use of survey data to develop cognitive models, and suggest future paths forward for social and behavioral modeling.
Simulating future residential property losses from wildfire in Flathead County, Montana: Chapter 1
Prato, Tony; Paveglio, Travis B; Barnett, Yan; Silverstein, Robin; Hardy, Michael; Keane, Robert; Loehman, Rachel A.; Clark, Anthony; Fagre, Daniel B.; Venn, Tyron; Stockmann, Keith
2014-01-01
Wildfire damages to private residences in the United States and elsewhere have increased as a result of expansion of the wildland-urban interface (WUI) and other factors. Understanding this unwelcome trend requires analytical frameworks that simulate how various interacting social, economic, and biophysical factors influence those damages. A methodological framework is developed for simulating expected residential property losses from wildfire [E(RLW)], which is a probabilistic monetary measure of wildfire risk to residential properties in the WUI. E(RLW) is simulated for Flathead County, Montana for five, 10-year subperiods covering the period 2010-2059, under various assumptions about future climate change, economic growth, land use policy, and forest management. Results show statistically significant increases in the spatial extent of WUI properties, the number of residential structures at risk from wildfire, and E(RLW) over the 50-year evaluation period for both the county and smaller subareas (i.e., neighborhoods and parcels). The E(RLW) simulation framework presented here advances the field of wildfire risk assessment by providing a finer-scale tool that incorporates a set of dynamic, interacting processes. The framework can be applied using other scenarios for climate change, economic growth, land use policy, and forest management, and in other areas.
Simulation of transmission electron microscope images of biological specimens.
Rullgård, H; Ofverstedt, L-G; Masich, S; Daneholt, B; Oktem, O
2011-09-01
We present a new approach to simulate electron cryo-microscope images of biological specimens. The framework for simulation consists of two parts: the first is a phantom generator that produces a model of a specimen suitable for simulation; the second is a transmission electron microscope simulator. The phantom generator calculates the scattering potential of an atomic structure in aqueous buffer and allows the user to define the distribution of molecules in the simulated image. The simulator includes a well-defined electron-specimen interaction model based on the scalar Schrödinger equation, the contrast transfer function for optics, and a noise model that includes shot noise as well as detector noise including detector blurring. To enable optimal performance, the simulation framework also includes a calibration protocol for setting simulation parameters. To test the accuracy of the new framework for simulation, we compare simulated images to experimental images of the Tobacco Mosaic Virus (TMV) recorded in vitreous ice. The simulated and experimental images show good agreement with respect to contrast variations depending on dose and defocus. Furthermore, random fluctuations present in experimental and simulated images exhibit similar statistical properties. The simulator has been designed to provide a platform for development of new instrumentation and image processing procedures in single particle electron microscopy, two-dimensional crystallography and electron tomography, with well documented protocols and an open-source code into which new improvements and extensions are easily incorporated.
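For reference, a common weak-phase expression for the optics part of such a simulator is the contrast transfer function (conventions for sign, envelopes and amplitude contrast vary, and the simulator's implementation may differ in detail):

$$ \mathrm{CTF}(q) \;=\; -\sin\!\left(\pi \lambda\, \Delta f\, q^{2} \;-\; \tfrac{\pi}{2}\, C_s\, \lambda^{3} q^{4}\right), $$

where $q$ is spatial frequency, $\lambda$ the electron wavelength, $\Delta f$ the defocus and $C_s$ the spherical-aberration coefficient.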
A Cellular Automaton Framework for Infectious Disease Spread Simulation
Pfeifer, Bernhard; Kugler, Karl; Tejada, Maria M; Baumgartner, Christian; Seger, Michael; Osl, Melanie; Netzer, Michael; Handler, Michael; Dander, Andreas; Wurz, Manfred; Graber, Armin; Tilg, Bernhard
2008-01-01
In this paper, a cellular automaton framework for processing the spatiotemporal spread of infectious diseases is presented. The developed environment simulates and visualizes how infectious diseases might spread, and hence provides a powerful instrument for health care organizations to generate disease prevention and contingency plans. In this study, the outbreak of an avian flu like virus was modeled in the state of Tyrol, and various scenarios such as quarantine, effect of different medications on viral spread and changes of social behavior were simulated. The proposed framework is implemented using the programming language Java. The set up of the simulation environment requires specification of the disease parameters and the geographical information using a population density colored map, enriched with demographic data. The results of the numerical simulations and the analysis of the computed parameters will be used to get a deeper understanding of how the disease spreading mechanisms work, and how to protect the population from contracting the disease. Strategies for optimization of medical treatment and vaccination regimens will also be investigated using our cellular automaton framework. In this study, six different scenarios were simulated. It showed that geographical barriers may help to slow down the spread of an infectious disease, however, when an aggressive and deadly communicable disease spreads, only quarantine and controlled medical treatment are able to stop the outbreak, if at all. PMID:19415136
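A minimal grid-based sketch conveys the flavor of such an automaton; the synchronous susceptible-infected-recovered (SIR) update and the parameters below are toy assumptions, whereas the actual framework is driven by population-density maps and demographic data.

```python
import random

random.seed(0)
N, BETA, GAMMA, STEPS = 50, 0.3, 0.1, 100  # grid size, infection & recovery prob.
S, I, R = 0, 1, 2

grid = [[S] * N for _ in range(N)]
grid[N // 2][N // 2] = I                   # index case at the centre

def step(grid):
    new = [row[:] for row in grid]
    for i in range(N):
        for j in range(N):
            if grid[i][j] == S:
                # Each infected von Neumann neighbour may transmit the disease.
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    if grid[(i + di) % N][(j + dj) % N] == I and random.random() < BETA:
                        new[i][j] = I
                        break
            elif grid[i][j] == I and random.random() < GAMMA:
                new[i][j] = R              # infected cells eventually recover
    return new

for _ in range(STEPS):
    grid = step(grid)
counts = {name: sum(row.count(s) for row in grid) for name, s in zip("SIR", (S, I, R))}
print(counts)
```

Interventions such as quarantine can be modeled by zeroing the transmission probability across selected cells or regions.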
A smoothed particle hydrodynamics framework for modelling multiphase interactions at meso-scale
NASA Astrophysics Data System (ADS)
Li, Ling; Shen, Luming; Nguyen, Giang D.; El-Zein, Abbas; Maggi, Federico
2018-01-01
A smoothed particle hydrodynamics (SPH) framework is developed for modelling multiphase interactions at meso-scale, including the liquid-solid interaction induced deformation of the solid phase. With an inter-particle force formulation that mimics the inter-atomic force in molecular dynamics, the proposed framework includes the long-range attractions between particles, and more importantly, the short-range repulsive forces to avoid particle clustering and instability problems. Three-dimensional numerical studies have been conducted to demonstrate the capabilities of the proposed framework to quantitatively replicate the surface tension of water, to model the interactions between immiscible liquids and solid, and more importantly, to simultaneously model the deformation of solid and liquid induced by the multiphase interaction. By varying inter-particle potential magnitude, the proposed SPH framework has successfully simulated various wetting properties ranging from hydrophobic to hydrophilic surfaces. The simulation results demonstrate the potential of the proposed framework to genuinely study complex multiphase interactions in wet granular media.
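One common way to obtain 'long-range attraction plus short-range repulsion' between particle pairs is a Lennard-Jones-type force law; a generic form is shown below, though the exponents and strength are model choices and not necessarily those used by the authors:

$$ \mathbf{F}(r) \;=\; s\left[\left(\frac{r_0}{r}\right)^{m} - \left(\frac{r_0}{r}\right)^{n}\right]\frac{\mathbf{r}}{r^{2}}, \qquad m > n, $$

where $r_0$ sets the equilibrium spacing and $s$ the interaction strength: for $r < r_0$ the repulsive term dominates (preventing particle clustering), while for $r > r_0$ the net force is attractive (producing surface-tension-like behavior).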
NASA Astrophysics Data System (ADS)
Uijt de Haag, Maarten; Venable, Kyle; Bezawada, Rajesh; Adami, Tony; Vadlamani, Ananth K.
2009-05-01
This paper discusses a sensor simulator/synthesizer framework that can be used to test and evaluate various sensor integration strategies for the implementation of an External Hazard Monitor (EHM) and Integrated Alerting and Notification (IAN) function as part of NASA's Integrated Intelligent Flight Deck (IIFD) project. The IIFD project under the NASA's Aviation Safety program "pursues technologies related to the flight deck that ensure crew workload and situational awareness are both safely optimized and adapted to the future operational environment as envisioned by NextGen." Within the simulation framework, various inputs to the IIFD and its subsystems, the EHM and IAN, are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. Sensors and avionics included in this framework are TCAS, ADS-B, Forward-Looking Infrared, Vision cameras, GPS, Inertial navigators, EGPWS, Laser Detection and Ranging sensors, altimeters, communication links with ATC, and weather radar. The framework is implemented in Simulink, a modeling language developed by The Mathworks. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft. Specifically, this paper addresses the architecture of the simulator, the sensor model interfaces, the timing and database (environment) aspects of the sensor models, the user interface of the modeling environment, and the various avionics implementations.
Numerical Propulsion System Simulation
NASA Technical Reports Server (NTRS)
Naiman, Cynthia
2006-01-01
The NASA Glenn Research Center, in partnership with the aerospace industry, other government agencies, and academia, is leading the effort to develop an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). NPSS is a framework for performing analysis of complex systems. The initial development of NPSS focused on the analysis and design of airbreathing aircraft engines, but the resulting NPSS framework may be applied to any system, for example: aerospace, rockets, hypersonics, power and propulsion, fuel cells, ground based power, and even human system modeling. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the NASA Aeronautics Research Mission Directorate Fundamental Aeronautics Program and the Advanced Virtual Engine Test Cell (AVETeC). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes capabilities to facilitate collaborative engineering. The NPSS will provide improved tools to develop custom components and to use capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities extend NPSS from a zero-dimensional simulation tool to a multi-fidelity, multidiscipline system-level simulation tool for the full development life cycle.
Patient-Specific Simulation of Cardiac Blood Flow From High-Resolution Computed Tomography.
Lantz, Jonas; Henriksson, Lilian; Persson, Anders; Karlsson, Matts; Ebbers, Tino
2016-12-01
Cardiac hemodynamics can be computed from medical imaging data, and results could potentially aid in cardiac diagnosis and treatment optimization. However, simulations are often based on simplified geometries, ignoring features such as papillary muscles and trabeculae due to their complex shape, limitations in image acquisitions, and challenges in computational modeling. This severely hampers the use of computational fluid dynamics in clinical practice. The overall aim of this study was to develop a novel numerical framework that incorporated these geometrical features. The model included the left atrium, ventricle, ascending aorta, and heart valves. The framework used image registration to obtain patient-specific wall motion, automatic remeshing to handle topological changes due to the complex trabeculae motion, and a fast interpolation routine to obtain intermediate meshes during the simulations. Velocity fields and residence time were evaluated, and they indicated that papillary muscles and trabeculae strongly interacted with the blood, which could not be observed in a simplified model. The framework resulted in a model with outstanding geometrical detail, demonstrating the feasibility as well as the importance of a framework that is capable of simulating blood flow in physiologically realistic hearts.
2018-01-01
Understanding Earth surface responses in terms of sediment dynamics to climatic variability and tectonic forcing is hindered by the limited ability of current models to simulate long-term evolution of sediment transfer and associated morphological changes. This paper presents pyBadlands, an open-source Python-based framework which computes over geological time (1) sediment transport from landmasses to coasts, (2) reworking of marine sediments by longshore currents and (3) development of coral reef systems. pyBadlands is cross-platform, distributed under the GPLv3 license and available on GitHub (http://github.com/badlands-model). Here, we describe the underlying physical assumptions behind the simulated processes and the main options already available in the numerical framework. Along with the source code, a list of hands-on examples is provided that illustrates the model capabilities. In addition, pre- and post-processing classes have been built and are accessible as a companion toolbox which comprises a series of workflows to efficiently build, quantify and explore simulation input and output files. While the framework has been primarily designed for research, its simplicity of use and portability make it a great tool for teaching purposes. PMID:29649301
Drawert, Brian; Engblom, Stefan; Hellander, Andreas
2012-06-22
Experiments in silico using stochastic reaction-diffusion models have emerged as an important tool in molecular systems biology. Designing computational software for such applications poses several challenges. Firstly, realistic lattice-based modeling for biological applications requires a consistent way of handling complex geometries, including curved inner and outer boundaries. Secondly, spatiotemporal stochastic simulations are computationally expensive due to the fast time scales of individual reaction and diffusion events when compared to the biological phenomena of actual interest. We therefore argue that simulation software needs to be both computationally efficient, employing sophisticated algorithms, yet at the same time flexible in order to meet present and future needs of increasingly complex biological modeling. We have developed URDME, a flexible software framework for general stochastic reaction-transport modeling and simulation. URDME uses Unstructured triangular and tetrahedral meshes to resolve general geometries, and relies on the Reaction-Diffusion Master Equation formalism to model the processes under study. An interface to a mature geometry and mesh handling external software (Comsol Multiphysics) provides for a stable and interactive environment for model construction. The core simulation routines are logically separated from the model building interface and written in a low-level language for computational efficiency. The connection to the geometry handling software is realized via a Matlab interface which facilitates script computing, data management, and post-processing. For practitioners, the software therefore behaves much as an interactive Matlab toolbox. At the same time, it is possible to modify and extend URDME with newly developed simulation routines. Since the overall design effectively hides the complexity of managing the geometry and meshes, this means that newly developed methods may be tested in a realistic setting already at an early stage of development. In this paper we demonstrate, in a series of examples with high relevance to the molecular systems biology community, that the proposed software framework is a useful tool for both practitioners and developers of spatial stochastic simulation algorithms. Through the combined efforts of algorithm development and improved modeling accuracy, increasingly complex biological models become feasible to study through computational methods. URDME is freely available at http://www.urdme.org.
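In the RDME formalism that URDME builds on, diffusion between mesh voxels is treated as a set of additional first-order jump events alongside the chemical reactions. The following single-voxel Python sketch of the underlying direct-method (Gillespie) step is illustrative only and does not use URDME's actual interfaces.

```python
import random

random.seed(3)

# Toy voxel: reaction A -> B at rate k1*A, plus A jumping to a neighbour voxel
# at rate d*A (in the RDME, d is derived from the diffusion constant and mesh).
k1, d = 0.5, 0.2
state = {"A": 100, "B": 0}
t, t_end = 0.0, 10.0

while t < t_end and state["A"] > 0:
    a1 = k1 * state["A"]                 # reaction propensity
    a2 = d * state["A"]                  # diffusion-jump propensity
    a0 = a1 + a2
    t += random.expovariate(a0)          # exponentially distributed waiting time
    if random.random() * a0 < a1:
        state["A"] -= 1; state["B"] += 1 # the reaction fires
    else:
        state["A"] -= 1                  # the molecule leaves for a neighbour
print(round(t, 2), state)
```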
The Application of SNiPER to the JUNO Simulation
NASA Astrophysics Data System (ADS)
Lin, Tao; Zou, Jiaheng; Li, Weidong; Deng, Ziyan; Fang, Xiao; Cao, Guofu; Huang, Xingtao; You, Zhengyun; JUNO Collaboration
2017-10-01
The JUNO (Jiangmen Underground Neutrino Observatory) is a multipurpose neutrino experiment which is designed to determine neutrino mass hierarchy and precisely measure oscillation parameters. As one of the important systems, the JUNO offline software is being developed using the SNiPER software. In this proceeding, we focus on the requirements of JUNO simulation and present the working solution based on the SNiPER. The JUNO simulation framework is in charge of managing event data, detector geometries and materials, physics processes, simulation truth information etc. It glues physics generator, detector simulation and electronics simulation modules together to achieve a full simulation chain. In the implementation of the framework, many attractive characteristics of the SNiPER have been used, such as dynamic loading, flexible flow control, multiple event management and Python binding. Furthermore, additional efforts have been made to make both detector and electronics simulation flexible enough to accommodate and optimize different detector designs. For the Geant4-based detector simulation, each sub-detector component is implemented as a SNiPER tool which is a dynamically loadable and configurable plugin. So it is possible to select the detector configuration at runtime. The framework provides the event loop to drive the detector simulation and interacts with the Geant4 which is implemented as a passive service. All levels of user actions are wrapped into different customizable tools, so that user functions can be easily extended by just adding new tools. The electronics simulation has been implemented by following an event driven scheme. The SNiPER task component is used to simulate data processing steps in the electronics modules. The electronics and trigger are synchronized by triggered events containing possible physics signals. The JUNO simulation software has been released and is being used by the JUNO collaboration to do detector design optimization, event reconstruction algorithm development and physics sensitivity studies.
A New Simulation Framework for the Electron-Ion Collider
NASA Astrophysics Data System (ADS)
Arrington, John
2017-09-01
Last year, a collaboration between Physics Division and High-Energy Physics at Argonne was formed to enable significantly broader contributions to the development of the Electron-Ion Collider. This includes efforts in accelerator R&D, theory, simulations, and detector R&D. I will give a brief overview of the status of these efforts, with emphasis on the aspects aimed at enabling the community to more easily become involved in evaluation of physics, detectors, and details of spectrometer designs. We have put together a new, easy-to-use simulation framework using flexible software tools. The goal is to enable detailed simulations to evaluate detector performance and compare detector designs. In addition, a common framework capable of providing detailed simulations of different spectrometer designs will allow for fully consistent evaluations of the physics reach of different spectrometer designs or detector systems for a variety of physics channels. In addition, new theory efforts will provide self-consistent models of GPDs (including QCD evolution) and TMDs in nucleons and light nuclei, as well as providing more detailed physics input for the evaluation of some new observables. This material is based upon work supported by Laboratory Directed Research and Development (LDRD) funding from Argonne National Laboratory, provided by the Director, Office of Science, of the U.S. Department of Energy under Contract DE-AC02-06CH11357.
NASA Astrophysics Data System (ADS)
Frezzo, Dennis C.; Behrens, John T.; Mislevy, Robert J.
2010-04-01
Simulation environments make it possible for science and engineering students to learn to interact with complex systems. Putting these capabilities to effective use for learning, and assessing learning, requires more than a simulation environment alone. It requires a conceptual framework for the knowledge, skills, and ways of thinking that are meant to be developed, in order to design activities that target these capabilities. The challenges of using simulation environments effectively are especially daunting in dispersed social systems. This article describes how these challenges were addressed in the context of the Cisco Networking Academies with a simulation tool for computer networks called Packet Tracer. The focus is on a conceptual support framework for instructors in over 9,000 institutions around the world for using Packet Tracer in instruction and assessment, by learning to create problem-solving scenarios that are at once tuned to the local needs of their students and consistent with the epistemic frame of "thinking like a network engineer." We describe a layered framework of tools and interfaces above the network simulator that supports the use of Packet Tracer in the distributed community of instructors and students.
Shakhawath Hossain, Md; Bergstrom, D J; Chen, X B
2015-12-01
The in vitro chondrocyte cell culture for cartilage tissue regeneration in a perfusion bioreactor is a complex process. Mathematical modeling and computational simulation can provide important insights into the culture process, which would be helpful for selecting culture conditions to improve the quality of the developed tissue constructs. However, simulation of the cell culture process is a challenging task due to the complicated interaction between the cells and the local fluid flow and nutrient transport inside complex porous scaffolds. In this study, a mathematical model and computational framework have been developed to simulate three-dimensional (3D) cell growth in a porous scaffold placed inside a bi-directional flow perfusion bioreactor. The model was developed by taking into account the two-way coupling between cell growth and the local flow field and associated glucose concentration, and was then used to perform a resolved-scale simulation based on the lattice Boltzmann method (LBM). The simulation predicts the local shear stress, glucose concentration, and 3D cell growth inside the porous scaffold for a period of 30 days of cell culture. The predicted cell growth rate was in good overall agreement with experimental results available in the literature. This study demonstrates that the bi-directional flow perfusion culture system can enhance the homogeneity of cell growth inside the scaffold. The model and computational framework developed are capable of providing significant insight into the culture process, thus providing a powerful tool for the design and optimization of the cell culture process. © 2015 Wiley Periodicals, Inc.
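The two-way coupling the authors describe (cell growth depleting glucose, and glucose limiting growth) can be caricatured in a few lines. The sketch below is a deliberately simplified, well-mixed stand-in using Monod kinetics with invented parameter values; it is not the resolved-scale lattice Boltzmann model of the study.

```python
# Minimal well-mixed caricature of the two-way coupling described above:
# cells consume glucose, and the growth rate depends on glucose via Monod
# kinetics. All parameter values are illustrative only; the actual study
# uses a resolved-scale lattice Boltzmann model in a 3D porous scaffold.

mu_max, K_s = 0.04, 0.5     # max growth rate (1/h), Monod constant (mM), assumed
Y = 1e7                     # cells produced per mmol glucose (assumed)
cells, glucose = 1e5, 5.0   # initial cell count and glucose supply (assumed)
dt, hours = 0.5, 30 * 24    # half-hour steps over 30 days of culture

for step in range(int(hours / dt)):
    mu = mu_max * glucose / (K_s + glucose)   # glucose-limited growth rate
    growth = mu * cells * dt
    cells += growth
    glucose = max(glucose - growth / Y, 0.0)  # consumption couples back

print(f"final cells: {cells:.3e}, residual glucose: {glucose:.2f}")
```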
The Application of Modeling and Simulation in Capacity Management within the ITIL Framework
NASA Technical Reports Server (NTRS)
Rahmani, Sonya; vonderHoff, Otto
2010-01-01
Tightly integrating modeling and simulation techniques into Information Technology Infrastructure Library (ITIL) practices can be one of the driving factors behind a successful and cost-effective capacity management effort for any Information Technology (IT) system. ITIL is a best practices framework for managing IT infrastructure, development and operations. Translating ITIL theory into operational reality can be a challenge. This paper aims to highlight how to best integrate modeling and simulation into an ITIL implementation. For cases where the project team initially has difficulty gaining consensus on investing in modeling and simulation resources, a clear definition of M&S implementation within the ITIL framework, specifically its role in supporting Capacity Management, is critical to gaining the support required to garner these resources. This implementation should also help to clearly define M&S support to the overall system mission. This paper will describe the development of an integrated modeling approach and how best to tie M&S to definitive goals for evaluating system capacity and performance requirements. Specifically, the paper will discuss best practices for implementing modeling and simulation into ITIL. These practices hinge on implementing integrated M&S methods that 1) encompass at least two or more predictive modeling techniques, 2) complement each technique's respective strengths and weaknesses to support the validation of predicted results, and 3) are tied to the system's performance and workload monitoring efforts. The structuring of two forms of modeling, statistical and simulation, in the development of "As Is" and "To Be" efforts will be used to exemplify the integrated M&S methods. The paper will show how these methods can better support the project's overall capacity management efforts.
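The core recommendation here, pairing a statistical model with a simulation so each validates the other, can be shown concretely. In the hedged sketch below, an analytic M/M/1 queueing estimate of mean wait is cross-checked against a stochastic simulation via the Lindley recursion; the arrival and service rates are illustrative, not from the paper.

```python
# Sketch of the "two complementary techniques" idea: estimate mean wait in
# a single-server system analytically (M/M/1 queueing formula) and
# cross-validate it with a stochastic simulation (Lindley recursion).
# Arrival and service rates are illustrative assumptions.
import random

lam, mu = 8.0, 10.0                      # arrivals/sec, services/sec (assumed)
analytic_wait = lam / (mu * (mu - lam))  # M/M/1 mean wait in queue, Wq

random.seed(1)
w, total, n = 0.0, 0.0, 200_000
for _ in range(n):
    a = random.expovariate(lam)          # inter-arrival time
    s = random.expovariate(mu)           # service time
    w = max(0.0, w + s - a)              # Lindley recursion for waiting time
    total += w

print(f"analytic Wq = {analytic_wait:.4f} s, simulated Wq = {total/n:.4f} s")
```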
A Hardware-in-the-Loop Simulator for Software Development for a Mars Airplane
NASA Technical Reports Server (NTRS)
Slagowski, Stefan E.; Vican, Justin E.; Kenney, P. Sean
2007-01-01
Draper Laboratory recently developed a Hardware-In-The-Loop Simulator (HILSIM) to provide a simulation of the Aerial Regional-scale Environmental Survey (ARES) airplane executing a mission in the Martian environment. The HILSIM was used to support risk mitigation activities under the Planetary Airplane Risk Reduction (PARR) program. PARR supported NASA Langley Research Center's (LaRC) ARES proposal efforts for the Mars Scout 2011 opportunity. The HILSIM software was a successful integration of two simulation frameworks, Draper's CSIM and NASA LaRC's Langley Standard Real-Time Simulation in C++ (LaSRS++).
Quantification of uncertainties for application in detonation simulation
NASA Astrophysics Data System (ADS)
Zheng, Miao; Ma, Zhibo
2016-06-01
Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. To quantify the uncertainty, it is most important to analyze how the uncertainties arise and develop, and how the simulations progress from benchmark models to new models. Based on the practical needs of engineering and the technology of verification and validation, a framework for the quantification of uncertainty (QU) is put forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.
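One common ingredient of such a QU framework is forward propagation of input uncertainties through the simulation. The sketch below is a generic Monte Carlo propagation example; the model function and the input distributions are invented stand-ins, not the detonation physics of the paper.

```python
# Generic sketch of forward uncertainty propagation by Monte Carlo:
# sample uncertain inputs, run the (here, stand-in) simulation model,
# and summarize the spread of the predicted quantity of interest (QoI).
import random, statistics

def model(density, energy):                  # stand-in, not a detonation code
    return (2.0 * energy / density) ** 0.5   # a notional velocity-like QoI

random.seed(0)
samples = []
for _ in range(10_000):
    rho = random.gauss(1.60, 0.02)     # uncertain density (assumed)
    e = random.gauss(5.0e6, 2.0e5)     # uncertain specific energy (assumed)
    samples.append(model(rho, e))

mean = statistics.fmean(samples)
std = statistics.pstdev(samples)
print(f"QoI mean = {mean:.1f}, std = {std:.1f} "
      f"({100 * std / mean:.2f}% relative uncertainty)")
```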
NASA Technical Reports Server (NTRS)
Liever, Peter A.; West, Jeffrey S.
2016-01-01
A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of computing second-, third-, and fourth-order accurate Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation have been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.
Integrated Modeling, Mapping, and Simulation (IMMS) Framework for Exercise and Response Planning
NASA Technical Reports Server (NTRS)
Mapar, Jalal; Hoette, Trisha; Mahrous, Karim; Pancerella, Carmen M.; Plantenga, Todd; Yang, Christine; Yang, Lynn; Hopmeier, Michael
2011-01-01
Emergency management personnel at federal, state, and local levels can benefit from the increased situational awareness and operational efficiency afforded by simulation and modeling for emergency preparedness, including planning, training and exercises. To support this goal, the Department of Homeland Security's Science & Technology Directorate is funding the Integrated Modeling, Mapping, and Simulation (IMMS) program to create an integrating framework that brings together diverse models for use by the emergency response community. SUMMIT, one piece of the IMMS program, is the initial software framework that connects users such as emergency planners and exercise developers with modeling resources, bridging the gap in expertise and technical skills between these two communities. SUMMIT was recently deployed to support exercise planning for National Level Exercise 2010. Threat, casualty, infrastructure, and medical surge models were combined within SUMMIT to estimate health care resource requirements for the exercise ground truth.
A numerical framework for the direct simulation of dense particulate flow under explosive dispersal
NASA Astrophysics Data System (ADS)
Mo, H.; Lien, F.-S.; Zhang, F.; Cronin, D. S.
2018-05-01
In this paper, we present a Cartesian grid-based numerical framework for the direct simulation of dense particulate flow under explosive dispersal. This numerical framework is established through the integration of the following numerical techniques: (1) operator splitting for partitioned fluid-solid interaction in the time domain, (2) the second-order SSP Runge-Kutta method and third-order WENO scheme for temporal and spatial discretization of governing equations, (3) the front-tracking method for evolving phase interfaces, (4) a field function proposed for low-memory-cost multimaterial mesh generation and fast collision detection, (5) an immersed boundary method developed for treating arbitrarily irregular and changing boundaries, and (6) a deterministic multibody contact and collision model. Employing the developed framework, this paper further studies particle jet formation under explosive dispersal by considering the effects of particle properties, particulate payload morphologies, and burster pressures. By the simulation of the dispersal processes of dense particle systems driven by pressurized gas, in which the driver pressure reaches 1.01325 × 10^10 Pa (10^5 times the ambient pressure) and particles are impulsively accelerated from stationary to a speed of more than 12,000 m/s within 15 μs, it is demonstrated that the presented framework is able to effectively resolve coupled shock-shock, shock-particle, and particle-particle interactions in complex fluid-solid systems with shocked flow conditions, arbitrarily irregular particle shapes, and realistic multibody collisions.
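Of the building blocks listed, the time integrator is easy to illustrate in isolation. The sketch below applies the second-order SSP (strong stability preserving) Runge-Kutta method to 1D linear advection, with a first-order upwind operator standing in for the paper's third-order WENO spatial scheme; grid and CFL values are arbitrary.

```python
# Sketch of the second-order SSP Runge-Kutta update named above, applied
# to 1D linear advection u_t + a u_x = 0. A first-order upwind operator
# stands in for the paper's third-order WENO scheme.
import numpy as np

a, nx, dt, dx = 1.0, 200, 0.004, 0.01   # CFL = a*dt/dx = 0.4 (assumed)
x = np.arange(nx) * dx
u = np.exp(-200 * (x - 0.5) ** 2)       # initial Gaussian pulse

def L(u):                               # semi-discrete operator, upwind (a > 0)
    return -a * (u - np.roll(u, 1)) / dx

for _ in range(100):
    u1 = u + dt * L(u)                     # stage 1: forward Euler
    u = 0.5 * u + 0.5 * (u1 + dt * L(u1))  # stage 2: SSP convex combination

print(f"pulse peak now near x = {x[np.argmax(u)]:.2f}")
```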
Examining Reuse in LaSRS++-Based Projects
NASA Technical Reports Server (NTRS)
Madden, Michael M.
2001-01-01
NASA Langley Research Center (LaRC) developed the Langley Standard Real-Time Simulation in C++ (LaSRS++) to consolidate all software development for its simulation facilities under one common framework. A common framework promised a decrease in the total development effort for a new simulation by encouraging software reuse. To judge the success of LaSRS++ in this regard, reuse metrics were extracted from 11 aircraft models. Three methods that employ static analysis of the code were used to identify the reusable components. For the method that provides the best estimate, reuse levels fall between 66% and 95% indicating a high degree of reuse. Additional metrics provide insight into the extent of the foundation that LaSRS++ provides to new simulation projects. When creating variants of an aircraft, LaRC developers use object-oriented design to manage the aircraft as a reusable resource. Variants modify the aircraft for a research project or embody an alternate configuration of the aircraft. The variants inherit from the aircraft model. The variants use polymorphism to extend or redefine aircraft behaviors to meet the research requirements or to match the alternate configuration. Reuse level metrics were extracted from 10 variants. Reuse levels of aircraft by variants were 60% - 99%.
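A reuse level of the kind reported here is, at its simplest, reused framework lines over total lines. The sketch below shows a static-analysis-style count under that definition; the directory names are hypothetical stand-ins, not the actual LaSRS++ layout, and the real study used three more refined static-analysis methods.

```python
# Sketch of a simple static reuse metric: classify each source file as
# framework (reused) or project-specific, count lines, and report the
# reuse level as framework LOC over total LOC. Paths are hypothetical.
from pathlib import Path

def count_loc(root):
    return sum(len(p.read_text(errors="ignore").splitlines())
               for p in Path(root).rglob("*.cpp"))

framework_loc = count_loc("lasrs/framework")   # reused common code (assumed path)
project_loc = count_loc("sims/aircraft_x")     # simulation-specific code (assumed path)
total = framework_loc + project_loc
print(f"reuse level = {100 * framework_loc / total:.1f}% "
      f"({framework_loc} of {total} lines)")
```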
Virtual Plant Tissue: Building Blocks for Next-Generation Plant Growth Simulation
De Vos, Dirk; Dzhurakhalov, Abdiravuf; Stijven, Sean; Klosiewicz, Przemyslaw; Beemster, Gerrit T. S.; Broeckhove, Jan
2017-01-01
Motivation: Computational modeling of plant developmental processes is becoming increasingly important. Cellular resolution plant tissue simulators have been developed, yet they are typically describing physiological processes in an isolated way, strongly delimited in space and time. Results: With plant systems biology moving toward an integrative perspective on development we have built the Virtual Plant Tissue (VPTissue) package to couple functional modules or models in the same framework and across different frameworks. Multiple levels of model integration and coordination enable combining existing and new models from different sources, with diverse options in terms of input/output. Besides the core simulator the toolset also comprises a tissue editor for manipulating tissue geometry and cell, wall, and node attributes in an interactive manner. A parameter exploration tool is available to study parameter dependence of simulation results by distributing calculations over multiple systems. Availability: Virtual Plant Tissue is available as open source (EUPL license) on Bitbucket (https://bitbucket.org/vptissue/vptissue). The project has a website https://vptissue.bitbucket.io. PMID:28523006
Bennett, Casey C; Hauser, Kris
2013-01-01
In the modern healthcare system, rapidly expanding costs/complexity, the growing myriad of treatment options, and exploding information streams that often do not effectively reach the front lines hinder the ability to choose optimal treatment decisions over time. The goal in this paper is to develop a general purpose (non-disease-specific) computational/artificial intelligence (AI) framework to address these challenges. This framework serves two potential functions: (1) a simulation environment for exploring various healthcare policies, payment methodologies, etc., and (2) the basis for clinical artificial intelligence - an AI that can "think like a doctor". This approach combines Markov decision processes and dynamic decision networks to learn from clinical data and develop complex plans via simulation of alternative sequential decision paths while capturing the sometimes conflicting, sometimes synergistic interactions of various components in the healthcare system. It can operate in partially observable environments (in the case of missing observations or data) by maintaining belief states about patient health status and functions as an online agent that plans and re-plans as actions are performed and new observations are obtained. This framework was evaluated using real patient data from an electronic health record. The results demonstrate the feasibility of this approach; such an AI framework easily outperforms the current treatment-as-usual (TAU) case-rate/fee-for-service models of healthcare. The cost per unit of outcome change (CPUC) was $189 vs. $497 for AI vs. TAU (where lower is considered optimal) - while at the same time the AI approach could obtain a 30-35% increase in patient outcomes. Tweaking certain AI model parameters could further enhance this advantage, obtaining approximately 50% more improvement (outcome change) for roughly half the costs. Given careful design and problem formulation, an AI simulation framework can approximate optimal decisions even in complex and uncertain environments. Future work is described that outlines potential lines of research and integration of machine learning algorithms for personalized medicine. Copyright © 2012 Elsevier B.V. All rights reserved.
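The MDP core of such a framework can be illustrated with a toy value iteration. In the sketch below, two patient health states and two treatment actions are assumed, and all transition probabilities, costs, and rewards are invented for illustration; the paper's actual models are learned from clinical data and use dynamic decision networks.

```python
# Toy value-iteration sketch of the Markov decision process core described
# above: two patient health states, two treatment actions. Transition
# probabilities and rewards are invented for illustration only.
import numpy as np

# P[action][s, s']: transition probabilities between health states (assumed)
P = {"treat_A": np.array([[0.70, 0.30], [0.40, 0.60]]),
     "treat_B": np.array([[0.85, 0.15], [0.60, 0.40]])}
# immediate reward = outcome value minus treatment cost (assumed units)
R = {"treat_A": np.array([5.0, -2.0]), "treat_B": np.array([4.0, -3.0])}
gamma = 0.95                             # discount factor

V = np.zeros(2)
for _ in range(500):                     # iterate to (near) convergence
    V = np.max([R[a] + gamma * P[a] @ V for a in P], axis=0)

policy = [max(P, key=lambda a: (R[a] + gamma * P[a] @ V)[s]) for s in (0, 1)]
print("state values:", V.round(2), "policy:", policy)
```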
NASA Astrophysics Data System (ADS)
Warsta, L.; Karvonen, T.
2017-12-01
There are currently 25 shooting and training areas in Finland managed by the Finnish Defence Forces (FDF), where military activities can cause contamination of open waters and groundwater reservoirs. In the YMPYRÄ project, a computer software framework is being developed that combines existing open environmental data and proprietary information collected by FDF with computational models to investigate current and prevent future environmental problems. A data-centric philosophy is followed in the development of the system, i.e., the models are updated and extended to handle available data from different areas. The results generated by the models are summarized as easily understandable flow and risk maps that can be opened in GIS programs and used in environmental assessments by experts. Substances investigated with the system include explosives and metals such as lead, and both surface water and groundwater dominated areas can be simulated. The YMPYRÄ framework is composed of a three-dimensional soil and groundwater flow model, several solute transport models and an uncertainty assessment system. Solute transport models in the framework include particle-based, stream tube and finite volume based approaches. The models can be used to simulate solute dissolution from a source area, transport in the unsaturated layers to groundwater and finally migration in groundwater to water extraction wells and springs. The models can simulate advection, dispersion, equilibrium adsorption on soil particles, solubility and dissolution from the solute phase, and dendritic solute decay chains. Correct numerical solutions were confirmed by comparing results to analytical 1D and 2D solutions and by comparing the numerical solutions to each other. The particle-based and stream tube type solute transport models were useful as they could complement the traditional finite volume based approach, which in certain circumstances produced numerical dispersion due to the piecewise solution of the governing equations in computational grids and included computationally intensive and in some cases unstable iterative solutions. The YMPYRÄ framework is being developed by the consulting companies WaterHope, Gain Oy and SITO Oy, and is funded by FDF.
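The particle-based transport approach mentioned, valued here because it avoids the numerical dispersion of grid-based solvers, reduces to a random walk per particle. The sketch below shows a 1D advection-dispersion-decay step; all parameter values are illustrative, not taken from the YMPYRÄ models.

```python
# Sketch of a particle-based (random walk) solute transport scheme of the
# kind described above: each particle is advected, dispersed, and subject
# to first-order decay. Parameter values are illustrative only.
import random

v, D, lam = 0.5, 0.05, 1e-3      # velocity (m/d), dispersion (m^2/d), decay (1/d)
dt, steps = 1.0, 365             # daily steps for one year
particles = [0.0] * 5000         # all mass starts at the source (x = 0)

random.seed(42)
for _ in range(steps):
    particles = [x + v * dt + random.gauss(0.0, (2 * D * dt) ** 0.5)
                 for x in particles
                 if random.random() > lam * dt]   # decayed mass is removed

mean_x = sum(particles) / len(particles)
print(f"{len(particles)} particles remain, plume center ~{mean_x:.1f} m")
```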
Real-time simulation of combined short-wave and long-wave infrared vision on a head-up display
NASA Astrophysics Data System (ADS)
Peinecke, Niklas; Schmerwitz, Sven
2014-05-01
Landing under adverse weather conditions can be challenging, even if the airfields are well known to the pilots. This is true for civil as well as military aviation. Within the scope of this paper we concentrate especially on fog conditions. The work has been conducted within the project ALICIA, a research and development project co-funded by the European Commission under the Seventh Framework Programme. ALICIA aims at developing new and scalable cockpit applications which can extend operations of aircraft in degraded conditions: All Conditions Operations. One of the systems developed is a head-up display that can display generated symbology together with a raster-mode infrared image. We will detail how we implemented a real-time enabled simulation of a combined short-wave and long-wave infrared image for landing. A major challenge was to integrate several already existing simulation solutions, e.g., for visual simulation and sensors, with the required databases. For the simulations, DLR's in-house sensor simulation framework F3S was used, together with a commercially available airport model that had to be heavily modified in order to provide realistic infrared data. Special effort was invested in a realistic impression of runway lighting under foggy conditions. We will present results and sketch further improvements for future simulations.
Drawert, Brian; Lawson, Michael J; Petzold, Linda; Khammash, Mustafa
2010-02-21
We have developed a computational framework for accurate and efficient simulation of stochastic spatially inhomogeneous biochemical systems. The new computational method employs a fractional step hybrid strategy. A novel formulation of the finite state projection (FSP) method, called the diffusive FSP method, is introduced for the efficient and accurate simulation of diffusive transport. Reactions are handled by the stochastic simulation algorithm.
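The reaction step named here, the stochastic simulation algorithm (SSA), is compact enough to sketch. Below is a minimal Gillespie iteration for a single well-mixed compartment with one reaction (A -> B); the diffusive FSP machinery of the paper is not shown, and the rate constant and populations are arbitrary.

```python
# Minimal Gillespie stochastic simulation algorithm (SSA) sketch for the
# reaction step mentioned above: a single well-mixed compartment with the
# isomerization A -> B. Rate constant and counts are illustrative.
import math, random

random.seed(7)
k, A, B, t, t_end = 0.1, 1000, 0, 0.0, 50.0
while t < t_end and A > 0:
    a0 = k * A                              # total propensity (one reaction)
    t += -math.log(random.random()) / a0    # exponential time to next event
    A, B = A - 1, B + 1                     # fire the A -> B reaction

print(f"t = {t:.2f}: A = {A}, B = {B}")
```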
2013-09-01
…which utilizes FTA and then loads it into a DES engine to generate simulation results. While Discrete Event Simulation (DES) can provide accurate time estimation and fast simulation speed, models utilizing it often suffer… C4ISR progress in MDW is developed in this research to demonstrate the feasibility of AEMF-DES and explore its potential. The simulation (MDSIM…
ERIC Educational Resources Information Center
Wilson, Joe M.
2013-01-01
This dissertation uses design science research and engineering to develop a cloud-based simulator for modeling next-generation cybersecurity protection frameworks in the United States. The claim is made that an agile and neutral framework extending throughout the cyber-threat plane is needed for critical infrastructure protection (CIP). This…
The QuakeSim Project: Numerical Simulations for Active Tectonic Processes
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry
2004-01-01
In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.
Open-source framework for power system transmission and distribution dynamics co-simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Fan, Rui; Daily, Jeff
The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open source co-simulation framework, the “Framework for Network Co-Simulation” (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.
Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes. Highlights: • Proposed a physics-informed framework to quantify uncertainty in RANS simulations. • Framework incorporates physical prior knowledge and observation data. • Based on a rigorous Bayesian framework yet fully utilizes physical model. • Applicable for many complex physical systems beyond turbulent flows.
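The ensemble Kalman analysis step at the heart of the method can be sketched briefly. Below, an ensemble of two uncertain parameters is updated toward synthetic observations using sample covariances; the forward model, observation values, and error covariance are all invented for illustration and are far simpler than the paper's Reynolds-stress parameterization.

```python
# One analysis step of an ensemble Kalman method of the kind used above:
# an ensemble of uncertain parameters is nudged toward observations using
# sample covariances. The forward model and all numbers are synthetic.
import numpy as np

rng = np.random.default_rng(0)
N = 100
theta = rng.normal(1.0, 0.5, size=(N, 2))    # prior parameter ensemble

def forward(th):                             # stand-in forward model y = H(theta)
    return np.array([th[0] + th[1], 2.0 * th[0]])

Y = np.array([forward(t) for t in theta])    # predicted observations
y_obs = np.array([3.0, 4.0])                 # observations (invented)
R = 0.05 * np.eye(2)                         # observation error covariance

C_ty = np.cov(theta.T, Y.T)[:2, 2:]          # parameter-observation covariance
C_yy = np.cov(Y.T)                           # observation covariance
K = C_ty @ np.linalg.inv(C_yy + R)           # Kalman gain
perturbed = y_obs + rng.multivariate_normal(np.zeros(2), R, size=N)
theta = theta + (perturbed - Y) @ K.T        # updated (posterior) ensemble

print("posterior mean:", theta.mean(axis=0).round(3))
```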
2014-04-01
…laparoscopic ventral hernia repair. Additional simulation stations were added to the standards, and purchases (including a motion tracking system) were… framework for laparoscopic ventral hernia; incorporation of error-based simulators into an exit assessment of chief surgical residents; development of… simulating a laparoscopic ventral hernia (LVH) repair. Based on collected data, the lab worked to finalize the incorporation of error-based simulators…
NASA Astrophysics Data System (ADS)
Konstantopoulos, Nikolaos; Trivellas, Panagiotis; Reklitis, Panagiotis
2007-12-01
According to many researchers of organizational theory, a great number of problems encountered by manufacturing firms are due to their failure to foster innovative behaviour by aligning business strategy and structure. From this point of view, the fit between strategy and structure is essential in order to facilitate firms' innovative behaviour. In the present paper, we adopt Porter's typology to operationalise business strategy (cost leadership, innovative and marketing differentiation, and focus). Organizational structure is built on four dimensions (centralization, formalization, complexity and employees' initiatives to implement new ideas). Innovativeness is measured as product innovation and process and technological innovation. This study provides the necessary theoretical framework for the development of a dynamic simulation method, although the simulation of social events is quite a difficult task, considering that there are so many alternatives (not all well understood).
System Software Framework for System of Systems Avionics
NASA Technical Reports Server (NTRS)
Ferguson, Roscoe C.; Peterson, Benjamin L; Thompson, Hiram C.
2005-01-01
Project Constellation implements NASA's vision for space exploration to expand human presence in our solar system. The engineering focus of this project is developing a system of systems architecture. This architecture allows for the incremental development of the overall program. Systems can be built and connected in a "Lego style" manner to generate configurations supporting various mission objectives. The development of the avionics or control systems of such a massive project will result in concurrent engineering. Also, each system will have software and the need to communicate with other (possibly heterogeneous) systems. Fortunately, this design problem has already been solved during the creation and evolution of systems such as the Internet and the Department of Defense's successful effort to standardize distributed simulation (now IEEE 1516). The solution relies on the use of a standard layered software framework and a communication protocol. A standard framework and communication protocol is suggested for the development and maintenance of Project Constellation systems. The ARINC 653 standard is a great start for such a common software framework. This paper proposes a common system software framework that uses the Real Time Publish/Subscribe protocol for framework-to-framework communication to extend ARINC 653. It is highly recommended that such a framework be established before development. This is important for the success of concurrent engineering. The framework provides an infrastructure for general system services and is designed for flexibility to support a spiral development effort.
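The communication pattern proposed here, publish/subscribe between frameworks, can be illustrated with a toy in-process broker. The sketch below shows the pattern only; it is not the Real-Time Publish/Subscribe (RTPS) wire protocol, and the topic name and message fields are invented.

```python
# Tiny topic-based publish/subscribe sketch illustrating the
# framework-to-framework communication pattern proposed above. This is a
# toy in-process broker, not the actual RTPS protocol.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for cb in self.subscribers[topic]:   # fan out to all subscribers
            cb(message)

bus = Broker()
# e.g., a guidance system consumes nav state published by another system
bus.subscribe("nav/state", lambda m: print("guidance received:", m))
bus.publish("nav/state", {"t": 12.5, "position_m": [7.1e6, 0.0, 0.0]})
```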
NASA Astrophysics Data System (ADS)
Divel, Sarah E.; Christensen, Soren; Wintermark, Max; Lansberg, Maarten G.; Pelc, Norbert J.
2017-03-01
Computer simulation is a powerful tool in CT; however, long simulation times of complex phantoms and systems, especially when modeling many physical aspects (e.g., spectrum, finite detector and source size), hinder the ability to realistically and efficiently evaluate and optimize CT techniques. Long simulation times primarily result from the tracing of hundreds of line integrals through each of the hundreds of geometrical shapes defined within the phantom. However, when the goal is to perform dynamic simulations or test many scan protocols using a particular phantom, traditional simulation methods inefficiently and repeatedly calculate line integrals through the same set of structures although only a few parameters change in each new case. In this work, we have developed a new simulation framework that overcomes such inefficiencies by dividing the phantom into material specific regions with the same time attenuation profiles, acquiring and storing monoenergetic projections of the regions, and subsequently scaling and combining the projections to create equivalent polyenergetic sinograms. The simulation framework is especially efficient for the validation and optimization of CT perfusion, which requires analysis of many stroke cases and testing hundreds of scan protocols on a realistic and complex numerical brain phantom. Using this updated framework to conduct a 31-time point simulation with 80 mm of z-coverage of a brain phantom on two 16-core Linux servers, we have reduced the simulation time from 62 hours to under 2.6 hours, a 95% reduction.
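The central trick, computing region-wise monoenergetic line integrals once and then scaling and combining them per spectrum, can be sketched in a few lines. The materials, path lengths, attenuation coefficients, and spectral weights below are invented for illustration.

```python
# Sketch of the reuse idea above: line integrals (path lengths) through
# each material region are computed once, then scaled by energy-dependent
# attenuation and combined over the spectrum to form a polyenergetic
# sinogram, so new protocols need no new ray tracing. Values are invented.
import numpy as np

n_rays = 4
path_len = {"soft": np.array([2.0, 5.0, 5.0, 2.0]),   # cm per ray, stored once
            "bone": np.array([0.0, 1.0, 0.5, 0.0])}
mu = {"soft": {60: 0.20, 80: 0.18},                   # 1/cm at each energy (keV)
      "bone": {60: 0.60, 80: 0.45}}
spectrum = {60: 0.4, 80: 0.6}                         # normalized weights

intensity = np.zeros(n_rays)
for E, w in spectrum.items():                         # combine over energies
    line_integral = sum(mu[r][E] * path_len[r] for r in path_len)
    intensity += w * np.exp(-line_integral)           # Beer-Lambert per energy

sinogram = -np.log(intensity)                         # effective attenuation
print(sinogram.round(3))
```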
A New Simulation Framework for Autonomy in Robotic Missions
NASA Technical Reports Server (NTRS)
Flueckiger, Lorenzo; Neukom, Christian
2003-01-01
Autonomy is a key factor in remote robotic exploration and there is significant activity addressing the application of autonomy to remote robots. It has become increasingly important to have simulation tools available to test the autonomy algorithms. While industrial robotics benefits from a variety of high quality simulation tools, researchers developing autonomous software are still dependent primarily on block-world simulations. The Mission Simulation Facility (MSF) project addresses this shortcoming with a simulation toolkit that will enable developers of autonomous control systems to test their system's performance against a set of integrated, standardized simulations of NASA mission scenarios. MSF provides a distributed architecture that connects the autonomous system to a set of simulated components replacing the robot hardware and its environment.
Using computer simulations to facilitate conceptual understanding of electromagnetic induction
NASA Astrophysics Data System (ADS)
Lee, Yu-Fen
This study investigated the use of computer simulations to facilitate conceptual understanding in physics. The use of computer simulations in the present study was grounded in a conceptual framework drawn from findings related to the use of computer simulations in physics education. To achieve the goal of effective utilization of computers for physics education, I first reviewed studies pertaining to computer simulations in physics education, categorized by three different learning frameworks, and studies comparing the effects of different simulation environments. My intent was to identify the learning context and factors for successful use of computer simulations in past studies and to learn from the studies which did not obtain a significant result. Based on the analysis of the reviewed literature, I proposed effective approaches to integrating computer simulations in physics education. These approaches are consistent with well-established education principles such as those suggested by How People Learn (Bransford, Brown, Cocking, Donovan, & Pellegrino, 2000). The research-based approaches to integrating computer simulations in physics education form a learning framework called Concept Learning with Computer Simulations (CLCS) in the current study. The second component of this study was to examine the CLCS learning framework empirically. The participants were recruited from a public high school in Beijing, China. All participating students were randomly assigned to two groups, the experimental (CLCS) group and the control (TRAD) group. Research-based computer simulations developed by the physics education research group at the University of Colorado at Boulder were used to tackle common conceptual difficulties in learning electromagnetic induction. While interacting with computer simulations, CLCS students were asked to answer reflective questions designed to stimulate qualitative reasoning and explanation. After receiving model reasoning online, students were asked to submit their revised answers electronically. Students in the TRAD group were not granted access to the CLCS material and followed their normal classroom routine. At the end of the study, both the CLCS and TRAD students took a post-test. Questions on the post-test were divided into "what" questions, "how" questions, and an open response question. Analysis of students' post-test performance showed mixed results. While the TRAD students scored higher on the "what" questions, the CLCS students scored higher on the "how" questions and the open response question. This result suggested that more TRAD students knew what kinds of conditions may or may not cause electromagnetic induction without understanding how electromagnetic induction works. Analysis of the CLCS students' learning also suggested that frequent disruption and technical trouble might pose threats to the effectiveness of the CLCS learning framework. The mixed post-test results also revealed some limitations of the CLCS learning framework in promoting conceptual understanding in physics. Improvement can be made by providing students with the background knowledge necessary to understand model reasoning and by incorporating the CLCS learning framework with other learning frameworks to promote integration of various physics concepts. In addition, the reflective questions in the CLCS learning framework may be refined to better address students' difficulties. Limitations of the study, as well as suggestions for future research, are also presented in this study.
Benefits of a Unified LaSRS++ Simulation for NAS-Wide and High-Fidelity Modeling
NASA Technical Reports Server (NTRS)
Glaab, Patricia; Madden, Michael
2014-01-01
The LaSRS++ high-fidelity vehicle simulation was extended in 2012 to support a NAS-wide simulation mode. Since the initial proof-of-concept, the LaSRS++ NAS-wide simulation is maturing into a research-ready tool. A primary benefit of this new capability is the consolidation of the two modeling paradigms under a single framework to save cost, facilitate iterative concept testing between the two tools, and to promote communication and model sharing between user communities at Langley. Specific benefits of each type of modeling are discussed along with the expected benefits of the unified framework. Current capability details of the LaSRS++ NAS-wide simulations are provided, including the visualization tool, live data interface, trajectory generators, terminal routing for arrivals and departures, maneuvering, re-routing, navigation, winds, and turbulence. The plan for future development is also described.
Agent Based Modeling of Air Carrier Behavior for Evaluation of Technology Equipage and Adoption
NASA Technical Reports Server (NTRS)
Horio, Brant M.; DeCicco, Anthony H.; Stouffer, Virginia L.; Hasan, Shahab; Rosenbaum, Rebecca L.; Smith, Jeremy C.
2014-01-01
As part of ongoing research, the National Aeronautics and Space Administration (NASA) and LMI developed a research framework to assist policymakers in identifying impacts on the U.S. air transportation system (ATS) of potential policies and technology related to the implementation of the Next Generation Air Transportation System (NextGen). This framework, called the Air Transportation System Evolutionary Simulation (ATS-EVOS), integrates multiple models into a single process flow to best simulate responses by U.S. commercial airlines and other ATS stakeholders to NextGen-related policies, and in turn, how those responses impact the ATS. Development of this framework required NASA and LMI to create an agent-based model of airline and passenger behavior. This Airline Evolutionary Simulation (AIRLINE-EVOS) models airline decisions about tactical airfare and schedule adjustments, and strategic decisions related to fleet assignments, market prices, and equipage. AIRLINE-EVOS models its own heterogeneous population of passenger agents that interact with airlines; this interaction allows the model to simulate the cycle of action-reaction as airlines compete with each other and engage passengers. We validated a baseline configuration of AIRLINE-EVOS against Airline Origin and Destination Survey (DB1B) data and subject matter expert opinion, and we verified the ATS-EVOS framework and agent behavior logic through scenario-based experiments. These experiments demonstrated AIRLINE-EVOS's capabilities in responding to a price shock in fuel prices, and to equipage challenges in a series of analyses based on potential incentive policies for best equipped best served, optimal wind routing, and traffic management initiative exemption concepts.
Tran-Duy, An; Boonen, Annelies; van de Laar, Mart A F J; Franke, Angelinus C; Severens, Johan L
2011-12-01
To develop a modelling framework which can simulate long-term quality of life, societal costs and cost-effectiveness as affected by sequential drug treatment strategies for ankylosing spondylitis (AS). The discrete event simulation paradigm was selected for model development. Drug efficacy was modelled as changes in disease activity (Bath Ankylosing Spondylitis Disease Activity Index (BASDAI)) and functional status (Bath Ankylosing Spondylitis Functional Index (BASFI)), which were linked to costs and health utility using statistical models fitted on the basis of an observational AS cohort. Published clinical data were used to estimate drug efficacy and time to events. Two strategies were compared: (1) five available non-steroidal anti-inflammatory drugs (strategy 1) and (2) same as strategy 1 plus two tumour necrosis factor α inhibitors (strategy 2). 13,000 patients were followed up individually until death. For probabilistic sensitivity analysis, Monte Carlo simulations were performed with 1000 sets of parameters sampled from the appropriate probability distributions. The models successfully generated valid data on treatments, BASDAI, BASFI, utility, quality-adjusted life years (QALYs) and costs at time points with intervals of 1-3 months over a simulation length of 70 years. Incremental cost per QALY gained in strategy 2 compared with strategy 1 was €35,186. At a willingness-to-pay threshold of €80,000, it was 99.9% certain that strategy 2 was cost-effective. The modelling framework provides great flexibility to implement complex algorithms representing treatment selection, disease progression and changes in costs and utilities over time of patients with AS. Results obtained from the simulation are plausible.
The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic
NASA Technical Reports Server (NTRS)
Armstrong, Curtis D.; Humphreys, William M.
2003-01-01
We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of AQUINAS (A Quantum Interconnected Network Array Simulator) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.
NASA Astrophysics Data System (ADS)
Dolly, Steven R.; Anastasio, Mark A.; Yu, Lifeng; Li, Hua
2017-03-01
In current radiation therapy practice, image quality is still assessed subjectively or by utilizing physically-based metrics. Recently, a methodology for objective task-based image quality (IQ) assessment in radiation therapy was proposed by Barrett et al. [1]. In this work, we present a comprehensive implementation and evaluation of this new IQ assessment methodology. A modular simulation framework was designed to perform an automated, computer-simulated end-to-end radiation therapy treatment. A fully simulated framework was created that utilizes new learning-based stochastic object models (SOM) to obtain known organ boundaries, generates a set of images directly from the numerical phantoms created with the SOM, and automates the image segmentation and treatment planning steps of a radiation therapy workflow. By use of this computational framework, therapeutic operating characteristic (TOC) curves can be computed and the area under the TOC curve (AUTOC) can be employed as a figure-of-merit to guide optimization of different components of the treatment planning process. The developed computational framework is employed to optimize X-ray CT pre-treatment imaging. We demonstrate that use of the radiation therapy-based IQ measures leads to different imaging parameters than those obtained by use of physically-based measures.
A Biophysical Modeling Framework for Assessing the Environmental Impact of Biofuel Production
NASA Astrophysics Data System (ADS)
Zhang, X.; Izaurradle, C.; Manowitz, D.; West, T. O.; Post, W. M.; Thomson, A. M.; Nichols, J.; Bandaru, V.; Williams, J. R.
2009-12-01
Long-term sustainability of a biofuel economy necessitates environmentally friendly biofuel production systems. We describe a biophysical modeling framework developed to understand and quantify the environmental value and impact (e.g., water balance, nutrient balance, carbon balance, and soil quality) of different biomass cropping systems. This modeling framework consists of three major components: 1) a Geographic Information System (GIS) based data processing system, 2) a spatially-explicit biophysical modeling approach, and 3) a user friendly information distribution system. First, we developed a GIS to manage the large amount of geospatial data (e.g., climate, land use, soil, and hydrography) and extract input information for the biophysical model. Second, the Environmental Policy Integrated Climate (EPIC) biophysical model is used to predict the impact of various cropping systems and management intensities on productivity, water balance, and biogeochemical variables. Finally, a geo-database is developed to distribute the results of ecosystem service variables (e.g., net primary productivity, soil carbon balance, soil erosion, nitrogen and phosphorus losses, and N2O fluxes) simulated by EPIC for each spatial modeling unit online using PostgreSQL. We applied this framework in a Regional Intensive Management Area (RIMA) of 9 counties in Michigan. A total of 4,833 spatial units with relatively homogeneous biophysical properties were derived using SSURGO, the Crop Data Layer, county boundaries, and 10-digit watershed boundaries. For each unit, EPIC was executed from 1980 to 2003 under 54 cropping scenarios (e.g., corn, switchgrass, and hybrid poplar). The simulation results were compared with historical crop yields from USDA NASS. Spatial mapping of the results shows high variability among different cropping scenarios in terms of the simulated ecosystem service variables. Overall, the framework developed in this study enables the incorporation of environmental factors into economic and life-cycle analysis in order to optimize biomass cropping production scenarios.
Playing To Learn: A Community Outreach Framework in Action.
ERIC Educational Resources Information Center
Ganzert, Robin; Helms, Allen
1998-01-01
Describes a partnership between Wake Forest University's graduate school of management and an elementary school that resulted in the development of a computer simulation game that integrated technology into the fifth-grade curriculum via a business simulation where players set up and run their own dinosaur amusement park. (LRW)
Advances in Chimera Grid Tools for Multi-Body Dynamics Simulations and Script Creation
NASA Technical Reports Server (NTRS)
Chan, William M.
2004-01-01
This viewgraph presentation contains information about (1) the framework for multi-body dynamics - the Geometry Manipulation Protocol (GMP), (2) the simulation procedure using Chimera Grid Tools (CGT) and OVERFLOW-2, (3) further recent developments in Chimera Grid Tools (OVERGRID, grid modules, and the script library), and (4) future work.
Design and Performance Frameworks for Constructing Problem-Solving Simulations
ERIC Educational Resources Information Center
Stevens, Rons; Palacio-Cayetano, Joycelin
2003-01-01
Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks…
Continuous integration for concurrent MOOSE framework and application development on GitHub
Slaughter, Andrew E.; Peterson, John W.; Gaston, Derek R.; ...
2015-11-20
For the past several years, Idaho National Laboratory’s MOOSE framework team has employed modern software engineering techniques (continuous integration, joint application/framework source code repositories, automated regression testing, etc.) in developing closed-source multiphysics simulation software (Gaston et al., Journal of Open Research Software vol. 2, article e10, 2014). In March 2014, the MOOSE framework was released under an open source license on GitHub, significantly expanding and diversifying the pool of current active and potential future contributors on the project. Despite this recent growth, the same philosophy of concurrent framework and application development continues to guide the project’s development roadmap. Several specific practices, including techniques for managing multiple repositories, conducting automated regression testing, and implementing a cascading build process are discussed in this short paper. Furthermore, special attention is given to describing the manner in which these practices naturally synergize with the GitHub API and GitHub-specific features such as issue tracking, Pull Requests, and project forks.
Development of an OSSE Framework for a Global Atmospheric Data Assimilation System
NASA Technical Reports Server (NTRS)
Gelaro, Ronald; Errico, Ronald M.; Prive, N.
2012-01-01
Observing system simulation experiments (OSSEs) are powerful tools for estimating the usefulness of various configurations of envisioned observing systems and data assimilation techniques. Their utility stems from their being conducted in an entirely simulated context, utilizing simulated observations having simulated errors and drawn from a simulation of the earth's environment. Observations are generated by applying physically based algorithms to the simulated state, such as is performed during data assimilation, or using other appropriate algorithms. Adding realistic instrument plus representativeness errors, including their biases and correlations, can be critical for obtaining realistic assessments of the impact of a proposed observing system or analysis technique. If estimates of the expected accuracy of proposed observations are realistic, then the OSSE can also be used to learn how best to utilize the new information, accelerating its transition to operations once the real data are available. As with any inferences from simulations, however, it is first imperative that some baseline OSSEs are performed and well validated against corresponding results obtained with a real observing system. This talk provides an overview of, and highlights critical issues related to, the development of an OSSE framework for the tropospheric weather prediction component of the NASA GEOS-5 global atmospheric data assimilation system. The framework includes all existing observations having significant impact on short-term forecast skill. Its validity has been carefully assessed using a range of metrics that can be evaluated in both the OSSE and real contexts, including adjoint-based estimates of observation impact. A preliminary application to the Aeolus Doppler wind lidar mission, scheduled for launch by the European Space Agency in 2014, has also been investigated.
Simulation-Based Approach for Site-Specific Optimization of Hydrokinetic Turbine Arrays
NASA Astrophysics Data System (ADS)
Sotiropoulos, F.; Chawdhary, S.; Yang, X.; Khosronejad, A.; Angelidis, D.
2014-12-01
A simulation-based approach has been developed to enable site-specific optimization of tidal and current turbine arrays in real-life waterways. The computational code is based on the St. Anthony Falls Laboratory Virtual StreamLab (VSL3D), which is able to carry out high-fidelity simulations of turbulent flow and sediment transport processes in rivers and streams taking into account the arbitrary geometrical complexity characterizing natural waterways. The computational framework can be used either in turbine-resolving mode, to take into account all geometrical details of the turbine, or with the turbines parameterized as actuator disks or actuator lines. Locally refined grids are employed to dramatically increase the resolution of the simulation and enable efficient simulations of multi-turbine arrays. Turbine/sediment interactions are simulated using the coupled hydro-morphodynamic module of VSL3D. The predictive capabilities of the resulting computational framework will be demonstrated by applying it to simulate turbulent flow past a tri-frame configuration of hydrokinetic turbines in a rigid-bed turbulent open channel flow as well as turbines mounted on mobile bed open channels to investigate turbine/sediment interactions. The utility of the simulation-based approach for guiding the optimal development of turbine arrays in real-life waterways will also be discussed and demonstrated. This work was supported by NSF grant IIP-1318201. Simulations were carried out at the Minnesota Supercomputing Institute.
A Step-by-Step Framework on Discrete Events Simulation in Emergency Department; A Systematic Review
Dehghani, Mahsa; Moftian, Nazila; Rezaei-Hachesu, Peyman; Samad-Soltani, Taha
2017-01-01
Objective: To systematically review the current literature of simulation in healthcare including the structured steps in the emergency healthcare sector by proposing a framework for simulation in the emergency department. Methods: For the purpose of collecting the data, PubMed and ACM databases were used between the years 2003 and 2013. The inclusion criteria were to select English-written articles available in full text with the closest objectives from among a total of 54 articles retrieved from the databases. Subsequently, 11 articles were selected for further analysis. Results: The studies focused on the reduction of waiting time and patient stay, optimization of resources allocation, creation of crisis and maximum demand scenarios, identification of overcrowding bottlenecks, investigation of the impact of other systems on the existing system, and improvement of the system operations and functions. Subsequently, 10 simulation steps were derived from the relevant studies after an expert’s evaluation. Conclusion: The 10-steps approach proposed on the basis of the selected studies provides simulation and planning specialists with a structured method for both analyzing problems and choosing best-case scenarios. Moreover, following this framework systematically enables the development of design processes as well as software implementation of simulation problems. PMID:28507994
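As a concrete companion to the 10-step framework, the sketch below shows a minimal discrete events simulation of an emergency department queue, written with the SimPy library (assumed available via pip install simpy); the arrival rate, treatment time, and staffing level are illustrative only, not drawn from the reviewed studies.

```python
# Minimal discrete event simulation of an emergency department queue of
# the kind the reviewed studies build, using the SimPy library (assumed
# installed). Rates and capacities are illustrative assumptions.
import random
import simpy

WAITS = []

def patient(env, doctors):
    arrival = env.now
    with doctors.request() as req:       # join the queue for a doctor
        yield req
        WAITS.append(env.now - arrival)  # record door-to-doctor wait
        yield env.timeout(random.expovariate(1 / 20.0))  # ~20 min treatment

def arrivals(env, doctors):
    while True:
        yield env.timeout(random.expovariate(1 / 8.0))   # ~8 min between arrivals
        env.process(patient(env, doctors))

random.seed(3)
env = simpy.Environment()
doctors = simpy.Resource(env, capacity=3)
env.process(arrivals(env, doctors))
env.run(until=12 * 60)                   # simulate a 12-hour shift (minutes)
print(f"{len(WAITS)} patients seen, mean wait {sum(WAITS)/len(WAITS):.1f} min")
```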
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei
2014-01-01
A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This presentation describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this presentation is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture.
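T-MATS itself is a MATLAB/Simulink toolbox; the Python sketch below only illustrates the core idea of pairing component models with a numerical iterative solver, using toy component relations that are not T-MATS models.

```python
import numpy as np
from scipy.optimize import fsolve

# Toy stand-in for an engine balance: solve for shaft speed N and fuel
# flow wf such that shaft power is balanced and a thrust target is met.
def residuals(x, thrust_target=1.0e4):
    n_shaft, w_fuel = x
    turbine_power = 3.0e4 * w_fuel             # crude turbine model
    compressor_power = 5.0e-3 * n_shaft ** 2   # crude compressor load
    thrust = 2.0 * n_shaft + 1.5e3 * w_fuel    # crude thrust model
    return [turbine_power - compressor_power,  # shaft power balance
            thrust - thrust_target]            # thrust target residual

n, wf = fsolve(residuals, x0=[1000.0, 1.0])
print(f"balanced speed {n:.0f}, fuel flow {wf:.3f}")
```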
A Qualitative Simulation Framework in Smalltalk Based on Fuzzy Arithmetic
Richard L. Olson; Daniel L. Schmoldt; David L. Peterson
1996-01-01
For many systems, it is not practical to collect and correlate empirical data necessary to formulate a mathematical model. However, it is often sufficient to predict qualitative dynamics effects (as opposed to system quantities), especially for research purposes. In this effort, an object-oriented application framework (AF) was developed for the qualitative modeling of...
A Program Structure for Event-Based Speech Synthesis by Rules within a Flexible Segmental Framework.
ERIC Educational Resources Information Center
Hill, David R.
1978-01-01
A program structure based on recently developed techniques for operating system simulation has the required flexibility for use as a speech synthesis algorithm research framework. This program makes synthesis possible with less rigid time and frequency-component structure than simpler schemes. It also meets real-time operation and memory-size…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brooks, Kriston P.; Sprik, Samuel J.; Tamburello, David A.
The U.S. Department of Energy (DOE) has developed a vehicle framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to DOE's Technical Targets using four drive cycles/profiles. Chemical hydrogen storage models have been developed for the framework model for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be easily estimated. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates the system parameters required to run the storage system model. Additionally, this design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the framework model and compare them to the DOE Technical Targets. These models will be explained and exercised with existing hydrogen storage materials.
A multi-fidelity framework for physics based rotor blade simulation and optimization
NASA Astrophysics Data System (ADS)
Collins, Kyle Brian
New helicopter rotor designs are desired that offer increased efficiency, reduced vibration, and reduced noise. Rotor designers in industry need methods that allow them to use the most accurate simulation tools available to search for these optimal designs. Computer-based rotor analysis and optimization have been advanced by the development of industry standard codes known as "comprehensive" rotorcraft analysis tools. These tools typically use table look-up aerodynamics and simplified inflow models, and they perform aeroelastic analysis using Computational Structural Dynamics (CSD). Due to the simplified aerodynamics, most design studies are performed varying structure-related design variables such as sectional mass and stiffness. The optimization of shape-related variables in forward flight using these tools is complicated, and results are viewed with skepticism because rotor blade loads are not accurately predicted. The most accurate methods of rotor simulation utilize Computational Fluid Dynamics (CFD) but have historically been considered too computationally intensive to be used in computer-based optimization, where numerous simulations are required. An approach is needed where high fidelity CFD rotor analysis can be utilized in a shape-variable optimization problem with multiple objectives. Any approach should be capable of working in forward flight in addition to hover. An alternative is proposed and founded on the idea that efficient hybrid CFD methods of rotor analysis are ready to be used in preliminary design. In addition, the proposed approach recognizes the usefulness of lower fidelity physics based analysis and surrogate modeling. Together, they are used with high fidelity analysis in an intelligent process of surrogate model building of parameters in the high fidelity domain. Closing the loop between high and low fidelity analysis is a key aspect of the proposed approach. This is done by using information from higher fidelity analysis to improve predictions made with lower fidelity models. This thesis documents the development of automated low and high fidelity physics based rotor simulation frameworks. The low fidelity framework uses a comprehensive code with simplified aerodynamics. The high fidelity model uses a parallel processor capable CFD/CSD methodology. Both low and high fidelity frameworks include an aeroacoustic simulation for prediction of noise. A synergistic process is developed that uses both the low and high fidelity frameworks together to build approximate models of important high fidelity metrics as functions of certain design variables. To test the process, a 4-bladed hingeless rotor model is used as a baseline. The design variables investigated include tip geometry and spanwise twist distribution. Approximation models are built for metrics related to rotor efficiency and vibration using the results from 60+ high fidelity (CFD/CSD) experiments and 400+ low fidelity experiments. Optimization using the approximation models found the Pareto Frontier anchor points, or the design having maximum rotor efficiency and the design having minimum vibration. Various Pareto generation methods are used to find designs on the frontier between these two anchor designs. When tested in the high fidelity framework, the Pareto anchor designs are shown to be very good designs when compared with other designs from the high fidelity database. This provides evidence that the process proposed has merit.
Ultimately, this process can be utilized by industry rotor designers with their existing tools to bring high fidelity analysis into the preliminary design stage of rotors. In conclusion, the methods developed and documented in this thesis have made several novel contributions. First, an automated high fidelity CFD based forward flight simulation framework has been built for use in preliminary design optimization. The framework was built around an integrated, parallel processor capable CFD/CSD/AA process. Second, a novel method of building approximate models of high fidelity parameters has been developed. The method uses a combination of low and high fidelity results and combines Design of Experiments, statistical effects analysis, and aspects of approximation model management. And third, the determination of rotor blade shape variables through optimization using CFD based analysis in forward flight has been performed. This was done using the high fidelity CFD/CSD/AA framework and method mentioned above. While the low and high fidelity prediction methods used in the work still have inaccuracies that can affect the absolute levels of the results, a framework has been successfully developed and demonstrated that allows for an efficient process to improve rotor blade designs in terms of a selected choice of objective function(s). Using engineering judgment, this methodology could be applied today to investigate opportunities to improve existing designs. With improvements in the low and high fidelity prediction components that will certainly occur, this framework could become a powerful tool for future rotorcraft design work. (Abstract shortened by UMI.)
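A minimal sketch of the low/high-fidelity surrogate correction idea described above, with toy one-dimensional stand-ins for the comprehensive code and the CFD/CSD analysis; all functions, sample counts, and polynomial degrees are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def lo_fi(x):   # cheap comprehensive-code stand-in
    return np.sin(2 * x) + 0.2 * x

def hi_fi(x):   # expensive CFD/CSD stand-in: lo-fi plus a systematic bias
    return lo_fi(x) + 0.3 * x ** 2 - 0.1

x_lo = rng.uniform(0, 2, 400)           # many cheap runs
x_hi = np.linspace(0, 2, 8)             # few expensive runs

# Fit a surrogate to the cheap data, then model the high-minus-low
# discrepancy with the handful of expensive samples (additive correction).
surr_lo = np.polynomial.Polynomial.fit(x_lo, lo_fi(x_lo), deg=6)
disc = np.polynomial.Polynomial.fit(x_hi, hi_fi(x_hi) - surr_lo(x_hi), deg=2)

def surrogate(x):                       # corrected multi-fidelity prediction
    return surr_lo(x) + disc(x)

x_test = np.linspace(0, 2, 50)
print("max |error|:", np.abs(surrogate(x_test) - hi_fi(x_test)).max())
```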
NASA Astrophysics Data System (ADS)
Gao, Xiatian; Wang, Xiaogang; Jiang, Binhao
2017-10-01
UPSF (Universal Plasma Simulation Framework) is a new plasma simulation code designed for maximum flexibility using cutting-edge techniques supported by the C++17 standard. Through the use of metaprogramming, UPSF provides arbitrary-dimensional data structures and methods to support various kinds of plasma simulation models, such as Vlasov, particle-in-cell (PIC), fluid, and Fokker-Planck models, together with their variants and hybrid methods. Because C++ metaprogramming resolves dimensionality at compile time, a single code base can be applied to systems of arbitrary dimension with no loss of performance. UPSF can also automatically parallelize the distributed data structures and accelerate matrix and tensor operations through BLAS. A three-dimensional particle-in-cell code has been developed based on UPSF. Two test cases, Landau damping and the Weibel instability for the electrostatic and electromagnetic cases respectively, are presented to demonstrate the validity and performance of the UPSF code.
A novel medical image data-based multi-physics simulation platform for computational life sciences.
Neufeld, Esra; Szczerba, Dominik; Chavannes, Nicolas; Kuster, Niels
2013-04-06
Simulating and modelling complex biological systems in computational life sciences requires specialized software tools that can perform medical image data-based modelling, jointly visualize the data and computational results, and handle large, complex, realistic and often noisy anatomical models. The required novel solvers must provide the power to model the physics, biology and physiology of living tissue within the full complexity of the human anatomy (e.g. neuronal activity, perfusion and ultrasound propagation). A multi-physics simulation platform satisfying these requirements has been developed for applications including device development and optimization, safety assessment, basic research, and treatment planning. This simulation platform consists of detailed, parametrized anatomical models, a segmentation and meshing tool, a wide range of solvers and optimizers, a framework for the rapid development of specialized and parallelized finite element method solvers, a visualization toolkit-based visualization engine, a Python scripting interface for customized applications, a coupling framework, and more. Core components are cross-platform compatible and use open formats. Several examples of applications are presented: hyperthermia cancer treatment planning, tumour growth modelling, evaluating the magneto-haemodynamic effect as a biomarker and physics-based morphing of anatomical models.
NASA Technical Reports Server (NTRS)
Liever, Peter A.; West, Jeffrey S.; Harris, Robert E.
2016-01-01
A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed for launch vehicle liftoff acoustic environment predictions. The framework couples the existing highly-scalable NASA production CFD code, Loci/CHEM, with a high-order accurate Discontinuous Galerkin solver developed in the same production framework, Loci/THRUST, to accurately resolve and propagate acoustic physics across the entire launch environment. Time-accurate, Hybrid RANS/LES CFD modeling is applied for predicting the acoustic generation physics at the plume source, and a high-order accurate unstructured mesh Discontinuous Galerkin (DG) method is employed to propagate acoustic waves away from the source across large distances using high-order accurate schemes. The DG solver is capable of solving 2nd, 3rd, and 4th order Euler solutions for non-linear, conservative acoustic field propagation. Initial application testing and validation has been carried out against high resolution acoustic data from the Ares Scale Model Acoustic Test (ASMAT) series to evaluate the capabilities and production readiness of the CFD/CAA system to resolve the observed spectrum of acoustic frequency content. This paper presents results from this validation and outlines efforts to mature and improve the computational simulation framework.
A Level-set based framework for viscous simulation of particle-laden supersonic flows
NASA Astrophysics Data System (ADS)
Das, Pratik; Sen, Oishik; Jacobs, Gustaaf; Udaykumar, H. S.
2017-06-01
Particle-laden supersonic flows are important in natural and industrial processes such as volcanic eruptions, explosions, and pneumatic conveyance of particles in material processing. Numerical study of such high-speed particle-laden flows at the mesoscale calls for a numerical framework that allows simulation of supersonic flow around multiple moving solid objects. Only a few efforts have been made toward the development of numerical frameworks for viscous simulation of particle-fluid interaction in the supersonic flow regime. The current work presents a Cartesian grid based sharp-interface method for viscous simulations of the interaction between supersonic flows and moving rigid particles. The no-slip boundary condition is imposed at the solid-fluid interfaces using a modified ghost fluid method (GFM). The current method is validated against the similarity solution of the compressible boundary layer over a flat plate and benchmark numerical solutions for steady supersonic flow over a cylinder. Further validation is carried out against benchmark numerical results for shock-induced lift-off of a cylinder in a shock tube. A 3D simulation of steady supersonic flow over a sphere is performed to compare the numerically obtained drag coefficient with experimental results. A particle-resolved viscous simulation of shock interaction with a cloud of particles is performed to demonstrate that the current method is suitable for large-scale particle-resolved simulations of particle-laden supersonic flows.
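The fragment below sketches only the ghost-cell mirroring idea behind a no-slip ghost fluid method in one dimension; the actual method in the paper also reconstructs an image point by interpolation along the interface normal, which is omitted here.

```python
import numpy as np

def no_slip_ghost(u_image, u_wall=0.0):
    """Ghost-cell velocity for a no-slip wall, GFM-style mirroring:
    reflect the image-point velocity about the boundary value so the
    interpolated velocity at the interface equals u_wall."""
    return 2.0 * u_wall - u_image

# 1-D illustration: fluid cells to the right of a stationary wall at x = 0.
u_fluid = np.array([0.4, 0.7, 0.9])     # interior velocities
u_ghost = no_slip_ghost(u_fluid[0])     # mirror the first interior cell
print(u_ghost)                          # -0.4, giving zero velocity at the wall
```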
Multiscale Simulations of Magnetic Island Coalescence
NASA Technical Reports Server (NTRS)
Dorelli, John C.
2010-01-01
We describe a new interactive parallel Adaptive Mesh Refinement (AMR) framework written in the Python programming language. This new framework, PyAMR, hides the details of parallel AMR data structures and algorithms (e.g., domain decomposition, grid partitioning, and inter-process communication), allowing the user to focus on developing algorithms for advancing the solution of a system of partial differential equations on a single uniform mesh. We demonstrate the use of PyAMR by simulating the pairwise coalescence of magnetic islands using the resistive Hall MHD equations. Techniques for coupling different physics models on different levels of the AMR grid hierarchy are discussed.
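Since the abstract does not document PyAMR's API, the sketch below is purely hypothetical (the patch class and loop structure are invented); it only illustrates the division of labor described: the user supplies a single-uniform-mesh update while the framework owns decomposition, communication, and regridding.

```python
import numpy as np

def update(state, dt):
    """User-supplied single-uniform-mesh update; the resistive Hall MHD
    step would go here. Placeholder advection-like dynamics only."""
    return state - dt * np.gradient(state)

class UniformPatch:            # hypothetical stand-in for one AMR patch
    def __init__(self, data):
        self.data = data

# The framework would create these patches by domain decomposition and
# run the loop in parallel with ghost-cell exchange and regridding.
patches = [UniformPatch(np.random.rand(32)) for _ in range(4)]
for step in range(10):
    for p in patches:
        p.data = update(p.data, 1e-3)
```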
NASA Astrophysics Data System (ADS)
Peter, Jörg; Semmler, Wolfhard
2007-10-01
Alongside and in part motivated by recent advances in molecular diagnostics, the development of dual-modality instruments for patient and dedicated small-animal imaging has gained attention among diverse research groups. The desire for such systems is high, not only to link molecular or functional information with anatomical structures but also to detect multiple molecular events simultaneously at shorter total acquisition times. While PET and SPECT have been integrated successfully with X-ray CT, the advance of optical imaging approaches (OT) and their integration into existing modalities carry a high application potential, particularly for imaging small animals. A multi-modality Monte Carlo (MC) simulation approach has been developed that is able to trace high-energy (keV) as well as optical (eV) photons concurrently within identical phantom representation models. We show that the two ray-tracing approaches for keV and eV photons can be integrated into a single simulation framework that enables both photon classes to be propagated through various geometry models representing both phantoms and scanners. The main advantage of such an integrated framework for our specific application is the investigation of novel tomographic multi-modality instrumentation intended for in vivo small-animal imaging through time-resolved MC simulation upon identical phantom geometries. Design examples are provided for recently proposed SPECT-OT and PET-OT imaging systems.
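The common core of tracing both photon classes is sampling the distance to the next interaction from an exponential law; a minimal sketch with illustrative attenuation coefficients (not the paper's values) follows.

```python
import numpy as np

rng = np.random.default_rng(42)

def free_path(mu_total):
    """Sample a photon's distance to its next interaction (cm) in a medium
    with total attenuation coefficient mu_total (1/cm). The same sampling
    rule serves keV and optical (eV) photons; only the coefficients and
    the interaction physics differ between the two photon classes."""
    return -np.log(rng.random()) / mu_total

paths_kev = [free_path(0.2) for _ in range(100_000)]   # illustrative keV regime
paths_opt = [free_path(10.0) for _ in range(100_000)]  # strongly scattering optical regime
print(np.mean(paths_kev), np.mean(paths_opt))          # sample means approach 1/mu
```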
Numerical simulation of the fracture process in ceramic FPD frameworks caused by oblique loading.
Kou, Wen; Qiao, Jiyan; Chen, Li; Ding, Yansheng; Sjögren, Göran
2015-10-01
Using a newly developed three-dimensional (3D) numerical modeling code, an analysis was performed of the fracture behavior in a three-unit ceramic-based fixed partial denture (FPD) framework subjected to oblique loading. All the materials in the study were treated heterogeneously; Weibull's distribution law was applied to the description of the heterogeneity. The Mohr-Coulomb failure criterion with tensile strength cut-off was utilized in judging whether the material was in an elastic or failed state. The simulated loading area was placed either on the buccal or the lingual cusp of a premolar-shaped pontic with the loading direction at 30°, 45°, 60°, 75° or 90° angles to the occlusal surface. The stress distribution, fracture initiation and propagation in the framework during the loading and fracture process were analyzed. This numerical simulation allowed the cause of the framework fracture to be identified as tensile stress failure. The decisive fracture was initiated in the gingival embrasure of the pontic, regardless of whether the buccal or lingual cusp of the pontic was loaded. The stress distribution and fracture propagation process of the framework could be followed step by step from beginning to end. The bearing capacity and the rigidity of the framework vary with the loading position and direction. The framework loaded with 90° towards the occlusal surface has the highest bearing capacity and the greatest rigidity. The framework loaded with 30° towards the occlusal surface has the least rigidity indicating that oblique loading has a major impact on the fracture of ceramic frameworks. Copyright © 2015 Elsevier Ltd. All rights reserved.
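A minimal sketch of the two ingredients named above, Weibull-distributed element strengths and a Mohr-Coulomb check with tension cut-off, using illustrative parameters rather than the paper's calibration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Heterogeneous strength field: element tensile strengths drawn from a
# Weibull distribution (illustrative modulus and scale, in MPa).
m, s0 = 10.0, 120.0
tensile_strength = s0 * rng.weibull(m, size=10_000)

def fails(sigma_n, tau, strength, cohesion=80.0, phi_deg=30.0):
    """Mohr-Coulomb with tension cut-off (tensile stress positive): an
    element fails in shear if |tau| exceeds c + sigma_n * tan(phi), or in
    tension if sigma_n exceeds its local Weibull-sampled strength."""
    shear = abs(tau) > cohesion + sigma_n * np.tan(np.radians(phi_deg))
    tension = sigma_n > strength
    return shear | tension

failed = fails(sigma_n=100.0, tau=60.0, strength=tensile_strength)
print(f"failed element fraction: {failed.mean():.3f}")
```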
NASA Astrophysics Data System (ADS)
Nguyen, Tien M.; Guillen, Andy T.
2017-05-01
This paper describes static Bayesian game models with "Pure" and "Mixed" games for the development of an optimum Program and Technical Baseline (PTB) solution for affordable acquisition of future space systems. The paper discusses System Engineering (SE) frameworks and analytical and simulation modeling approaches for developing the optimum PTB solutions from both the government and contractor perspectives.
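For a concrete flavor of the "Mixed" game computation, the sketch below solves a toy 2x2 bimatrix game via the standard indifference conditions; the payoffs are invented and unrelated to the paper's PTB models.

```python
import numpy as np

# Illustrative 2x2 acquisition game: rows = government options,
# columns = contractor bids (made-up payoffs).
G = np.array([[3.0, 1.0],    # government payoffs
              [0.0, 2.0]])
C = np.array([[2.0, 3.0],    # contractor payoffs
              [4.0, 1.0]])

# Mixed equilibrium from indifference: the column player's probability q
# on column 0 makes the row player indifferent between rows, and the row
# player's probability p on row 0 makes the column player indifferent.
q = (G[1, 1] - G[0, 1]) / (G[0, 0] - G[0, 1] - G[1, 0] + G[1, 1])
p = (C[1, 1] - C[1, 0]) / (C[0, 0] - C[0, 1] - C[1, 0] + C[1, 1])
print(f"row plays 0 w.p. {p:.2f}, column plays 0 w.p. {q:.2f}")
```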
NASA Astrophysics Data System (ADS)
Nayak, Kapileswar; Das, Sushanta; Nanavati, Hemant
2008-01-01
We present a framework for the development of elasticity and photoelasticity relationships for polyethylene terephthalate fiber networks, incorporating aspects of the primary molecular structure. Semicrystalline polymeric fiber networks are modeled as sequentially arranged crystalline and amorphous regions. Rotational isomeric states-Monte Carlo simulations of amorphous chains of up to 360 bonds (degree of polymerization, DP =60), confined between and bridging infinite impenetrable crystalline walls, have been characterized by Ω, the probability density of the intercrystal separation h, and Δβ, the polarizability anisotropy. lnΩ and Δβ have been modeled as functions of h, yielding the chain deformation relationships. The development has been extended to the fiber network to yield the photoelasticity relationships. We execute our framework by fitting to experimental stress-elongation data and employing the single fitted parameter to directly predict the birefringence-elongation behavior, without any further fitting. Incorporating the effect of strain-induced crystallization into the framework makes it physically more meaningful and yields accurate predictions of the birefringence-elongation behavior.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aldemir, Tunc; Denning, Richard; Catalyurek, Umit
Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.
Simulation in interprofessional education for patient-centred collaborative care.
Baker, Cynthia; Pulling, Cheryl; McGraw, Robert; Dagnone, Jeffrey Damon; Hopkins-Rosseel, Diana; Medves, Jennifer
2008-11-01
This paper is a report of preliminary evaluations of an interprofessional education through simulation project, focusing on learner and teacher reactions to the pilot modules. Approaches to interprofessional education vary widely. Studies indicate, however, that active, experiential learning facilitates it. Patient simulators require learners to incorporate knowing, being and doing in action. A theoretically based competency framework was developed to guide interprofessional education using simulation. The framework includes a typology of shared, complementary and profession-specific competencies. Each competency type is associated with an intraprofessional, multiprofessional, or interprofessional teaching modality and with the professional composition of learner groups. The project is guided by an action research approach in which ongoing evaluation generates knowledge to modify and further develop it. Preliminary evaluations of the first pilot module, cardiac resuscitation rounds, among 101 nursing students, 42 medical students and 70 junior medical residents were conducted in 2005-2007 using a questionnaire with rating scales and open-ended questions. Another 20 medical students, 7 junior residents and 45 nursing students completed a questionnaire based on the Interdisciplinary Education Perception Scale. Simulation-based learning provided students with interprofessional activities they saw as relevant for their future as practitioners. They embraced both the interprofessional and simulation components enthusiastically. Attitudinal scores and responses were consistently positive among both medical and nursing students. Interprofessional education through simulation offers a promising approach to preparing future healthcare professionals for the collaborative models of healthcare delivery being developed internationally.
Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.
2016-01-01
The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
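A toy stand-in for the coupling loop: an optimizer driving a black-box "simulation" whose competing objectives are scalarized. Here scipy.optimize substitutes for DAKOTA and a two-line function for MF-OWHM, purely for illustration; all coefficients are invented.

```python
import numpy as np
from scipy.optimize import minimize

def run_simulation(areas):
    """Toy 'simulation': maps a crop portfolio (area fractions) to
    profit and groundwater drawdown with made-up coefficients."""
    profit = np.dot([400.0, 250.0, 100.0], areas)   # $/ha by crop
    drawdown = np.dot([1.2, 0.6, 0.1], areas)       # m of head decline
    return profit, drawdown

def objective(areas, weight=200.0):
    profit, drawdown = run_simulation(areas)
    return -(profit - weight * drawdown)  # weighted scalarization of objectives

cons = ({"type": "eq", "fun": lambda a: a.sum() - 1.0},)  # areas sum to 1
res = minimize(objective, x0=np.full(3, 1 / 3), bounds=[(0, 1)] * 3,
               constraints=cons, method="SLSQP")
print("optimal crop mix:", np.round(res.x, 3))
```

Sweeping the `weight` parameter traces out trade-offs between user classes, analogous to the competing-objective case studies described above.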
Methodology for Evaluation of Technology Impacts in Space Electric Power Systems
NASA Technical Reports Server (NTRS)
Holda, Julie
2004-01-01
The Analysis and Management branch of the Power and Propulsion Office at NASA Glenn Research Center is responsible for performing complex analyses of the space power and in-space propulsion products developed by GRC. This work quantifies the benefits of the advanced technologies to support on-going advocacy efforts. The Power and Propulsion Office is committed to understanding how advancements in space technologies could benefit future NASA missions. It supports many diverse projects and missions throughout NASA as well as industry and academia. The area of work that we are concentrating on is space technology investment strategies. Our goal is to develop a Monte Carlo-based tool to investigate technology impacts in space electric power systems. The framework is being developed at this stage, and it will be used to set up a computer simulation of a space electric power system (EPS). The outcome is expected to be a probabilistic assessment of critical technologies and potential development issues. We are developing methods for integrating existing spreadsheet-based tools into the simulation tool, and work is being done on defining interface protocols to enable rapid integration of future tools. The first task was to evaluate Monte Carlo-based simulation programs for statistical modeling of the EPS model: I learned and evaluated Palisade's @Risk and Risk Optimizer software and utilized their capabilities for the EPS model, and I also evaluated similar software packages (JMP, SPSS, Crystal Ball, VenSim, Analytica) available from other suppliers. The second task was to develop the framework for the tool, in which we had to define technology characteristics using weighting factors and probability distributions, define the simulation space, and add hard and soft constraints to the model. The third task is to incorporate (preliminary) cost factors into the model. A final task is developing a cross-platform solution of this framework.
Jung, Joon-Hee
2016-10-11
The global atmospheric models based on the Multi-scale Modeling Framework (MMF) are able to explicitly resolve subgrid-scale processes by using embedded 2-D Cloud-Resolving Models (CRMs). Up to now, however, those models do not include the orographic effects on the CRM grid scale. This study shows that the effects of CRM grid-scale orography can be simulated reasonably well by the Quasi-3-D MMF (Q3D MMF), which has been developed as a second-generation MMF. In the Q3D framework, the surface topography can be included in the CRM component by using a block representation of the mountains, so that no smoothing of the topographic height is necessary. To demonstrate the performance of such a model, the orographic effects over a steep mountain are simulated in an idealized experimental setup with each of the Q3D MMF and the full 3-D CRM. The latter is used as a benchmark. Comparison of the results shows that the Q3D MMF is able to reproduce the horizontal distribution of orographic precipitation and the flow changes around mountains as simulated by the 3-D CRM, even though the embedded CRMs of the Q3D MMF recognize only some aspects of the complex 3-D topography. It is also shown that the use of 3-D CRMs in the Q3D framework, rather than 2-D CRMs, has positive impacts on the simulation of wind fields but does not substantially change the simulated precipitation.
OpenDanubia - An integrated, modular simulation system to support regional water resource management
NASA Astrophysics Data System (ADS)
Muerth, M.; Waldmann, D.; Heinzeller, C.; Hennicker, R.; Mauser, W.
2012-04-01
The now-completed, multi-disciplinary research project GLOWA-Danube developed a regional-scale, integrated modeling system, which was successfully applied to the 77,000 km2 Upper Danube basin to investigate the impact of Global Change on both the natural and anthropogenic water cycle. At the end of the last project phase, the integrated modeling system was transferred into the open source project OpenDanubia, which now provides both the core system and all major model components to the general public. First, this will enable decision makers from government, business and management to use OpenDanubia as a tool for proactive management of water resources in the context of global change. Second, the model framework supporting integrated simulations, together with all simulation models developed for OpenDanubia within the scope of GLOWA-Danube, remains available for future developments and research questions. OpenDanubia allows for the investigation of water-related scenarios considering different ecological and economic aspects to support both scientists and policy makers in designing policies for sustainable environmental management. OpenDanubia is designed as a framework-based, distributed system. The model system couples spatially distributed physical and socio-economic processes at run-time, taking into account their mutual influence. To simulate the potential future impacts of Global Change on agriculture, industrial production, water supply, households and tourism businesses, so-called deep actor models are implemented in OpenDanubia. All important water-related fluxes and storages in the natural environment are implemented in OpenDanubia as spatially explicit, process-based modules. This includes the land surface water and energy balance, dynamic plant water uptake, groundwater recharge and flow, as well as river routing and reservoirs. Although the complete system is relatively demanding in terms of data and hardware requirements, the modular structure and the generic core system (Core Framework, Actor Framework) allow application in new regions and the selection of a reduced number of modules for simulation. As part of the Open Source Initiative in GLOWA-Danube (opendanubia.glowa-danube.de), comprehensive documentation for the system installation was created, and both the program code of the framework and that of all major components are licensed under the GNU General Public License. In addition, helpful programs and scripts necessary for the operation and processing of input and result data sets are provided.
Framework for incorporating simulation into urology training.
Arora, Sonal; Lamb, Benjamin; Undre, Shabnam; Kneebone, Roger; Darzi, Ara; Sevdalis, Nick
2011-03-01
• Changes to working hours, new technologies and increased accountability have created the need for alternative training environments for urologists.
• Simulation offers a promising arena for learning to take place in a safe, realistic setting.
• Despite its benefits, the incorporation of simulation into urological training programmes remains minimal.
• The current status and future directions of simulation for training in technical and non-technical skills are reviewed as they pertain to urology.
• A framework is presented for how simulation-based training could be incorporated into the entire urological curriculum.
• The literature on simulation in technical and non-technical skills training is reviewed, with a specific focus upon urology.
• To fully integrate simulation into a training curriculum, its possibilities for addressing all the competencies required by a urologist must be realized.
• At an early stage of training, simulation has been used to develop basic technical skills and cognitive skills, such as decision-making and communication.
• At an intermediate stage, the studies focus upon more advanced technical skills learnt with virtual reality simulators.
• Non-technical skills training would include leadership and could be delivered with in situ models.
• At the final stage, experienced trainees can practise technical and non-technical skills in full crisis simulations situated within a fully simulated operating room.
• Simulation can provide training in the technical and non-technical skills required to be a competent urologist.
• The framework presented may guide how best to incorporate simulation into training curricula.
• Future work should determine whether acquired skills transfer to clinical practice and improve patient care.
© 2010 THE AUTHORS. BJU INTERNATIONAL © 2010 BJU INTERNATIONAL.
NASA Astrophysics Data System (ADS)
Lyu, Pin; Chen, Wenli; Li, Hui; Shen, Lian
2017-11-01
In recent studies, Yang, Meneveau & Shen (Physics of Fluids, 2014; Renewable Energy, 2014) developed a hybrid numerical framework for the simulation of offshore wind farms. The framework consists of simulation of nonlinear surface waves using a high-order spectral method, large-eddy simulation of wind turbulence on a wave-surface-fitted curvilinear grid, and an actuator disk model for wind turbines. In the present study, several more precise wind turbine models, including the actuator line model, the actuator disk model with rotation, and a nacelle model, are introduced into the computation. Besides offshore wind turbines on fixed piles, the new computational framework has the capability to investigate the interaction among wind, waves, and floating wind turbines. In this study, onshore, offshore fixed-pile, and offshore floating wind farms are compared in terms of flow field statistics and wind turbine power extraction rate. The authors gratefully acknowledge financial support from the China Scholarship Council (No. 201606120186) and the Institute on the Environment of the University of Minnesota.
Automatising the analysis of stochastic biochemical time-series
2015-01-01
Background Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
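A minimal example of the kind of ensemble statistics such a tool automates, computed over a mock ensemble of stochastic simulation time-series; the synthetic data and decay model are illustrative and not tied to the tool's actual API.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy ensemble of stochastic time-series (e.g. species counts from
# repeated stochastic simulation runs); rows = runs, columns = time points.
t = np.linspace(0.0, 10.0, 101)
ensemble = np.array([50 * np.exp(-0.3 * t) + rng.normal(0, 2, t.size)
                     for _ in range(200)])

mean = ensemble.mean(axis=0)                  # ensemble mean trajectory
sd = ensemble.std(axis=0, ddof=1)             # spread across runs
q05, q95 = np.quantile(ensemble, [0.05, 0.95], axis=0)  # prediction band

print(f"t=5: mean {mean[50]:.1f}, sd {sd[50]:.1f}, "
      f"90% band [{q05[50]:.1f}, {q95[50]:.1f}]")
```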
Laskowski, Marek; Demianyk, Bryan C P; Witt, Julia; Mukhi, Shamir N; Friesen, Marcia R; McLeod, Robert D
2011-11-01
The objective of this paper was to develop an agent-based modeling framework in order to simulate the spread of influenza virus infection on a layout based on a representative hospital emergency department in Winnipeg, Canada. In doing so, the study complements mathematical modeling techniques for disease spread, as well as modeling applications focused on the spread of antibiotic-resistant nosocomial infections in hospitals. Twenty different emergency department scenarios were simulated, with further simulation of four infection control strategies. The agent-based modeling approach represents systems modeling, in which the emergency department was modeled as a collection of agents (patients and healthcare workers) and their individual characteristics, behaviors, and interactions. The framework was coded in C++ using Qt4 libraries running under the Linux operating system. A simple ordinary least squares (OLS) regression was used to analyze the data, in which the percentage of patients that became infected in one day within the simulation was the dependent variable. The results suggest that within the given instance context, patient-oriented infection control policies (alternate treatment streams, masking symptomatic patients) tend to have a larger effect than policies that target healthcare workers. The agent-based modeling framework is a flexible tool that can be made to reflect any given environment; it is also a decision support tool for practitioners and policymakers to assess the relative impact of infection control strategies. The framework illuminates scenarios worthy of further investigation, as well as counterintuitive findings.
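The analysis step described above is plain OLS; a sketch with a mock scenario table (the indicator variables and effect sizes are invented, not the study's data) shows the shape of the computation.

```python
import numpy as np

rng = np.random.default_rng(11)

# Mock design matrix: each row is one simulated day; columns are an
# intercept and indicators for two patient-oriented control policies.
n = 200
X = np.column_stack([
    np.ones(n),                # intercept
    rng.integers(0, 2, n),     # masking symptomatic patients
    rng.integers(0, 2, n),     # alternate treatment streams
])
# Dependent variable: percent of patients infected that day (synthetic).
y = 8.0 - 2.5 * X[:, 1] - 3.0 * X[:, 2] + rng.normal(0, 1, n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares fit
print("intercept, masking effect, streaming effect:", np.round(beta, 2))
```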
NASA Astrophysics Data System (ADS)
Löwe, Roland; Urich, Christian; Sto. Domingo, Nina; Mark, Ole; Deletic, Ana; Arnbjerg-Nielsen, Karsten
2017-07-01
We present a new framework for flexible testing of flood risk adaptation strategies in a variety of urban development and climate scenarios. This framework couples the 1D-2D hydrodynamic simulation package MIKE FLOOD with the agent-based urban development model DAnCE4Water and provides the possibility to systematically test various flood risk adaptation measures, ranging from large infrastructure changes over decentralised water management to urban planning policies. We have tested the framework in a case study in Melbourne, Australia, considering 9 scenarios for urban development and climate and 32 potential combinations of flood adaptation measures. We found that the performance of adaptation measures strongly depended on the considered climate and urban development scenario and on the other adaptation measures implemented, suggesting that adaptive strategies are preferable to one-off investments. Urban planning policies proved to be an efficient means for the reduction of flood risk, while implementing property buyback and pipe increases in a guideline-oriented manner was too costly. Random variations in the location and timing of urban development could have a significant impact on flood risk and would in some cases outweigh the benefits of less efficient adaptation strategies. The results of our setup can serve as an input for robust decision making frameworks and thus support the identification of flood risk adaptation measures that are economically efficient and robust to variations of climate and urban layout.
Molecular simulation of gas adsorption and diffusion in a breathing MOF using a rigid force field.
García-Pérez, E; Serra-Crespo, P; Hamad, S; Kapteijn, F; Gascon, J
2014-08-14
Simulation of gas adsorption in flexible porous materials is still limited by the slow progress in the development of flexible force fields. Moreover, the high computational cost of such flexible force fields may be a drawback even when they are fully developed. In this work, molecular simulations of gas adsorption and diffusion of carbon dioxide and methane in NH2-MIL-53(Al) are carried out using a linear combination of two crystallographic structures with rigid force fields. Once the interactions between carbon dioxide molecules and the bridging hydroxyl groups of the framework are optimized, an excellent match is found between simulations and experimental data for the adsorption of methane and carbon dioxide, including the stepwise uptake due to the breathing effect. In addition, diffusivities of the pure components are calculated. The pore expansion caused by the breathing effect influences the self-diffusion mechanism, and much higher diffusivities are observed at relatively high adsorbate loadings. This work demonstrates that a rigid force field, combined with a minimal number of experiments, reproduces the adsorption and diffusion of carbon dioxide and methane in the flexible metal-organic framework NH2-MIL-53(Al).
NASA Astrophysics Data System (ADS)
Trindade, B. C.; Reed, P. M.
2017-12-01
The growing access to and reduced cost of computing power in recent years have promoted the rapid development and application of multi-objective water supply portfolio planning. As this trend continues, there is a pressing need for flexible risk-based simulation frameworks and improved algorithm benchmarking for emerging classes of water supply planning and management problems. This work contributes the Water Utilities Management and Planning (WUMP) model: a generalizable and open source simulation framework designed to capture how water utilities can minimize operational and financial risks by regionally coordinating planning and management choices, i.e., making more efficient and coordinated use of restrictions, water transfers and financial hedging combined with possible construction of new infrastructure. We introduce the WUMP simulation framework as part of a new multi-objective benchmark problem for the planning and management of regionally integrated water utility companies. In this problem, a group of fictitious water utilities seek to balance the use of these reliability-driven actions (e.g., restrictions, water transfers and infrastructure pathways) against their inherent financial risks. Several traits make this problem ideal as a benchmark, namely the presence of (1) strong non-linearities and discontinuities in the Pareto front caused by the step-wise nature of the decision-making formulation and by the abrupt addition of storage through infrastructure construction, (2) noise due to the stochastic nature of the streamflows and water demands, and (3) non-separability resulting from the cooperative formulation of the problem, in which decisions made by one stakeholder may substantially impact others. Both the open source WUMP simulation framework and its demonstration in a challenging benchmarking example hold value for promoting broader advances in urban water supply portfolio planning for regions confronting change.
Theoretical Grounds for the Propagation of Uncertainties in Monte Carlo Particle Transport
NASA Astrophysics Data System (ADS)
Saracco, Paolo; Pia, Maria Grazia; Batic, Matej
2014-04-01
We introduce a theoretical framework for the calculation of uncertainties affecting observables produced by Monte Carlo particle transport, which derive from uncertainties in physical parameters input into simulation. The theoretical developments are complemented by a heuristic application, which illustrates the method of calculation in a streamlined simulation environment.
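The paper's contribution is an analytic framework, but the problem it addresses can be illustrated by the brute-force alternative: sampling the uncertain input parameter and rerunning a toy Monte Carlo transport calculation. All values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

def transport_observable(mu, n_particles=20_000):
    """Toy MC 'transport': fraction of particles penetrating a slab of
    unit thickness for attenuation coefficient mu (stand-in observable)."""
    depths = rng.exponential(1.0 / mu, n_particles)
    return np.mean(depths > 1.0)

# Propagate an assumed 5% relative uncertainty in mu through repeated runs.
mu_samples = rng.normal(1.0, 0.05, 50)
obs = np.array([transport_observable(m) for m in mu_samples])
print(f"observable = {obs.mean():.4f} +/- {obs.std(ddof=1):.4f}")
```

The spread of the observable mixes this parametric uncertainty with the MC statistical noise, which is exactly the entanglement an analytic treatment must sort out.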
Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo
2018-01-01
It is likely that computer simulations will assume a greater role in the near future in investigating and understanding reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) represent a method of investigation of social phenomena that blends the knowledge of the social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents represents one of the most fragile aspects, and it is indeed crucial to ground such models in well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: it is argued that this framework might be more helpful than others for developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to applying the model proposed by the TPB inside computer simulations and suggests potential solutions, with the hope of helping to shorten the distance between the fields of psychology and computer science.
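A minimal sketch of how the TPB might map onto an agent's decision rule; the linear combination follows the theory's structure, but the weights and the gating of behavior by perceived control are illustrative choices (empirically, weights would be estimated from survey data).

```python
import numpy as np

rng = np.random.default_rng(9)

def tpb_intention(attitude, norm, pbc, w=(0.4, 0.3, 0.3)):
    """TPB-style intention score: weighted combination of attitude,
    subjective norm, and perceived behavioral control (PBC).
    Weights are illustrative, not empirically fitted."""
    return w[0] * attitude + w[1] * norm + w[2] * pbc

# A population of agents decides whether to perform a behavior this step.
n = 1000
attitude = rng.uniform(0, 1, n)
norm = rng.uniform(0, 1, n)
pbc = rng.uniform(0, 1, n)

intention = tpb_intention(attitude, norm, pbc)
acts = rng.random(n) < intention * pbc   # PBC also gates actual behavior
print(f"{acts.mean():.1%} of agents performed the behavior")
```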
Advanced Information Technology in Simulation Based Life Cycle Design
NASA Technical Reports Server (NTRS)
Renaud, John E.
2003-01-01
In this research, a Collaborative Optimization (CO) approach for multidisciplinary systems design is used to develop a decision-based design framework for non-deterministic optimization. To date, CO strategies have been developed for application to deterministic systems design problems. In this research, the decision-based design (DBD) framework proposed by Hazelrigg is modified for use in a collaborative optimization framework. The Hazelrigg framework as originally proposed provides a single-level optimization strategy that combines engineering decisions with business decisions. By transforming this framework for use in collaborative optimization, one can decompose the business and engineering decision-making processes. In the new multilevel framework of Decision Based Collaborative Optimization (DBCO), the business decisions are made at the system level. These business decisions result in a set of engineering performance targets that disciplinary engineering design teams seek to satisfy as part of subspace optimizations. The Decision Based Collaborative Optimization framework more accurately models the existing relationship between business and engineering in multidisciplinary systems design.
Rattner, Alexander S.; Guillen, Donna Post; Joshi, Alark; ...
2016-03-17
Photo- and physically realistic techniques are often insufficient for the visualization of fluid flow simulations, especially for 3D and time-varying studies. Substantial research effort has been dedicated to the development of non-photorealistic and illustration-inspired visualization techniques for compact and intuitive presentation of such complex datasets. However, a great deal of work has been reproduced in this field, as many research groups have developed specialized visualization software. Additionally, interoperability between illustrative visualization software is limited due to the diverse processing and rendering architectures employed in different studies. In this investigation, a framework for illustrative visualization is proposed and implemented in MarmotViz, a ParaView plug-in, enabling its use on a variety of computing platforms with various data file formats and mesh geometries. The region-of-interest identification and feature-tracking algorithms incorporated into this tool are described. Implementations of multiple illustrative effect algorithms are also presented to demonstrate the use and flexibility of this framework. By providing an integrated framework for illustrative visualization of CFD data, MarmotViz can serve as a valuable asset for the interpretation of simulations of ever-growing scale.
Self-Reflection of Video-Recorded High-Fidelity Simulations and Development of Clinical Judgment.
Bussard, Michelle E
2016-09-01
Nurse educators are increasingly using high-fidelity simulators to improve prelicensure nursing students' ability to develop clinical judgment. Traditionally, oral debriefing sessions have immediately followed the simulation scenarios as a method for students to connect theory to practice and therefore develop clinical judgment. Recently, video recording of the simulation scenarios is being incorporated. This qualitative, interpretive description study was conducted to identify whether self-reflection on video-recorded high-fidelity simulation (HFS) scenarios helped prelicensure nursing students to develop clinical judgment. Tanner's clinical judgment model was the framework for this study. Four themes emerged from this study: Confidence, Communication, Decision Making, and Change in Clinical Practice. This study indicated that self-reflection of video-recorded HFS scenarios is beneficial for prelicensure nursing students to develop clinical judgment. [J Nurs Educ. 2016;55(9):522-527.]. Copyright 2016, SLACK Incorporated.
A system for environmental model coupling and code reuse: The Great Rivers Project
NASA Astrophysics Data System (ADS)
Eckman, B.; Rice, J.; Treinish, L.; Barford, C.
2008-12-01
As part of the Great Rivers Project, IBM is collaborating with The Nature Conservancy and the Center for Sustainability and the Global Environment (SAGE) at the University of Wisconsin, Madison to build a Modeling Framework and Decision Support System (DSS) designed to help policy makers and a variety of stakeholders (farmers, fish & wildlife managers, hydropower operators, et al.) to assess, come to consensus, and act on land use decisions representing effective compromises between human use and ecosystem preservation/restoration. Initially focused on Brazil's Paraguay-Parana, China's Yangtze, and the Mississippi Basin in the US, the DSS integrates data and models from a wide variety of environmental sectors, including water balance, water quality, carbon balance, crop production, hydropower, and biodiversity. In this presentation we focus on the modeling framework aspect of this project. In our approach to these and other environmental modeling projects, we see a flexible, extensible modeling framework infrastructure for defining and running multi-step analytic simulations as critical. In this framework, we divide monolithic models into atomic components with clearly defined semantics encoded via rich metadata representation. Once models and their semantics and composition rules have been registered with the system by their authors or other experts, non-expert users may construct simulations as workflows of these atomic model components. A model composition engine enforces rules/constraints for composing model components into simulations, to avoid the creation of Frankenmodels, models that execute but produce scientifically invalid results. A common software environment and common representations of data and models are required, as well as an adapter strategy for code written in e.g., Fortran or python, that still enables efficient simulation runs, including parallelization. Since each new simulation, as a new composition of model components, requires calibration of parameters (fudge factors) to produce scientifically valid results, we are also developing an autocalibration engine. Finally, visualization is a key element of this modeling framework strategy, both to convey complex scientific data effectively, and also to enable non-expert users to make full use of the relevant features of the framework. We are developing a visualization environment with a strong data model, to enable visualizations, model results, and data all to be handled similarly.
Pressure calculation in hybrid particle-field simulations
NASA Astrophysics Data System (ADS)
Milano, Giuseppe; Kawakatsu, Toshihiro
2010-12-01
In the framework of a recently developed scheme for hybrid particle-field simulation techniques, in which self-consistent field (SCF) theory and particle models (molecular dynamics) are combined [J. Chem. Phys. 130, 214106 (2009)], we developed a general formulation for the calculation of the instantaneous pressure and stress tensor. The expressions have been derived from the statistical mechanical definition of pressure, starting from the expression for the free energy functional in SCF theory. An implementation of the derived formulation suitable for hybrid particle-field molecular dynamics-self-consistent field simulations is described. A series of test simulations on model systems is reported, comparing the calculated pressure with that obtained from standard molecular dynamics simulations based on pair potentials.
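For contrast with the SCF-derived expression, the sketch below evaluates the standard instantaneous pressure in particle-based MD (ideal-gas term plus virial) for a non-periodic system; the positions and forces are placeholders, and this is not the paper's hybrid particle-field formula.

```python
import numpy as np

def virial_pressure(positions, forces, n, temperature, volume, kb=1.0):
    """Standard atomic-virial instantaneous pressure for a non-periodic
    system: P = (N k_B T + (1/3) sum_i r_i . f_i) / V. The paper's hybrid
    particle-field expression replaces the pair-force virial with a term
    derived from the SCF free-energy functional."""
    virial = np.sum(positions * forces)   # sum of r . f over all particles
    return (n * kb * temperature + virial / 3.0) / volume

rng = np.random.default_rng(2)
r = rng.uniform(0, 10.0, (100, 3))        # placeholder configuration
f = rng.normal(0, 0.1, (100, 3))          # placeholder forces
print(virial_pressure(r, f, n=100, temperature=1.0, volume=1000.0))
```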
Developing a comprehensive framework for eutrophication management in off-stream artificial lakes
NASA Astrophysics Data System (ADS)
Khorasani, Hamed; Kerachian, Reza; Malakpour-Estalaki, Siamak
2018-07-01
In this paper, a comprehensive and interdisciplinary framework for the management of eutrophication in off-stream artificial lakes in semi-arid and arid regions is proposed. The four main parts of the framework are identification of the lake's water resources system components and stakeholders, simulation of phosphorus (P) export from the upstream watershed, simulation of the lake water quality as well as of water demands and supply, and development of management scenarios for the lake with selection of the best scenario using social choice methods (i.e., discrete and fuzzy Borda counts). The proposed framework is applied to Chitgar Artificial Lake (ChAL), the largest intra-urban artificial lake in Tehran, which was constructed in 2010-2013 for recreational purposes. The Load Apportionment Model (LAM) is used for the simulation of P loads from the point and non-point (diffuse) sources, and the LakeMab model is used for the simulation of P dynamics in the lake. The management scenarios contain optimized rule curves for water intake/outtake blended with P management plans (i.e., removal of point sources of P load in the upstream watershed, construction of a hydroponic bio-filter or an advanced water treatment plant beside the lake for reduction of external P loading and recycling of lake water, alum treatment of lake sediments for controlling the internal loading of P, as well as construction of a dry detention basin). The most preferred scenarios selected by the discrete Borda count are the low-cost alum treatment and the dry detention basin, while the most preferred scenario according to the fuzzy Borda count, which considers the uncertainty of model inputs, is the costly water treatment plant. In all preferred scenarios, water intake is conducted from flood flows in order to avoid conflict with downstream agricultural demands. In addition to decentralized decision making and stakeholders' participation, the proposed framework promotes the integration of technical aspects such as the role of internal loading in lake eutrophication and the separation of flood and non-flood flows in off-stream lake systems.
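A minimal discrete Borda count over management scenarios illustrates the selection step; the stakeholder rankings below are toy data, not the actual ChAL stakeholder preferences.

```python
# Each stakeholder ranks the scenarios best-to-worst; a scenario earns
# (n-1) points for a first place, (n-2) for second, and so on. The
# scenario with the highest total score is selected.
scenarios = ["alum treatment", "detention basin", "treatment plant", "bio-filter"]
rankings = {
    "municipality": ["alum treatment", "detention basin", "bio-filter", "treatment plant"],
    "health dept":  ["treatment plant", "alum treatment", "detention basin", "bio-filter"],
    "farmers":      ["detention basin", "alum treatment", "bio-filter", "treatment plant"],
}

n = len(scenarios)
scores = {s: 0 for s in scenarios}
for order in rankings.values():
    for rank, s in enumerate(order):
        scores[s] += n - 1 - rank

print(max(scores, key=scores.get), scores)
```

The fuzzy Borda variant mentioned above would replace the crisp ranks with membership degrees reflecting uncertainty in the model inputs.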
NASA Technical Reports Server (NTRS)
Chapman, Jeffryes W.; Lavelle, Thomas M.; May, Ryan D.; Litt, Jonathan S.; Guo, Ten-Huei
2014-01-01
A simulation toolbox has been developed for the creation of both steady-state and dynamic thermodynamic software models. This paper describes the Toolbox for the Modeling and Analysis of Thermodynamic Systems (T-MATS), which combines generic thermodynamic and controls modeling libraries with a numerical iterative solver to create a framework for the development of thermodynamic system simulations, such as gas turbine engines. The objective of this paper is to present an overview of T-MATS, the theory used in the creation of the module sets, and a possible propulsion simulation architecture. A model comparison was conducted by matching steady-state performance results from a T-MATS-developed gas turbine simulation to a well-documented steady-state simulation. Transient modeling capabilities are then demonstrated when the steady-state T-MATS model is updated to run dynamically.
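The core of such a toolbox is an iterative solver that drives component residuals (flow, work, and energy balances) to zero. A generic damped Newton sketch in Python (T-MATS itself is a MATLAB/Simulink library; nothing here is its API, and the residual function is a toy stand-in):

```python
import numpy as np

def newton_solve(residual, x0, tol=1e-8, max_iter=50, h=1e-6):
    """Newton iteration with a finite-difference Jacobian, the kind of
    solver wrapped around a stack of thermodynamic components to drive
    their balance residuals to zero."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = np.asarray(residual(x))
        if np.linalg.norm(r) < tol:
            return x
        J = np.empty((len(r), len(x)))
        for j in range(len(x)):           # finite-difference Jacobian column
            xp = x.copy()
            xp[j] += h
            J[:, j] = (np.asarray(residual(xp)) - r) / h
        x = x - np.linalg.solve(J, r)
    return x

# Toy "engine match": find the pair of unknowns that zeroes two residuals.
res = lambda x: np.array([x[0] ** 2 + x[1] - 3.0, x[0] + x[1] ** 2 - 5.0])
print(newton_solve(res, [1.0, 1.0]))     # converges to (1, 2)
```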
Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations. Volume 2; Appendices
NASA Technical Reports Server (NTRS)
Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.
2015-01-01
This NASA Engineering and Safety Center (NESC) assessment was established to develop a set of time histories for the flight behavior of increasingly complex example aerospacecraft that could be used to partially validate various simulation frameworks. The assessment was conducted by representatives from several NASA Centers and an open-source simulation project. This document contains details on models, implementation, and results.
Modeling and simulating industrial land-use evolution in Shanghai, China
NASA Astrophysics Data System (ADS)
Qiu, Rongxu; Xu, Wei; Zhang, John; Staenz, Karl
2018-01-01
This study proposes a cellular automata-based Industrial and Residential Land Use Competition Model to simulate the dynamic spatial transformation of industrial land use in Shanghai, China. In the proposed model, land development activities in a city are delineated as competitions among different land-use types. The Hedonic Land Pricing Model is adopted to implement the competition framework. To improve simulation results, a Land Price Agglomeration Model was devised to adjust classic land price theory, and a new evolutionary algorithm-based parameter estimation method was devised in place of traditional methods. Simulation results show that the proposed model closely reproduces actual land transformation patterns and can simulate not only land development but also redevelopment processes in metropolitan areas.
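A minimal Python sketch of the competition idea, assuming a toy hedonic bid made of a base value plus an agglomeration premium from like neighbors (the paper's actual bid functions, covariates, and calibration are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(0)
INDUSTRIAL, RESIDENTIAL = 0, 1
grid = rng.integers(0, 2, size=(50, 50))   # hypothetical initial land-use map

def neighborhood_share(grid, use):
    """Fraction of the 8 neighbors with the given use (agglomeration proxy)."""
    same = (grid == use).astype(float)
    acc = np.zeros_like(same)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr or dc:
                acc += np.roll(np.roll(same, dr, axis=0), dc, axis=1)
    return acc / 8.0

def step(grid, base_bid=(1.0, 1.1), agglomeration=0.8):
    """Each cell converts to the use with the higher bid; the bid stands
    in for a hedonic land price: base value + agglomeration premium + noise."""
    bids = [base_bid[u] + agglomeration * neighborhood_share(grid, u)
            + rng.normal(0.0, 0.05, grid.shape)
            for u in (INDUSTRIAL, RESIDENTIAL)]
    return np.argmax(np.stack(bids), axis=0)

for _ in range(20):
    grid = step(grid)
```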
Suleiman, Ahmed Abbas; Frechen, Sebastian; Scheffler, Matthias; Zander, Thomas; Nogova, Lucia; Kocher, Martin; Jaehde, Ulrich; Wolf, Jürgen; Fuhr, Uwe
2015-11-01
Treatment with erlotinib, an epidermal growth factor receptor tyrosine kinase inhibitor used for treating non-small-cell lung cancer (NSCLC) and other cancers, is frequently associated with adverse events (AEs). We present a modeling and simulation framework for the most common erlotinib-induced AEs, rash and diarrhea, providing insights into erlotinib toxicity. We used the framework to investigate the safety of high-dose erlotinib pulses proposed to limit acquired resistance while treating NSCLC. Continuous-time Markov models were developed using rash and diarrhea AE data from 39 NSCLC patients treated with erlotinib (150 mg/day). Exposure and different covariates were investigated as predictors of variability. Rash was also tested as a survival predictor. The developed models were used in a simulation analysis to compare the toxicities of different regimens, including the aforementioned pulsed strategy. The probabilities of experiencing rash or diarrhea were found to be highest early during treatment. Rash, but not diarrhea, was positively correlated with erlotinib exposure. Contrary to some common understanding, radiotherapy decreased transitioning to higher rash grades by 81% (p < 0.01), and experiencing rash was not correlated with positive survival outcomes. Model simulations predicted that the proposed pulsed regimen (1600 mg/week + 50 mg/day on the remaining week days) results in a maximum of 20% of patients suffering from severe rash throughout the treatment course, in comparison to 12% under standard dosing (150 mg/day). In conclusion, the framework demonstrated that radiotherapy attenuates erlotinib-induced rash, providing an opportunity to use radiotherapy and erlotinib together, and demonstrated the tolerability of high-dose pulses intended to address acquired resistance to erlotinib.
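Continuous-time Markov models of AE grades can be simulated by drawing exponential holding times from the transition-rate matrix. A minimal sketch with a hypothetical rate matrix over rash grades 0-2 (the paper's estimated rates and covariate effects are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical transition-rate matrix Q (per day); rows sum to zero.
Q = np.array([[-0.10,  0.08,  0.02],
              [ 0.05, -0.09,  0.04],
              [ 0.01,  0.06, -0.07]])

def simulate_ctmc(Q, state=0, t_end=56.0):
    """Sample one patient trajectory: exponential holding time with rate
    -Q[s, s], then jump in proportion to the off-diagonal rates."""
    t, path = 0.0, [(0.0, state)]
    while True:
        rate = -Q[state, state]
        t += rng.exponential(1.0 / rate)
        if t >= t_end:
            break
        probs = Q[state].clip(min=0.0)   # off-diagonal rates only
        probs /= probs.sum()
        state = rng.choice(len(Q), p=probs)
        path.append((t, state))
    return path

print(simulate_ctmc(Q))
```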
NASA Astrophysics Data System (ADS)
Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro
2016-08-01
We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N^2) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10^7) to 300 ms (N = 10^9). These are currently limited by the time for the calculation of the domain decomposition and communication necessary for the interaction calculation. We discuss how we can overcome these bottlenecks.
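The "simple, sequential and unoptimized O(N^2)" user code that FDPS expects can be as small as the following numpy sketch of direct-summation gravity with a leapfrog step (illustrative only, in Python rather than the FDPS C++ template API):

```python
import numpy as np

def gravity_accelerations(pos, mass, eps=1e-3):
    """Direct O(N^2) summation of softened gravitational accelerations,
    the kind of simple sequential kernel the user supplies (G = 1 units)."""
    dr = pos[None, :, :] - pos[:, None, :]               # r_j - r_i
    r2 = np.einsum('ijk,ijk->ij', dr, dr) + eps ** 2
    inv_r3 = r2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                        # no self-interaction
    return np.einsum('ij,j,ijk->ik', inv_r3, mass, dr)

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick leapfrog step."""
    vel = vel + 0.5 * dt * gravity_accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * gravity_accelerations(pos, mass)
    return pos, vel
```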
Generalizing Gillespie’s Direct Method to Enable Network-Free Simulations
Suderman, Ryan T.; Mitra, Eshan David; Lin, Yen Ting; ...
2018-03-28
Gillespie’s direct method for stochastic simulation of chemical kinetics is a staple of computational systems biology research. However, the algorithm requires explicit enumeration of all reactions and all chemical species that may arise in the system. In many cases, this is not feasible due to the combinatorial explosion of reactions and species in biological networks. Rule-based modeling frameworks provide a way to exactly represent networks containing such combinatorial complexity, and generalizations of Gillespie’s direct method have been developed as simulation engines for rule-based modeling languages. Here, we provide both a high-level description of the algorithms underlying the simulation engines, termed network-free simulation algorithms, and how they have been applied in systems biology research. We also define a generic rule-based modeling framework and describe a number of technical details required for adapting Gillespie’s direct method for network-free simulation. Lastly, we briefly discuss potential avenues for advancing network-free simulation and the role they continue to play in modeling dynamical systems in biology.
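For reference, Gillespie's direct method itself fits in a few lines once the reaction network is enumerated explicitly; network-free engines exist precisely to avoid this enumeration. A minimal numpy sketch with a toy dimerization network (rate constants are arbitrary):

```python
import numpy as np

def gillespie_direct(x0, stoich, rate_fn, t_end, seed=0):
    """Gillespie's direct method: draw the time to the next reaction from
    an exponential with the total propensity, then pick which reaction
    fires in proportion to its propensity.
    x0: initial copy numbers; stoich: (n_reactions, n_species) changes;
    rate_fn(x) -> vector of propensities."""
    rng = np.random.default_rng(seed)
    t, x = 0.0, np.array(x0, dtype=float)
    times, states = [t], [x.copy()]
    while t < t_end:
        a = rate_fn(x)
        a0 = a.sum()
        if a0 <= 0.0:                      # no reaction can fire
            break
        t += rng.exponential(1.0 / a0)
        j = rng.choice(len(a), p=a / a0)
        x += stoich[j]
        times.append(t)
        states.append(x.copy())
    return np.array(times), np.array(states)

# Toy network: A + A -> B (k1 = 0.01), B -> A + A (k2 = 0.1).
stoich = np.array([[-2, 1], [2, -1]])
rates = lambda x: np.array([0.01 * x[0] * (x[0] - 1) / 2, 0.1 * x[1]])
t, s = gillespie_direct([200, 0], stoich, rates, t_end=50.0)
```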
High-resolution, regional-scale crop yield simulations for the Southwestern United States
NASA Astrophysics Data System (ADS)
Stack, D. H.; Kafatos, M.; Medvigy, D.; El-Askary, H. M.; Hatzopoulos, N.; Kim, J.; Kim, S.; Prasad, A. K.; Tremback, C.; Walko, R. L.; Asrar, G. R.
2012-12-01
Over the past few decades, there have been many process-based crop models developed with the goal of better understanding the impacts of climate, soils, and management decisions on crop yields. These models simulate the growth and development of crops in response to environmental drivers. Traditionally, process-based crop models have been run at the individual farm level for yield optimization and management scenario testing. Few previous studies have used these models over broader geographic regions, largely due to the lack of the gridded high-resolution meteorological and soil datasets required as inputs for these data-intensive process-based models. In particular, assessment of regional-scale yield variability due to climate change requires high-resolution, regional-scale climate projections, and such projections have been unavailable until recently. The goal of this study was to create a framework for extending the Agricultural Production Systems sIMulator (APSIM) crop model for use at regional scales and to analyze spatial and temporal yield changes in the Southwestern United States (CA, AZ, and NV). Using the scripting language Python, an automated pipeline was developed to link Regional Climate Model (RCM) output with the APSIM crop model, thus creating a one-way nested modeling framework. This framework was used to combine climate, soil, land use, and agricultural management datasets in order to better understand the relationship between climate variability and crop yield at the regional scale. Three different RCMs were used to drive APSIM: OLAM, RAMS, and WRF. Preliminary results suggest that, depending on the model inputs, there is some variability between simulated RCM-driven maize yields and historical yields obtained from the United States Department of Agriculture (USDA). Furthermore, these simulations showed strong non-linear correlations between yield and meteorological drivers, with critical threshold values for some of the inputs (e.g., minimum and maximum temperature) beyond which yields were negatively affected. These results are now being used for further regional-scale yield analysis, as the aforementioned framework is adaptable to multiple geographic regions and crop types.
Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design
NASA Technical Reports Server (NTRS)
Schutte, Paul C.; Trujillo, Anna; Pritchett, Amy R.
2000-01-01
While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plug-in' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design
NASA Technical Reports Server (NTRS)
Pritchett, Amy R.
2002-01-01
While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
Assessing the Benefits and Costs of Motion for C-17 Flight Simulators: Technical Appendixes.
1986-06-01
This report presents the technical appendixes to a study evaluating the benefits and costs of incorporating motion systems in C-17 transport aircraft flight simulators. The recoverable content addresses, among other topics, the fidelity of different simulator motion cueing alternatives and a suggested methodology for assessing them within a general framework.
The Offline Software Framework of the NA61/SHINE Experiment
NASA Astrophysics Data System (ADS)
Sipos, Roland; Laszlo, Andras; Marcinek, Antoni; Paul, Tom; Szuba, Marek; Unger, Michael; Veberic, Darko; Wyszynski, Oskar
2012-12-01
NA61/SHINE (SHINE = SPS Heavy Ion and Neutrino Experiment) is an experiment at the CERN SPS using the upgraded NA49 hadron spectrometer. Among its physics goals are precise hadron production measurements for improving calculations of the neutrino beam flux in the T2K neutrino oscillation experiment as well as for more reliable simulations of cosmic-ray air showers. Moreover, p+p, p+Pb and nucleus+nucleus collisions will be studied extensively to allow for a study of the properties of the onset of deconfinement and a search for the critical point of strongly interacting matter. Currently NA61/SHINE uses the old NA49 software framework for reconstruction, simulation and data analysis. The core of this legacy framework was developed in the early 1990s. It is written in different programming and scripting languages (C, pgi-Fortran, shell) and provides several concurrent data formats for the event data model, which also includes obsolete parts. In this contribution we introduce the new software framework, called Shine, which is written in C++ and designed to comprise three principal parts: a collection of processing modules which can be assembled and sequenced by the user via XML files, an event data model which contains all simulation and reconstruction information based on STL and ROOT streaming, and a detector description which provides data on the configuration and state of the experiment. To assure a quick migration to the Shine framework, wrappers were introduced that allow legacy code to run as modules in the new framework, and we present first results on the cross-validation of the two frameworks.
Managing complexity in simulations of land surface and near-surface processes
Coon, Ethan T.; Moulton, J. David; Painter, Scott L.
2016-01-12
Increasing computing power and the growing role of simulation in Earth systems science have led to an increase in the number and complexity of processes in modern simulators. We present a multiphysics framework that specifies interfaces for coupled processes and automates weak and strong coupling strategies to manage this complexity. Process management is enabled by viewing the system of equations as a tree, where individual equations are associated with leaf nodes and coupling strategies with internal nodes. A dynamically generated dependency graph connects a variable to its dependencies, streamlining and automating model evaluation, easing model development, and ensuring models are modular and flexible. Additionally, the dependency graph is used to ensure that data requirements are consistent between all processes in a given simulation. Here we discuss the design and implementation of these concepts within the Arcos framework, and demonstrate their use for verification testing and hypothesis evaluation in numerical experiments.
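A minimal Python sketch of the dependency-graph idea, with memoized evaluation and invalidation (illustrative only, not the actual Arcos API; the registered variables are hypothetical):

```python
class DependencyGraph:
    """Each variable declares the variables it depends on and a function
    that computes it; evaluation recurses through dependencies and
    caches results so shared dependencies are computed once."""
    def __init__(self):
        self.nodes = {}    # name -> (dependencies, function)
        self.cache = {}

    def register(self, name, deps, fn):
        self.nodes[name] = (deps, fn)

    def evaluate(self, name):
        if name in self.cache:
            return self.cache[name]
        deps, fn = self.nodes[name]
        value = fn(*(self.evaluate(d) for d in deps))
        self.cache[name] = value
        return value

    def invalidate(self):
        """Call when primary variables change (e.g., a new time step)."""
        self.cache.clear()

g = DependencyGraph()
g.register("pressure", [], lambda: 101325.0)       # primary variable
g.register("temperature", [], lambda: 283.0)
g.register("density", ["pressure", "temperature"],
           lambda p, T: p / (287.0 * T))           # ideal gas, dry air
print(g.evaluate("density"))
```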
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Lukić, Zarija
2016-04-01
Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The authors present a novel design for running multiple codes in situ: using coroutines and position-independent executables, they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of their design, and illustrate its versatility by tackling analysis tasks with different computational requirements. Their design differs significantly from existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.
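Python generators give a compact analogue of this cooperative multitasking: the simulation suspends after each step and hands its in-memory state to the analysis, with no file I/O in between. A toy sketch (Henson itself couples compiled codes via coroutines and position-independent executables; nothing here is its API):

```python
import random

def simulation(n_steps, n_cells):
    """Toy time-stepper that yields its state after every step, handing
    control to the analysis without copying data or touching disk."""
    field = [random.random() for _ in range(n_cells)]
    for step in range(n_steps):
        field = [0.99 * v + 0.01 * random.random() for v in field]
        yield step, field            # suspend here; the analysis runs next

def analysis(stream):
    """Consumes simulation state in situ, one step at a time."""
    for step, field in stream:
        mean = sum(field) / len(field)
        print(f"step {step}: mean field = {mean:.4f}")

analysis(simulation(n_steps=5, n_cells=1000))
```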
Surrogate Modeling of High-Fidelity Fracture Simulations for Real-Time Residual Strength Predictions
NASA Technical Reports Server (NTRS)
Spear, Ashley D.; Priest, Amanda R.; Veilleux, Michael G.; Ingraffea, Anthony R.; Hochhalter, Jacob D.
2011-01-01
A surrogate model methodology is described for predicting in real time the residual strength of flight structures with discrete-source damage. Starting with design of experiment, an artificial neural network is developed that takes as input discrete-source damage parameters and outputs a prediction of the structural residual strength. Target residual strength values used to train the artificial neural network are derived from 3D finite element-based fracture simulations. A residual strength test of a metallic, integrally-stiffened panel is simulated to show that crack growth and residual strength are determined more accurately in discrete-source damage cases by using an elastic-plastic fracture framework rather than a linear-elastic fracture mechanics-based method. Improving accuracy of the residual strength training data would, in turn, improve accuracy of the surrogate model. When combined, the surrogate model methodology and high-fidelity fracture simulation framework provide useful tools for adaptive flight technology.
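A minimal sketch of the surrogate idea using scikit-learn's MLPRegressor on synthetic stand-in data (the real training targets would come from the 3D finite element fracture simulations; the two-parameter damage description and the response function here are hypothetical):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

# Synthetic stand-in training set: damage parameters (crack length,
# crack angle) -> residual strength.
X = rng.uniform([0.01, 0.0], [0.30, np.pi / 2], size=(200, 2))
y = 400.0 * (1.0 - X[:, 0]) * (1.0 - 0.3 * np.sin(X[:, 1]))  # toy response

surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                         random_state=0).fit(X, y)

# Real-time query: one forward pass instead of a 3D fracture simulation.
print(surrogate.predict([[0.12, 0.4]]))
```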
Advanced computational simulations of water waves interacting with wave energy converters
NASA Astrophysics Data System (ADS)
Pathak, Ashish; Freniere, Cole; Raessi, Mehdi
2017-03-01
Wave energy converter (WEC) devices harness the renewable ocean wave energy and convert it into useful forms of energy, e.g. mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting the device performance. To enable large-scale simulations in fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: bottom-hinged cylinder and flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.
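The pressure Poisson equation named above is typically the dominant solver cost. As a reference point, a plain conjugate-gradient solve of the same equation on a uniform grid (a sketch with homogeneous Dirichlet boundaries, not the paper's multigrid-preconditioned MPI solver; plain CG converges far more slowly on large grids):

```python
import numpy as np

def solve_pressure_poisson(rhs, h, tol=1e-8, max_iter=10000):
    """Conjugate-gradient solve of -lap(p) = rhs on interior unknowns."""
    def apply_A(p):     # negative 5-point Laplacian (symmetric positive definite)
        out = 4.0 * p
        out[1:, :] -= p[:-1, :]
        out[:-1, :] -= p[1:, :]
        out[:, 1:] -= p[:, :-1]
        out[:, :-1] -= p[:, 1:]
        return out / h ** 2

    p = np.zeros_like(rhs)
    r = rhs - apply_A(p)
    d = r.copy()
    rs = np.vdot(r, r)
    for _ in range(max_iter):
        Ad = apply_A(d)
        alpha = rs / np.vdot(d, Ad)
        p += alpha * d
        r -= alpha * Ad
        rs_new = np.vdot(r, r)
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return p

# Toy divergence source on a 64x64 interior grid.
rhs = np.zeros((64, 64))
rhs[32, 32] = 1.0
p = solve_pressure_poisson(rhs, h=1.0 / 65)
```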
Surrogate Modeling of High-Fidelity Fracture Simulations for Real-Time Residual Strength Predictions
NASA Technical Reports Server (NTRS)
Spear, Ashley D.; Priest, Amanda R.; Veilleux, Michael G.; Ingraffea, Anthony R.; Hochhalter, Jacob D.
2011-01-01
A surrogate model methodology is described for predicting, during flight, the residual strength of aircraft structures that sustain discrete-source damage. Starting with design of experiment, an artificial neural network is developed that takes as input discrete-source damage parameters and outputs a prediction of the structural residual strength. Target residual strength values used to train the artificial neural network are derived from 3D finite element-based fracture simulations. Two ductile fracture simulations are presented to show that crack growth and residual strength are determined more accurately in discrete-source damage cases by using an elastic-plastic fracture framework rather than a linear-elastic fracture mechanics-based method. Improving accuracy of the residual strength training data does, in turn, improve accuracy of the surrogate model. When combined, the surrogate model methodology and high fidelity fracture simulation framework provide useful tools for adaptive flight technology.
Flexible Residential Smart Grid Simulation Framework
NASA Astrophysics Data System (ADS)
Xiang, Wang
Different scheduling and coordination algorithms controlling household appliances' operations can potentially lead to energy consumption reduction and/or load balancing in conjunction with the different electricity pricing methods used in smart grid programs. In order to easily implement different algorithms and evaluate their efficiency against other ideas, a flexible simulation framework is desirable in both research and business fields. However, such a platform is currently lacking or underdeveloped. In this thesis, we provide a simulation framework focused on demand-side residential energy consumption coordination in response to different pricing methods. This simulation framework, equipped with an appliance consumption library using realistic values, aims to closely represent the average usage of different types of appliances. Simulation results for traditional usage closely match surveyed real-life consumption records. Several sample coordination algorithms, pricing schemes, and communication scenarios are also implemented to illustrate the use of the simulation framework.
Shabaev, Andrew; Lambrakos, Samuel G; Bernstein, Noam; Jacobs, Verne L; Finkenstadt, Daniel
2011-04-01
We have developed a general framework for numerical simulation of various types of scenarios that can occur in the detection of improvised explosive devices (IEDs) through excitation by incident electromagnetic waves. A central component model of this framework is an S-matrix representation of a multilayered composite material system. Each layer of the system is characterized by an average thickness and an effective electric permittivity function. The outputs of this component are the reflectivity and the transmissivity as functions of frequency and angle of the incident electromagnetic wave. The input of the component is a parameterized analytic-function representation of the electric permittivity as a function of frequency, which is provided by another component model of the framework. The permittivity function is constructed by fitting response spectra calculated using density functional theory (DFT) and by parameter adjustment according to any additional information that may be available, e.g., experimentally measured spectra or theory-based assumptions concerning spectral features. A prototype simulation is described that considers response characteristics for THz excitation of the high explosive β-HMX. This prototype simulation includes a description of a procedure for calculating response spectra using DFT as input to the S-matrix model. For this purpose, the DFT software NRLMOL was adopted. © 2011 Society for Applied Spectroscopy
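At normal incidence, the reflectivity of such a multilayer can be computed with the closely related transfer-matrix method. A minimal numpy sketch with hypothetical layer permittivities (the paper uses an S-matrix formulation, which is more numerically robust for very lossy or thick layers, together with DFT-derived permittivity functions):

```python
import numpy as np

def multilayer_reflectivity(eps_layers, thicknesses, freq_hz):
    """Normal-incidence reflectivity of a layered slab via the standard
    characteristic (transfer) matrix method; vacuum on both sides.
    eps_layers: complex permittivity of each internal layer;
    thicknesses: layer thicknesses in meters."""
    c = 299792458.0
    k0 = 2 * np.pi * freq_hz / c
    n_list = [1.0] + [np.sqrt(e + 0j) for e in eps_layers] + [1.0]
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_list[1:-1], thicknesses):
        delta = k0 * n * d
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    n0, ns = n_list[0], n_list[-1]
    r = ((M[0, 0] + M[0, 1] * ns) * n0 - (M[1, 0] + M[1, 1] * ns)) / \
        ((M[0, 0] + M[0, 1] * ns) * n0 + (M[1, 0] + M[1, 1] * ns))
    return abs(r) ** 2

# Hypothetical two-layer sample probed at 1 THz.
print(multilayer_reflectivity([2.5 + 0.1j, 3.2 + 0.05j],
                              [100e-6, 50e-6], 1.0e12))
```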
Kovalchuk, Sergey V; Funkner, Anastasia A; Metsker, Oleg G; Yakovlev, Aleksey N
2018-06-01
An approach to building a hybrid simulation of patient flow is introduced, with a combination of data-driven methods for automation of model identification. The approach is described with a conceptual framework and basic methods for the combination of different techniques. The implementation of the proposed approach for simulation of the acute coronary syndrome (ACS) was developed and used in an experimental study. A combination of data, text, and process mining techniques and machine learning approaches for the analysis of electronic health records (EHRs), together with discrete-event simulation (DES) and queueing theory for the simulation of patient flow, was proposed. The performed analysis of EHRs for ACS patients enabled the identification of several classes of clinical pathways (CPs), which were used to implement a more realistic simulation of the patient flow. The developed solution was implemented using Python libraries (SimPy, SciPy, and others). The proposed approach enables a more realistic and detailed simulation of the patient flow within a group of related departments. An experimental study shows an improved simulation of patient length of stay for the ACS patient flow obtained from EHRs in the Almazov National Medical Research Centre in Saint Petersburg, Russia. The proposed approach, methods, and solutions provide a conceptual, methodological, and programming framework for the implementation of simulations of complex and diverse scenarios within a flow of patients for different purposes: decision making, training, management optimization, and others. Copyright © 2018 Elsevier Inc. All rights reserved.
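The DES side of such a hybrid can be expressed directly in SimPy, one of the libraries the authors name. A minimal toy sketch of bed occupancy with Poisson arrivals and exponential lengths of stay (the rates, capacity, and pathway structure here are illustrative, not those identified from the EHRs):

```python
import random
import simpy

random.seed(0)

def patient(env, name, beds, mean_los):
    """One patient: wait for a bed, then occupy it for an exponentially
    distributed length of stay (a toy stand-in for a clinical pathway)."""
    arrival = env.now
    with beds.request() as req:
        yield req
        wait = env.now - arrival
        los = random.expovariate(1.0 / mean_los)
        yield env.timeout(los)
        print(f"{name}: waited {wait:.1f} h, stayed {los:.1f} h")

def arrivals(env, beds):
    """Poisson arrival process of patients, ~1 every 6 hours."""
    i = 0
    while True:
        yield env.timeout(random.expovariate(1.0 / 6.0))
        i += 1
        env.process(patient(env, f"patient-{i}", beds, mean_los=72.0))

env = simpy.Environment()
beds = simpy.Resource(env, capacity=10)
env.process(arrivals(env, beds))
env.run(until=24 * 14)   # two simulated weeks
```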
An efficient surrogate-based simulation-optimization method for calibrating a regional MODFLOW model
NASA Astrophysics Data System (ADS)
Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.
2017-01-01
Simulation-optimization methods entail a large number of model simulations, which is computationally intensive or even prohibitive if each model simulation is extremely time-consuming. Statistical models have been examined as surrogates of the high-fidelity physical model during the simulation-optimization process to tackle this problem. Among them, Multivariate Adaptive Regression Splines (MARS), a non-parametric adaptive regression method, is superior in overcoming problems of high dimensionality and discontinuities in the data. Furthermore, the stability and accuracy of the MARS model can be improved by bootstrap aggregating methods, namely bagging. In this paper, the Bagging MARS (BMARS) method is integrated into a surrogate-based simulation-optimization framework to calibrate a three-dimensional MODFLOW model, which is developed to simulate groundwater flow in an arid hardrock-alluvium region in northwestern Oman. The physical MODFLOW model is surrogated by a statistical model developed using the BMARS algorithm. The surrogate model, which is fitted and validated using a training dataset generated by the physical model, can approximate solutions rapidly. An efficient Sobol' method is employed to calculate global sensitivities of head outputs to input parameters, which are used to analyze their importance for the model outputs spatiotemporally. Only sensitive parameters are included in the calibration process to further improve computational efficiency. The normalized root mean square error (NRMSE) between measured and simulated heads at observation wells is used as the objective function to be minimized during optimization. The reasonable history match between the simulated and observed heads demonstrates the feasibility of this highly efficient calibration framework.
Real-Time and High-Fidelity Simulation Environment for Autonomous Ground Vehicle Dynamics
NASA Technical Reports Server (NTRS)
Cameron, Jonathan; Myint, Steven; Kuo, Calvin; Jain, Abhi; Grip, Havard; Jayakumar, Paramsothy; Overholt, Jim
2013-01-01
This paper reports on a collaborative project between U.S. Army TARDEC and the Jet Propulsion Laboratory (JPL) to develop an unmanned ground vehicle (UGV) simulation model using the ROAMS vehicle modeling framework. Besides the physical suspension of the vehicle, the sensing and navigation of the HMMWV are simulated. Using models of urban and off-road environments, the HMMWV simulation was tested in several ways, including navigation in an urban environment with obstacle avoidance and the performance of a lane change maneuver.
The NASA Auralization Framework and Plugin Architecture
NASA Technical Reports Server (NTRS)
Aumann, Aric R.; Tuttle, Brian C.; Chapin, William L.; Rizzi, Stephen A.
2015-01-01
NASA has a long history of investigating human response to aircraft flyover noise and in recent years has developed a capability to fully auralize the noise of aircraft during their design. This capability is particularly useful for unconventional designs with noise signatures significantly different from the current fleet. To that end, a flexible software architecture has been developed to facilitate rapid integration of new simulation techniques for noise source synthesis and propagation, and to foster collaboration amongst researchers through a common releasable code base. The NASA Auralization Framework (NAF) is a skeletal framework written in C++ with basic functionalities and a plugin architecture that allows users to mix and match NAF capabilities with their own methods through the development and use of dynamically linked libraries. This paper presents the NAF software architecture and discusses several advanced auralization techniques that have been implemented as plugins to the framework.
Li, Ke; Zhang, Peng; Crittenden, John C; Guhathakurta, Subhrajit; Chen, Yongsheng; Fernando, Harindra; Sawhney, Anil; McCartney, Peter; Grimm, Nancy; Kahhat, Ramzy; Joshi, Himanshu; Konjevod, Goran; Choi, Yu-Jin; Fonseca, Ernesto; Allenby, Braden; Gerrity, Daniel; Torrens, Paul M
2007-07-15
To encourage sustainable development, engineers and scientists need to understand the interactions among social decision-making, development and redevelopment, land, energy and material use, and their environmental impacts. In this study, a framework connecting these interactions was proposed to guide more sustainable urban planning and construction practices. Focusing on the rapidly urbanizing setting of Phoenix, Arizona, complexity models and deterministic models were assembled into a metamodel, called Sustainable Futures 2100, which was used to predict land use and development, quantify construction material demands, analyze life cycle environmental impacts, and simulate future ground-level ozone formation.
MRXCAT: Realistic numerical phantoms for cardiovascular magnetic resonance
2014-01-01
Background Computer simulations are important for validating novel image acquisition and reconstruction strategies. In cardiovascular magnetic resonance (CMR), numerical simulations need to combine anatomical information and the effects of cardiac and/or respiratory motion. To this end, a framework for realistic CMR simulations is proposed and its use for image reconstruction from undersampled data is demonstrated. Methods The extended Cardiac-Torso (XCAT) anatomical phantom framework with various motion options was used as a basis for the numerical phantoms. Different tissue, dynamic contrast and signal models, multiple receiver coils and noise are simulated. Arbitrary trajectories and undersampled acquisition can be selected. The utility of the framework is demonstrated for accelerated cine and first-pass myocardial perfusion imaging using k-t PCA and k-t SPARSE. Results MRXCAT phantoms allow for realistic simulation of CMR including optional cardiac and respiratory motion. Example reconstructions from simulated undersampled k-t parallel imaging demonstrate the feasibility of simulated acquisition and reconstruction using the presented framework. Myocardial blood flow assessment from simulated myocardial perfusion images highlights the suitability of MRXCAT for quantitative post-processing simulation. Conclusion The proposed MRXCAT phantom framework enables versatile and realistic simulations of CMR including breathhold and free-breathing acquisitions. PMID:25204441
NASA Astrophysics Data System (ADS)
Manstetten, Paul; Filipovic, Lado; Hössinger, Andreas; Weinbub, Josef; Selberherr, Siegfried
2017-02-01
We present a computationally efficient framework to compute the neutral flux in high aspect ratio structures during three-dimensional plasma etching simulations. The framework is based on a one-dimensional radiosity approach and is applicable to simulations of convex rotationally symmetric holes and convex symmetric trenches with a constant cross-section. The framework is intended to replace the full three-dimensional simulation step required to calculate the neutral flux during plasma etching simulations. Especially for high aspect ratio structures, the computational effort, required to perform the full three-dimensional simulation of the neutral flux at the desired spatial resolution, conflicts with practical simulation time constraints. Our results are in agreement with those obtained by three-dimensional Monte Carlo based ray tracing simulations for various aspect ratios and convex geometries. With this framework we present a comprehensive analysis of the influence of the geometrical properties of high aspect ratio structures as well as of the particle sticking probability on the neutral particle flux.
Simulation framework for intelligent transportation systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewing, T.; Doss, E.; Hanebutte, U.
1996-10-01
A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed is based on human factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scaleable to take advantage of emerging massively parallel processor (MPP) systems.
NASA Astrophysics Data System (ADS)
Jang, W.; Engda, T. A.; Neff, J. C.; Herrick, J.
2017-12-01
Many crop models are increasingly used to evaluate crop yields at regional and global scales. However, implementation of these models across large areas using fine-scale grids is limited by computational time requirements. In order to facilitate global gridded crop modeling under various scenarios (i.e., different crops, management schedules, fertilizer, and irrigation) using the Environmental Policy Integrated Climate (EPIC) model, we developed a distributed parallel computing framework in Python. A local desktop with 14 cores (28 threads) was used to test the distributed parallel computing framework in Iringa, Tanzania, which has 406,839 grid cells. High-resolution soil data, SoilGrids (250 x 250 m), and climate data, AgMERRA (0.25 x 0.25 deg), were also used as input data for the gridded EPIC model. The framework includes a master file for parallel computing, an input database, input data formatters, EPIC model execution, and output analyzers. Through the master file, the user-defined number of CPU threads divides the EPIC simulation into jobs. The raw database is formatted by the EPIC input data formatters, the formatted data move into the EPIC simulation jobs, 28 EPIC jobs run simultaneously, and only the output files of interest are parsed and moved into the output analyzers. We applied various scenarios with seven different slopes and twenty-four fertilizer ranges. Parallelized input generators create the different scenarios as a job list for distributed parallel computing. After all simulations are completed, parallelized output analyzers are used to analyze all outputs according to the different scenarios. This saves significant computing time and resources, making it possible to conduct gridded modeling at regional to global scales with high-resolution data. For example, serial processing for the Iringa test case would require 113 hours, while the framework developed in this study requires only approximately 6 hours, a nearly 95% reduction in computing time.
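A minimal sketch of the distribution pattern using Python's standard multiprocessing module (the EPIC-specific helpers write_epic_inputs, run_epic, and parse_yield are hypothetical placeholders for the formatter, execution, and parsing stages described above):

```python
from multiprocessing import Pool
import itertools

def run_cell(job):
    """One EPIC run: format inputs for a grid cell and scenario, execute
    the model, parse only the outputs of interest (all placeholders)."""
    cell_id, slope, fert = job
    # write_epic_inputs(cell_id, slope, fert)
    # run_epic(cell_id)
    # yield_t_ha = parse_yield(cell_id)
    return cell_id, slope, fert, 0.0     # dummy yield for the sketch

if __name__ == "__main__":
    cells = range(1000)                  # a subset of the 406,839 cells
    slopes = range(7)                    # seven slope classes
    fert_levels = range(24)              # twenty-four fertilizer ranges
    jobs = itertools.product(cells, slopes, fert_levels)
    with Pool(processes=28) as pool:     # one worker per hardware thread
        for result in pool.imap_unordered(run_cell, jobs, chunksize=256):
            pass                         # accumulate into output analyzers
```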
Dshell++: A Component Based, Reusable Space System Simulation Framework
NASA Technical Reports Server (NTRS)
Lim, Christopher S.; Jain, Abhinandan
2009-01-01
This paper describes the multi-mission Dshell++ simulation framework for high fidelity, physics-based simulation of spacecraft, robotic manipulation and mobility systems. Dshell++ is a C++/Python library which uses modern script-driven object-oriented techniques to allow component reuse and a dynamic run-time interface for complex, high-fidelity simulation of spacecraft and robotic systems. The goal of the Dshell++ architecture is to manage the inherent complexity of physics-based simulations while supporting component model reuse across missions. The framework provides several features that support a large degree of simulation configurability and usability.
A lava flow simulation model for the development of volcanic hazard maps for Mount Etna (Italy)
NASA Astrophysics Data System (ADS)
Damiani, M. L.; Groppelli, G.; Norini, G.; Bertino, E.; Gigliuto, A.; Nucita, A.
2006-05-01
Volcanic hazard assessment is of paramount importance for the safeguarding of the resources exposed to volcanic hazards. In this paper we present ELFM, a lava flow simulation model for the evaluation of the lava flow hazard on Mount Etna (Sicily, Italy), the most important active volcano in Europe. The major contributions of the paper are: (a) a detailed specification of the lava flow simulation model and of an algorithm implementing it; (b) the definition of a methodological framework for applying the model to the specific volcano. Concerning the former, we propose an extended version of an existing stochastic model that has so far been applied only to the assessment of volcanic hazard on Lanzarote and Tenerife (Canary Islands). Concerning the methodological framework, we argue that model validation is needed to assess the effectiveness of the lava flow simulation model, and to that end a strategy has been devised for the generation of simulation experiments and the evaluation of their outcomes.
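Stochastic lava-flow models of this family typically propagate many random downhill paths over a digital elevation model and accumulate an inundation frequency per cell. A generic Monte Carlo sketch using a stochastic steepest-descent rule (not necessarily ELFM's transition probabilities; the DEM and vent are toy inputs):

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_lava_path(dem, vent, max_steps=10000):
    """One random lava path: at each step, move to a lower neighbor with
    probability proportional to the height drop; stop in a local pit."""
    r, c = vent
    path = [(r, c)]
    for _ in range(max_steps):
        nbrs = [(r + dr, c + dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr or dc) and 0 <= r + dr < dem.shape[0]
                and 0 <= c + dc < dem.shape[1]]
        drops = np.array([dem[r, c] - dem[nr, nc] for nr, nc in nbrs])
        pos = drops > 0
        if not pos.any():
            break                            # local pit: the flow stops
        probs = np.where(pos, drops, 0.0)
        probs /= probs.sum()
        r, c = nbrs[rng.choice(len(nbrs), p=probs)]
        path.append((r, c))
    return path

dem = np.add.outer(np.linspace(100, 0, 50), np.zeros(50))  # toy slope
hazard = np.zeros_like(dem)
for _ in range(500):                 # many paths -> inundation frequency
    for cell in simulate_lava_path(dem, vent=(0, 25)):
        hazard[cell] += 1
```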
Design of an immersive simulator for assisted power wheelchair driving.
Devigne, Louise; Babel, Marie; Nouviale, Florian; Narayanan, Vishnu K; Pasteau, Francois; Gallien, Philippe
2017-07-01
Driving a power wheelchair is a difficult and complex visual-cognitive task. As a result, some people with visual and/or cognitive disabilities cannot access the benefits of a power wheelchair because their impairments prevent them from driving safely. In order to improve their access to mobility, we have previously designed a semi-autonomous assistive wheelchair system which progressively corrects the trajectory as the user manually drives the wheelchair and smoothly avoids obstacles. Developing and testing such systems for wheelchair driving assistance requires a significant amount of material resources and clinician time. With Virtual Reality technology, prototypes can be developed and tested in a risk-free and highly flexible Virtual Environment before equipping and testing a physical prototype. Additionally, users can "virtually" test and train more easily during the development process. In this paper, we introduce a power wheelchair driving simulator allowing the user to navigate with a standard wheelchair in an immersive 3D Virtual Environment. The simulation framework is designed to be flexible so that different control inputs can be used. In order to validate the framework, we first performed tests on the simulator with able-bodied participants, during which the user's Quality of Experience (QoE) was assessed through a set of questionnaires. Results show that the simulator is a promising tool for future work as it generates a good sense of presence and requires rather low cognitive effort from users.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyer, M. D.; Andre, R.; Gates, D. A.
The high-performance operational goals of NSTX-U will require development of advanced feedback control algorithms, including control of βN and the safety factor profile. In this work, a novel approach to simultaneously controlling βN and the value of the safety factor on the magnetic axis, q0, through manipulation of the plasma boundary shape and total beam power, is proposed. Simulations of the proposed scheme show promising results and motivate future experimental implementation and eventual integration into a more complex current profile control scheme planned to include actuation of individual beam powers, density, and loop voltage. As part of this work, a flexible framework for closed loop simulations within the high-fidelity code TRANSP was developed. The framework, used here to identify control-design-oriented models and to tune and test the proposed controller, exploits many of the predictive capabilities of TRANSP and provides a means for performing control calculations based on user-supplied data (controller matrices, target waveforms, etc.). The flexible framework should enable high-fidelity testing of a variety of control algorithms, thereby reducing the amount of expensive experimental time needed to implement new control algorithms on NSTX-U and other devices.
NASA Astrophysics Data System (ADS)
Boyer, M. D.; Andre, R.; Gates, D. A.; Gerhardt, S.; Goumiri, I. R.; Menard, J.
2015-05-01
The high-performance operational goals of NSTX-U will require development of advanced feedback control algorithms, including control of βN and the safety factor profile. In this work, a novel approach to simultaneously controlling βN and the value of the safety factor on the magnetic axis, q0, through manipulation of the plasma boundary shape and total beam power, is proposed. Simulations of the proposed scheme show promising results and motivate future experimental implementation and eventual integration into a more complex current profile control scheme planned to include actuation of individual beam powers, density, and loop voltage. As part of this work, a flexible framework for closed loop simulations within the high-fidelity code TRANSP was developed. The framework, used here to identify control-design-oriented models and to tune and test the proposed controller, exploits many of the predictive capabilities of TRANSP and provides a means for performing control calculations based on user-supplied data (controller matrices, target waveforms, etc.). The flexible framework should enable high-fidelity testing of a variety of control algorithms, thereby reducing the amount of expensive experimental time needed to implement new control algorithms on NSTX-U and other devices.
An Update on Improvements to NiCE Support for RELAP-7
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCaskey, Alex; Wojtowicz, Anna; Deyton, Jordan H.
The Multiphysics Object-Oriented Simulation Environment (MOOSE) is a framework that facilitates the development of applications that rely on finite-element analysis to solve a coupled, nonlinear system of partial differential equations. RELAP-7 represents an update to the venerable RELAP-5 simulator that is built upon this framework and attempts to model the balance-of-plant concerns in a full nuclear plant. This report details the continued support and integration of RELAP-7 and the NEAMS Integrated Computational Environment (NiCE). RELAP-7 is fully supported by NiCE due to on-going work to tightly integrate NiCE with the MOOSE framework, and subsequently the applications built upon it. NiCE development throughout the first quarter of FY15 has focused on improvements, bug fixes, and feature additions to existing MOOSE-based application support. Specifically, this report will focus on improvements to the NiCE MOOSE Model Builder, the MOOSE application job launcher, and the 3D Nuclear Plant Viewer. This report also includes a comprehensive tutorial that guides RELAP-7 users through the basic NiCE workflow: from input generation and 3D plant modeling, to massively parallel job launch and post-simulation data visualization.
A data-driven dynamics simulation framework for railway vehicles
NASA Astrophysics Data System (ADS)
Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun
2018-03-01
The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in large deformations undermine its computational efficiency. An alternative, multi-body (MB) dynamics simulation, provides satisfactory time efficiency but limited accuracy when highly nonlinear dynamic processes are involved. To retain the advantages of both methods, this paper proposes a data-driven simulation framework for the dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations, so that specific mesh structures can be represented by a surrogate element (or surrogate elements) replacing the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded in an MB model. The framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation results show that using the Legendre polynomial regression model to build surrogate elements can largely cut down the simulation time without sacrificing accuracy.
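Once training data have been generated by FE runs, fitting a Legendre-polynomial surrogate is a one-liner with numpy. A toy sketch with a synthetic force-displacement response (the actual training data, element interfaces, and co-simulation coupling are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in training data "from FE simulations": force-displacement
# response of a nonlinear suspension element (toy cubic stiffening law).
disp = np.linspace(-0.05, 0.05, 200)                # meters
force = 8.0e4 * disp + 5.0e6 * disp ** 3            # newtons
force += rng.normal(0.0, 50.0, disp.shape)          # simulation noise

# Fit a Legendre-polynomial surrogate to replace the FE element.
surrogate = np.polynomial.Legendre.fit(disp, force, deg=5)

# Inside the MB co-simulation loop, the surrogate is just a cheap call:
print(surrogate(0.012))    # force at 12 mm displacement
```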
Software Geometry in Simulations
NASA Astrophysics Data System (ADS)
Alion, Tyler; Viren, Brett; Junk, Tom
2015-04-01
The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other liquid argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the framework discussed here, the General Geometry Description (GGD).
The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foley, Samantha S; Elwasif, Wael R; Bernholdt, David E
2011-11-01
High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort; therefore research in coupled multiphysics often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level "driver" component, or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.
ERIC Educational Resources Information Center
Ma, Tingting; Brown, Irving A.; Kulm, Gerald; Davis, Trina J.; Lewis, Chance W.; Allen, G. Donald
2016-01-01
From the perspectives of Graduate Research Assistants (GRAs), this study examines the design and implementation of a simulated teaching environment in "Second Life" (SL) for prospective teachers to teach algebra for diverse learners. Drawing upon the Learning-for-Use framework, the analyses provide evidence on the development of student…
Validation of the OpCost logging cost model using contractor surveys
Conor K. Bell; Robert F. Keefe; Jeremy S. Fried
2017-01-01
OpCost is a harvest and fuel treatment operations cost model developed to function as both a standalone tool and an integrated component of the Bioregional Inventory Originated Simulation Under Management (BioSum) analytical framework for landscape-level analysis of forest management alternatives. OpCost is an updated implementation of the Fuel Reduction Cost Simulator...
Learning Theory Foundations of Simulation-Based Mastery Learning.
McGaghie, William C; Harris, Ilene B
2018-06-01
Simulation-based mastery learning (SBML), like all education interventions, has learning theory foundations. Recognition and comprehension of SBML learning theory foundations are essential for thoughtful education program development, research, and scholarship. We begin with a description of SBML, followed by a section on the importance of learning theory foundations to shape and direct SBML education and research. We then discuss the three principal learning theory conceptual frameworks associated with SBML (behavioral, constructivist, and social cognitive) and their contributions to SBML thought and practice. We then discuss how the three learning theory frameworks converge in the course of planning, conducting, and evaluating SBML education programs in the health professions. Convergence of these learning theory frameworks is illustrated by a description of an SBML education and research program in advanced cardiac life support. We conclude with a brief coda.
Structure simulation with calculated NMR parameters - integrating COSMOS into the CCPN framework.
Schneider, Olaf; Fogh, Rasmus H; Sternberg, Ulrich; Klenin, Konstantin; Kondov, Ivan
2012-01-01
The Collaborative Computing Project for NMR (CCPN) has built a software framework consisting of the CCPN data model (with APIs) for NMR-related data, the CcpNmr Analysis program, and additional tools like CcpNmr FormatConverter. The open architecture allows for the integration of external software to extend the abilities of the CCPN framework with additional calculation methods. Recently, we have carried out the first steps of integrating our software Computer Simulation of Molecular Structures (COSMOS) into the CCPN framework. The COSMOS-NMR force field unites quantum chemical routines for the calculation of molecular properties with a molecular mechanics force field yielding the relative molecular energies. COSMOS-NMR allows NMR parameters to be introduced as constraints into molecular mechanics calculations. The resulting infrastructure will be made available to the NMR community. As a first application we have tested the evaluation of calculated protein structures using COSMOS-derived 13C Cα and Cβ chemical shifts. In this paper we give an overview of the methodology and a roadmap for future developments and applications.
Modeling spray drift and runoff-related inputs of pesticides to receiving water.
Zhang, Xuyang; Luo, Yuzhou; Goh, Kean S
2018-03-01
Pesticides move to surface water via various pathways including surface runoff, spray drift, and subsurface flow. Little is known about the relative contributions of surface runoff and spray drift in agricultural watersheds. This study develops a modeling framework to address the contribution of spray drift to the total loadings of pesticides in receiving water bodies. The modeling framework consists of a GIS module for identifying drift potential, the AgDRIFT model for simulating spray drift, and the Soil and Water Assessment Tool (SWAT) for simulating various hydrological and landscape processes including surface runoff and transport of pesticides. The modeling framework was applied to the Orestimba Creek Watershed, California. Monitoring data collected from daily samples were used for model evaluation. Pesticide mass deposition on the Orestimba Creek ranged from 0.08 to 6.09% of applied mass. Monitoring data suggest that surface runoff was the major pathway for pesticides entering water bodies, accounting for 76% of the annual loading, with the remaining 24% from spray drift. The results from the modeling framework showed 81 and 19%, respectively, for runoff and spray drift. Spray drift contributed over half of the mass loading during summer months. The slightly lower spray drift contribution predicted by the modeling framework was mainly due to SWAT's under-prediction of pesticide mass loading during summer and over-prediction of the loading during winter. Although the model simulations were associated with various sources of uncertainty, the overall performance of the modeling framework was satisfactory as evaluated by multiple statistics: for simulation of daily flow, the Nash-Sutcliffe Efficiency coefficient (NSE) ranged from 0.61 to 0.74 and the percent bias (PBIAS) was < 28%; for daily pesticide loading, NSE = 0.18 and PBIAS = -1.6%. This modeling framework will be useful for assessing the relative exposure from pesticides related to spray drift and runoff in receiving waters and for the design of management practices for mitigating pesticide exposure within a watershed. Published by Elsevier Ltd.
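The two evaluation statistics are easy to state precisely. A short numpy sketch of NSE and PBIAS as commonly defined for SWAT studies (sign conventions for PBIAS vary by author; the sample data are hypothetical):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit, 0 means no better
    than predicting the observed mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias; here positive means the model underestimates,
    following common SWAT usage."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

obs = [1.2, 3.4, 2.8, 0.9, 4.1]   # hypothetical daily loads
sim = [1.0, 3.0, 3.1, 1.1, 3.8]
print(nse(obs, sim), pbias(obs, sim))
```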
Breaking Down Chemical Weapons by Metal-Organic Frameworks.
Mondal, Suvendu Sekhar; Holdt, Hans-Jürgen
2016-01-04
Seek and destroy: Filtration schemes and self-detoxifying protective fabrics based on the Zr(IV)-containing metal-organic frameworks (MOFs) MOF-808 and UiO-66 doped with LiOtBu have been developed that capture and hydrolytically detoxify simulants of nerve agents and mustard gas. Both MOFs function as highly catalytic elements in these applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
KC-135 Simulator Systems Engineering Case Study
2010-01-01
performance. The utilization and misutilization of SE principles are highlighted, with special emphasis on the conditions that foster and impede...process, from the identification of the need to the development and utilization of the product, must continuously integrate and optimize system and... utilizing the Friedman-Sage framework to organize the assessment of the application of the SE process. The framework and the derived matrix can
2006-06-01
dynamic programming approach known as a “rolling horizon” approach. This method accounts for state transitions within the simulation rather than modeling ... model is based on the framework developed for Dynamic Allocation of Fires and Sensors used to evaluate factors associated with networking assets in the...of UAVs required by all types of maneuver and support brigades. (Witsken, 2004) The Modeling, Virtual Environments, and Simulations Institute
2011-07-01
Orlando, Florida, September 2009, 09F-SIW-090. [HLA (2000) - 1] - Modeling and Simulation Standard - High Level Architecture (HLA) – Framework and...Simulation Interoperability Workshop, Orlando, FL, USA, September 2009, 09F-SIW-023. [MaK] - www.mak.com [MIL-STD-3011] - MIL-STD-3011...Spring Simulation Interoperability Workshop, Norfolk, VA, USA, March 2007, 07S-SIW-072. [Ross] - Ross, P. and Clark, P. (2005), “Recommended
FERN - a Java framework for stochastic simulation and evaluation of reaction networks.
Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf
2008-08-29
Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either (a) do not provide the most efficient simulation algorithms and are difficult to extend, (b) cannot be easily integrated into other applications, or (c) do not allow the user to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers for network representation, simulation, and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms for both exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.
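FERN itself is a Java framework, but the exact algorithm at its core, Gillespie's direct method, is compact enough to sketch. The Python below is a generic illustration under an assumed reaction encoding (a propensity function plus a state-change map per reaction); it is not FERN's API:

```python
import math
import random

def gillespie(x, reactions, rates, t_end, seed=1):
    """Gillespie's direct method. x: dict of species counts;
    reactions: list of (propensity_fn, state_change) pairs."""
    random.seed(seed)
    t, trajectory = 0.0, [(0.0, dict(x))]
    while t < t_end:
        props = [k * a(x) for (a, _), k in zip(reactions, rates)]
        total = sum(props)
        if total == 0.0:
            break                                      # no reaction can fire
        t += -math.log(1.0 - random.random()) / total  # exponential wait time
        r, acc = random.random() * total, 0.0
        for (_, change), p in zip(reactions, props):   # pick the next reaction
            acc += p
            if r < acc:
                for species, delta in change.items():
                    x[species] += delta
                break
        trajectory.append((t, dict(x)))
    return trajectory

# Illustrative birth-death process: 0 -> A at rate 5, A -> 0 at 0.1 per molecule
reactions = [(lambda s: 1.0, {"A": +1}), (lambda s: s["A"], {"A": -1})]
traj = gillespie({"A": 0}, reactions, rates=[5.0, 0.1], t_end=50.0)
print(f"{len(traj) - 1} reaction events; final A = {traj[-1][1]['A']}")
```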
Van De Gucht, Tim; Saeys, Wouter; Van Meensel, Jef; Van Nuffel, Annelies; Vangeyte, Jurgen; Lauwers, Ludwig
2018-01-01
Although prototypes of automatic lameness detection systems for dairy cattle exist, information about their economic value is lacking. In this paper, a conceptual and operational framework for simulating the farm-specific economic value of automatic lameness detection systems was developed and tested on 4 system types: walkover pressure plates, walkover pressure mats, camera systems, and accelerometers. The conceptual framework maps essential factors that determine economic value (e.g., lameness prevalence, incidence and duration, lameness costs, detection performance, and their relationships). The operational simulation model links treatment costs and avoided losses with detection results and farm-specific information, such as herd size and lameness status. Results show that detection performance, herd size, discount rate, and system lifespan have a large influence on economic value. In addition, lameness prevalence influences the economic value, stressing the importance of an adequate prior estimation of on-farm prevalence. The simulations provide first estimates of the upper limits for purchase prices of automatic detection systems. The framework allowed for identification of knowledge gaps obstructing more accurate estimation of economic value. These include insights into cost reductions due to early detection and treatment, and into links between specific lameness causes and their related losses. Because this model provides insight into the trade-offs between automatic detection systems' performance and investment price, it is a valuable tool to guide future research and developments. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
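At its core, the operational model weighs discounted avoided losses against costs over the system's lifespan. A minimal net-present-value sketch; every number below (herd size, per-cow losses, prices) is invented for illustration and none comes from the paper:

```python
def detection_system_value(annual_avoided_losses, annual_costs,
                           purchase_price, lifespan_years, discount_rate):
    """Net present value of a detection system: discounted net annual
    benefit over its lifespan, minus the initial investment."""
    npv = -purchase_price
    for year in range(1, lifespan_years + 1):
        npv += (annual_avoided_losses - annual_costs) / (1 + discount_rate) ** year
    return npv

# Illustrative only: a 200-cow herd where earlier treatment avoids 60 EUR
# of lameness losses per cow-year, at 15 EUR per cow-year of running cost
value = detection_system_value(200 * 60, 200 * 15, purchase_price=25_000,
                               lifespan_years=10, discount_rate=0.05)
print(f"NPV over lifespan: {value:,.0f} EUR")
```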
NASA Astrophysics Data System (ADS)
Bezawada, Rajesh; Uijt de Haag, Maarten
2010-04-01
This paper discusses the results of an initial evaluation study of hazard and integrity monitor functions for use with integrated alerting and notification. The Hazard and Integrity Monitor (HIM) (i) allocates information sources within the Integrated Intelligent Flight Deck (IIFD) to required functionality (such as conflict detection and avoidance) and determines the required performance of these information sources as part of that function; (ii) monitors or evaluates the required performance of the individual information sources and performs consistency checks among the various information sources; (iii) integrates the information to establish tracks of potential hazards that can be used for conflict probes or conflict prediction over various time horizons, including the 10, 5, 3, and <3 minutes used in our scenario; and (iv) detects and assesses the class of the hazard and provides possible resolutions. The HIM monitors the operation-dependent performance parameters related to the potential hazards in a manner similar to Required Navigation Performance (RNP). Various HIM concepts have been implemented and evaluated using a previously developed sensor simulator/synthesizer. Within the simulation framework, various inputs to the IIFD and its subsystems are simulated, synthesized from actual collected data, or played back from actual flight test sensor data. The framework and HIM functions are implemented in Simulink®, a modeling language developed by The MathWorks™. This modeling language allows for test and evaluation of various sensor and communication link configurations as well as the inclusion of feedback from the pilot on the performance of the aircraft.
Signature modelling and radiometric rendering equations in infrared scene simulation systems
NASA Astrophysics Data System (ADS)
Willers, Cornelius J.; Willers, Maria S.; Lapierre, Fabian
2011-11-01
The development and optimisation of modern infrared systems necessitates the use of simulation systems to create radiometrically realistic representations (e.g. images) of infrared scenes. Such simulation systems are used in signature prediction, the development of surveillance and missile sensors, signal/image processing algorithm development, and aircraft self-protection countermeasure system development and evaluation. Even the most cursory investigation reveals a multitude of factors affecting the infrared signatures of real-world objects. Factors such as spectral emissivity, spatial/volumetric radiance distribution, specular reflection, reflected direct sunlight, reflected ambient light, atmospheric degradation and more, all affect the presentation of an object's instantaneous signature. The signature is furthermore dynamically varying as a result of internal and external influences on the object, resulting from the heat balance comprising insolation, internal heat sources, aerodynamic heating (airborne objects), conduction, convection and radiation. In order to accurately render the object's signature in a computer simulation, the rendering equations must therefore account for all the elements of the signature. In this overview paper, the signature models, rendering equations and application frameworks of three infrared simulation systems are reviewed and compared. The paper first considers the problem of infrared scene simulation in a framework for simulation validation. This approach provides concise definitions and a convenient context for considering signature models and subsequent computer implementation. The primary radiometric requirements for an infrared scene simulator are presented next. The signature models and rendering equations implemented in OSMOSIS (Belgian Royal Military Academy), DIRSIG (Rochester Institute of Technology) and OSSIM (CSIR & Denel Dynamics) are reviewed. In spite of these three simulation systems' different application focus areas, their underlying physics-based approach is similar. The commonalities and differences between the systems are investigated in the context of their somewhat different application areas. The application of an infrared scene simulation system towards the development of imaging missiles and missile countermeasures is briefly described. Flowing from the review of the available models and equations, recommendations are made to further enhance and improve the signature models and rendering equations in infrared scene simulators.
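All three systems share a single-band skeleton of the rendering equation: emitted plus reflected-ambient radiance, attenuated by the atmosphere, plus path radiance. The sketch below illustrates only that skeleton, with band-averaged quantities and invented numbers; the reviewed simulators additionally treat spectral, specular, and solar terms explicitly:

```python
import math

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann (SI)

def planck_radiance_um(wavelength_um, temp_k):
    """Blackbody spectral radiance in W / (m^2 sr um)."""
    lam = wavelength_um * 1e-6
    a = 2.0 * H * C ** 2 / lam ** 5
    b = math.exp(H * C / (lam * KB * temp_k)) - 1.0
    return (a / b) * 1e-6          # per metre -> per micrometre

def apparent_radiance(emissivity, temp_k, ambient_radiance, tau_atm,
                      path_radiance, wavelength_um=4.0):
    """Simplified rendering equation: atmosphere-attenuated emitted and
    reflected-ambient terms, plus atmospheric path radiance."""
    emitted = emissivity * planck_radiance_um(wavelength_um, temp_k)
    reflected = (1.0 - emissivity) * ambient_radiance
    return tau_atm * (emitted + reflected) + path_radiance

# Illustrative: a 350 K surface viewed through 80% transmittance at 4 um
L = apparent_radiance(0.9, 350.0, ambient_radiance=1.5,
                      tau_atm=0.8, path_radiance=0.2)
print(f"apparent radiance = {L:.2f} W/(m^2 sr um)")
```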
Brooks, Kriston P.; Sprik, Samuel J.; Tamburello, David A.; ...
2018-04-07
The U.S. Department of Energy (DOE) developed a vehicle Framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to Technical Targets established by DOE for four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be estimated easily. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates the system parameters required to run the storage system model. Additionally, the design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the Framework model. Here, these models will be explained and exercised with the representative hydrogen storage materials exothermic ammonia borane (NH3BH3) and endothermic alane (AlH3).
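The design tool's central idea, mapping material properties to first-order system size estimates, can be caricatured in a few lines. All numeric factors below (overheads, density, releasable weight fraction) are illustrative assumptions, not values from the DOE Framework model or the actual design tool:

```python
def storage_system_estimate(usable_h2_kg, material_wt_frac,
                            mass_overhead=1.6, material_density_kg_per_l=0.74,
                            volume_overhead=1.8):
    """First-order chemical hydrogen storage sizing: material mass from
    gravimetric capacity, then assumed balance-of-plant overheads for the
    tank, reactor, and heat exchange hardware."""
    material_mass = usable_h2_kg / material_wt_frac
    system_mass = material_mass * mass_overhead
    system_volume_l = material_mass / material_density_kg_per_l * volume_overhead
    return system_mass, system_volume_l

# 5.6 kg of usable H2 from a material releasing an assumed 12 wt% hydrogen
mass_kg, vol_l = storage_system_estimate(5.6, 0.12)
print(f"system estimate: ~{mass_kg:.0f} kg, ~{vol_l:.0f} L")
```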
Zhai, Di-Hua; Xia, Yuanqing
2018-02-01
This paper addresses the adaptive control for task-space teleoperation systems with constrained predefined synchronization error, where a novel switched control framework is investigated. Based on multiple Lyapunov-Krasovskii functionals method, the stability of the resulting closed-loop system is established in the sense of state-independent input-to-output stability. Compared with previous work, the developed method can simultaneously handle the unknown kinematics/dynamics, asymmetric varying time delays, and prescribed performance control in a unified framework. It is shown that the developed controller can guarantee the prescribed transient-state and steady-state synchronization performances between the master and slave robots, which is demonstrated by the simulation study.
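One standard ingredient of prescribed performance control, which the abstract's constrained predefined synchronization error refers to, is an exponentially shrinking bound that the error must never leave. A sketch of that ingredient alone, with invented decay parameters and error samples; the paper's switched controller and Lyapunov-Krasovskii analysis are not reproduced here:

```python
import math

def performance_bound(t, rho0=1.0, rho_inf=0.05, decay=2.0):
    """Exponentially shrinking bound rho(t): the synchronization error must
    stay inside (-rho(t), rho(t)), enforcing both transient and
    steady-state specifications."""
    return (rho0 - rho_inf) * math.exp(-decay * t) + rho_inf

def within_funnel(error, t):
    return abs(error) < performance_bound(t)

# Illustrative error trajectory checked against the shrinking bound
for t, e in [(0.0, 0.8), (0.5, 0.35), (1.0, 0.12), (3.0, 0.03)]:
    print(f"t={t:.1f}s rho={performance_bound(t):.3f} "
          f"error={e:.2f} inside={within_funnel(e, t)}")
```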
Aorta: a management layer for mobile peer-to-peer massive multiplayer games
NASA Astrophysics Data System (ADS)
Edlich, Stefan; Hoerning, Henrik; Brunnert, Andreas; Hoerning, Reidar
2005-03-01
The development of massive multiplayer games (MMPGs) for personal computers is based on a wide range of frameworks and technologies. In contrast, MMPG development for cell phones lacks framework support. We present Aorta, a multi-purpose lightweight MIDP 2.0 framework supporting transparent and uniform API usage for peer-to-peer communication via HTTP, IP, and Bluetooth. Special experiments, such as load tests on Nokia 6600s, have been carried out with Bluetooth support, using a server-as-client architecture to create ad hoc networks based on piconet functionality. Additionally, scatternet functionality, which will be supported in upcoming devices, has been tested in a simulated environment on more than 12 cell phones. The core of the Aorta framework is the Etherlobby, which manages connections, peers, the game lobby, game policies, and much more. The framework itself was developed to enable the fast development of mobile games, regardless of the distance between users, which might be within the schoolyard or much further away. The earliest market-ready application shown here is a multimedia game for cell phones utilizing all of the framework's features. This game, called Micromonster, acts as a platform for developer tests, as well as providing valuable information about interface usability and user acceptance.
Duan, J; Kesisoglou, F; Novakovic, J; Amidon, GL; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R
2017-01-01
On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled “Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation” [1]. The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating systems, active pharmaceutical ingredient (API), and drug product information into a systemic mathematical whole-body framework [2]. PMID:28571121
Development of the CELSS emulator at NASA Johnson Space Center
NASA Technical Reports Server (NTRS)
Cullingford, Hatice S.
1990-01-01
The Closed Ecological Life Support System (CELSS) Emulator is under development. It will be used to investigate computer simulations of integrated CELSS operations involving humans, plants, and process machinery. Described here is Version 1.0 of the CELSS Emulator that was initiated in 1988 on the Johnson Space Center (JSC) Multi Purpose Applications Console Test Bed as the simulation framework. The run model of the simulation system now contains a CELSS model called BLSS. The CELSS simulator empowers us to generate model data sets, store libraries of results for further analysis, and also display plots of model variables as a function of time. The progress of the project is presented with sample test runs and simulation display pages.
COMPUTERIZED TRAINING OF CRYOSURGERY – A SYSTEM APPROACH
Keelan, Robert; Yamakawa, Soji; Shimada, Kenji; Rabin, Yoed
2014-01-01
The objective of the current study is to provide the foundation for a computerized training platform for cryosurgery. Consistent with clinical practice, the training process targets the correlation of the frozen region contour with the target region shape, using medical imaging and accepted criteria for clinical success. The current study focuses on system design considerations, including a bioheat transfer model, simulation techniques, an optimal cryoprobe layout strategy, and a simulation core framework. Two fundamentally different approaches were considered for the development of a cryosurgery simulator, based on a finite-element (FE) commercial code (ANSYS) and a proprietary finite-difference (FD) code. Results of this study demonstrate that the FE simulator is superior in terms of geometric modeling, while the FD simulator is superior in terms of runtime. Benchmarking results further indicate that the FD simulator is superior in terms of usage of memory resources, pre-processing, parallel processing, and post-processing. It is envisioned that future integration of a human-interface module and clinical data into the proposed computer framework will make computerized training of cryosurgery a practical reality. PMID:23995400
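For orientation, bioheat models of the kind mentioned above are commonly built around the Pennes equation. Below is a deliberately reduced 1-D explicit finite-difference sketch with assumed tissue constants and no latent-heat handling (which a real cryosurgery code must add); it is not the study's ANSYS or proprietary FD implementation:

```python
import numpy as np

def pennes_step(T, dt, dx, k=0.5, rho_c=3.6e6, w_b=5e-4,
                rho_c_blood=3.8e6, T_art=37.0, q_met=400.0):
    """One explicit finite-difference step of the 1-D Pennes bioheat
    equation: rho*c dT/dt = k d2T/dx2 + w_b*rho_c_blood*(T_art - T) + q_met.
    No latent heat of freezing is modeled here."""
    lap = (np.roll(T, -1) - 2.0 * T + np.roll(T, 1)) / dx ** 2
    T = T + dt * (k * lap + w_b * rho_c_blood * (T_art - T) + q_met) / rho_c
    T[0] = -140.0   # cryoprobe surface temperature (illustrative)
    T[-1] = 37.0    # far-field tissue held at body temperature
    return T

T = np.full(100, 37.0)          # 10 cm of tissue at 1 mm resolution
for _ in range(1000):           # 50 s of cooling
    T = pennes_step(T, dt=0.05, dx=1e-3)
print(f"temperature 5 mm from the probe after 50 s: {T[5]:.1f} C")
```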
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan S; Krishnamurthy, Dheepak; Top, Philip
This paper describes the design rationale for a new cyber-physical-energy co-simulation framework for electric power systems. This new framework will support very large-scale (100,000+ federates) co-simulations with off-the-shelf power-systems, communication, and end-use models. Other key features include cross-platform operating system support, integration of both event-driven (e.g. packetized communication) and time-series (e.g. power flow) simulation, and the ability to co-iterate among federates to ensure model convergence at each time step. After describing requirements, we begin by evaluating existing co-simulation frameworks, including HLA and FMI, and conclude that none provide the required features. Then we describe the design for the new layered co-simulation architecture.
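The co-iteration feature amounts to a fixed-point loop within each time step: every federate recomputes its outputs from the others' latest values until nothing changes beyond a tolerance. A toy in-process sketch; the federate functions and coupling constants are invented, and the real framework coordinates distributed processes rather than Python callables:

```python
def co_iterate(federates, state, tol=1e-8, max_iter=100):
    """One time step of fixed-point co-iteration: each federate updates its
    output from the others' latest outputs until the largest change falls
    below tol, so the coupled models agree before time advances."""
    for _ in range(max_iter):
        max_change = 0.0
        for name, update in federates.items():
            new = update({k: v for k, v in state.items() if k != name})
            max_change = max(max_change, abs(new - state[name]))
            state[name] = new
        if max_change < tol:
            return state
    raise RuntimeError("co-iteration did not converge")

# Toy coupling: each federate's output depends on the other's
federates = {
    "power": lambda ins: 1.0 + 0.3 * ins["comms"],
    "comms": lambda ins: 0.5 + 0.2 * ins["power"],
}
print(co_iterate(federates, {"power": 0.0, "comms": 0.0}))
```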
Operational framework for quantum measurement simulability
NASA Astrophysics Data System (ADS)
Guerini, Leonardo; Bavaresco, Jessica; Terra Cunha, Marcelo; Acín, Antonio
2017-09-01
We introduce a framework for simulating quantum measurements based on classical processing of a set of accessible measurements. Well-known concepts such as joint measurability and projective simulability naturally emerge as particular cases of our framework, but our study also leads to novel results and questions. First, a generalisation of joint measurability is derived, which yields a hierarchy for the incompatibility of sets of measurements. A similar hierarchy is defined based on the number of outcomes necessary to perform a simulation of a given measurement. This general approach also allows us to identify connections between different kinds of simulability and, in particular, we characterise the qubit measurements that are projective-simulable in terms of joint measurability. Finally, we discuss how our framework can be interpreted in the context of resource theories.
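A concrete instance of projective simulability for the qubit case: the unsharp measurement with effects (I ± η σz)/2 can be simulated by performing the projective σz measurement with probability η and outputting a fair coin otherwise. The check below verifies this identity numerically; it is a textbook-style illustration of the framework's notion of simulability, not the paper's general characterisation:

```python
import numpy as np

I2 = np.eye(2)
sz = np.diag([1.0, -1.0])
proj = {+1: (I2 + sz) / 2, -1: (I2 - sz) / 2}   # projective sigma_z effects

def unsharp_effect(outcome, eta):
    """POVM element of the noisy sigma_z measurement: (I + out*eta*sz)/2."""
    return (I2 + outcome * eta * sz) / 2

def simulated_effect(outcome, eta):
    """Classical simulation: with probability eta perform the projective
    measurement and report its outcome; otherwise report a fair coin."""
    return eta * proj[outcome] + (1.0 - eta) * 0.5 * I2

eta = 0.7
for out in (+1, -1):
    assert np.allclose(unsharp_effect(out, eta), simulated_effect(out, eta))
print("unsharp sigma_z POVM = classical mixture of a projective measurement")
print("and a coin flip, hence projective-simulable for any 0 <= eta <= 1")
```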
An analytical framework to assist decision makers in the use of forest ecosystem model predictions
Larocque, Guy R.; Bhatti, Jagtar S.; Ascough, J.C.; Liu, J.; Luckai, N.; Mailly, D.; Archambault, L.; Gordon, Andrew M.
2011-01-01
The predictions from most forest ecosystem models originate from deterministic simulations. However, few evaluation exercises for model outputs are performed by either model developers or users. This issue has important consequences for decision makers using these models to develop natural resource management policies, as they cannot evaluate the extent to which predictions stemming from the simulation of alternative management scenarios may result in significant environmental or economic differences. Various numerical methods, such as sensitivity/uncertainty analyses, or bootstrap methods, may be used to evaluate models and the errors associated with their outputs. However, the application of each of these methods carries unique challenges which decision makers do not necessarily understand; guidance is required when interpreting the output generated from each model. This paper proposes a decision flow chart in the form of an analytical framework to help decision makers apply, in an orderly fashion, different steps involved in examining the model outputs. The analytical framework is discussed with regard to the definition of problems and objectives and includes the following topics: model selection, identification of alternatives, modelling tasks and selecting alternatives for developing policy or implementing management scenarios. Its application is illustrated using an on-going exercise in developing silvicultural guidelines for a forest management enterprise in Ontario, Canada.
Brewer, Zachary E; Ogden, William David; Fann, James I; Burdon, Thomas A; Sheikh, Ahmad Y
Several modern learning frameworks (e.g., cognitive apprenticeship, anchored instruction, and situated cognition) posit the utility of nontraditional methods for effective experiential learning. Thus, development of novel educational tools emphasizing the cognitive framework of operative sequences may be of benefit to surgical trainees. We propose the development and global deployment of an effective, mobile cognitive cardiac surgical simulator. In methods, 16 preclinical medical students were assessed. Overall, 4 separate surgical modules (sternotomy, cannulation, decannulation, and sternal closure) were created utilizing the Touch Surgery (London, UK) platform. Modules were made available to download free of charge for use on mobile devices. Usage data were collected over a 6-month period. Educational efficacy of the modules was evaluated by randomizing a cohort of medical students to either module usage or traditional, reading-based self-study, followed by a multiple-choice learning assessment tool. In results, downloads of the simulator achieved global penetrance, with highest usage in the USA, Brazil, Italy, UK, and India. Overall, 5368 unique users conducted a total of 1971 hours of simulation. Evaluation of the medical student cohort revealed significantly higher assessment scores in those randomized to module use versus traditional reading (75% ± 9% vs 61% ± 7%, respectively; P < 0.05). In conclusion, this study represents the first effort to create a mobile, interactive cognitive simulator for cardiac surgery. Simulators of this type may be effective for the training and assessment of surgical students. We investigated whether an interactive, mobile-computing-based cognitive task simulator for cardiac surgery could be developed, deployed, and validated. Our findings suggest that such simulators may be a useful learning tool. Copyright © 2016. Published by Elsevier Inc.
Virtual Simulations: A Creative, Evidence-Based Approach to Develop and Educate Nurses.
Leibold, Nancyruth; Schwarz, Laura
2017-02-01
The use of virtual simulations in nursing is an innovative strategy that is increasing in application. There are several terms related to virtual simulation; although some are used interchangeably, the meanings are not the same. This article presents examples of virtual simulation, virtual worlds, and virtual patients in continuing education, staff development, and academic nursing education. Virtual simulations in nursing use technology to provide safe, as realistic as possible clinical practice for nurses and nursing students. Virtual simulations are useful for learning new skills; practicing a skill that puts content, high-order thinking, and psychomotor elements together; skill competency learning; and assessment for low-volume, high-risk skills. The purpose of this article is to describe the related terms, examples, uses, theoretical frameworks, challenges, and evidence related to virtual simulations in nursing.
A flexible object-oriented software framework for developing complex multimedia simulations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sydelko, P. J.; Dolph, J. E.; Christiansen, J. H.
Decision makers involved in brownfields redevelopment and long-term stewardship must consider environmental conditions, future-use potential, site ownership, area infrastructure, funding resources, cost recovery, regulations, risk and liability management, community relations, and expected return on investment in a comprehensive and integrated fashion to achieve desired results. Successful brownfields redevelopment requires the ability to assess the impacts of redevelopment options on multiple interrelated aspects of the ecosystem, both natural and societal. Computer-based tools, such as simulation models, databases, and geographical information systems (GISs) can be used to address brownfields planning and project execution. The transparent integration of these tools into a comprehensive and dynamic decision support system would greatly enhance the brownfields assessment process. Such a system needs to be able to adapt to shifting and expanding analytical requirements and contexts. The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-oriented framework for developing and maintaining complex multidisciplinary simulations of a wide variety of application domains. The modeling domain of a specific DIAS-based simulation is determined by (1) software objects that represent the real-world entities that comprise the problem space (atmosphere, watershed, human), and (2) simulation models and other data processing applications that express the dynamic behaviors of the domain entities. Models and applications used to express dynamic behaviors can be either internal or external to DIAS, including existing legacy models written in various languages (FORTRAN, C, etc.). The flexible design framework of DIAS makes the objects adjustable to the context of the problem without a great deal of recoding. The DIAS Spatial Data Set facility allows parameters to vary spatially depending on the simulation context according to any of a number of 1-D, 2-D, or 3-D topologies. DIAS is also capable of interacting with other GIS packages and can import many standard spatial data formats. DIAS simulation capabilities can also be extended by including societal process models. Models that implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations allow for interaction and feedback among natural and societal processes. The ability to simulate the complex interplay of multimedia processes makes DIAS a promising tool for constructing applications for comprehensive community planning, including the assessment of multiple development and redevelopment scenarios.
NASA Astrophysics Data System (ADS)
Koo, J.; Wood, S.; Cenacchi, N.; Fisher, M.; Cox, C.
2012-12-01
HarvestChoice (harvestchoice.org) generates knowledge products to guide strategic investments to improve the productivity and profitability of smallholder farming systems in sub-Saharan Africa (SSA). A key component of the HarvestChoice analytical framework is a grid-based overlay of SSA: a cropping simulation platform powered by process-based crop models. Calibrated around the best available representation of cropping production systems in SSA, the simulation platform couples the DSSAT Crop Systems Model with the CENTURY Soil Organic Matter model (DSSAT-CENTURY) and provides a virtual experimentation module with which to explore the impact of a range of technological, managerial, and environmental factors on future crop productivity and profitability, as well as input use. Each 5 (or 30) arc-minute grid cell in SSA is underlain by a stack of model inputs: datasets covering soil properties and fertility, historic and future climate scenarios, and farmers' management practices, all compiled from analyses of existing global and regional databases and consultations with other CGIAR centers. Running a simulation model is not always straightforward, especially when certain cropping systems or management practices are not yet practiced by resource-poor farmers (e.g., precision agriculture) or were never included in the existing simulation framework (e.g., water harvesting). In such cases, we used DSSAT-CENTURY as a function to iteratively estimate the relative responses of cropping systems to technology-driven changes in water and nutrient balances compared to zero adoption by farmers, while adjusting model input parameters to best mimic farmers' implementation of technologies in the field. We then fed the simulation results into the economic and food trade model framework, IMPACT, to assess the potential implications for future food security. The outputs of the overall simulation analyses are packaged as a web-accessible database and published on the web with an interface that allows users to explore the simulation results in each country with user-defined baseline and what-if scenarios. The results are dynamically presented on maps, charts, and tables. This paper discusses the development of the simulation platform and its underlying data layers, a case study that assessed the role of potential crop management technology development, and the development of a web-based application that visualizes the simulation results.
Robust Real-Time Musculoskeletal Modeling Driven by Electromyograms.
Durandau, Guillaume; Farina, Dario; Sartori, Massimo
2018-03-01
Current clinical biomechanics involves lengthy data acquisition and time-consuming offline analyses with biomechanical models not operating in real-time for man-machine interfacing. We developed a method that enables online analysis of neuromusculoskeletal function in vivo in the intact human. We used electromyography (EMG)-driven musculoskeletal modeling to simulate all transformations from muscle excitation onset (EMGs) to mechanical moment production around multiple lower-limb degrees of freedom (DOFs). We developed a calibration algorithm that enables adjusting musculoskeletal model parameters specifically to an individual's anthropometry and force-generating capacity. We incorporated the modeling paradigm into a computationally efficient, generic framework that can be interfaced in real-time with any movement data collection system. The framework demonstrated the ability of computing forces in 13 lower-limb muscle-tendon units and resulting moments about three joint DOFs simultaneously in real-time. Remarkably, it was capable of extrapolating beyond calibration conditions, i.e., predicting accurate joint moments during six unseen tasks and one unseen DOF. The proposed framework can dramatically reduce evaluation latency in current clinical biomechanics and open up new avenues for establishing prompt and personalized treatments, as well as for establishing natural interfaces between patients and rehabilitation systems. The integration of EMG with numerical modeling will enable simulating realistic neuromuscular strategies in conditions including muscular/orthopedic deficit, which could not be robustly simulated via pure modeling formulations. This will enable translation to clinical settings and development of healthcare technologies including real-time bio-feedback of internal mechanical forces and direct patient-machine interfacing.
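Stripped to its skeleton, an EMG-driven pipeline maps excitations to activations and activations to a joint moment. The sketch below keeps only a first-order activation filter and a moment-arm sum; the real framework adds nonlinear activation shaping, Hill-type muscle-tendon dynamics, and calibration, and all numbers here are invented:

```python
def emg_step(emg, act, max_forces, moment_arms, dt=0.001, tau=0.03):
    """One real-time step of a highly simplified EMG-driven pipeline:
    first-order activation dynamics per muscle, then the joint moment as
    the sum of muscle forces times their moment arms."""
    new_act = [a + dt * (e - a) / tau for a, e in zip(act, emg)]  # low-pass
    forces = [a * f for a, f in zip(new_act, max_forces)]
    moment = sum(f * r for f, r in zip(forces, moment_arms))
    return moment, new_act

# Two illustrative muscles about one DOF: extensor (+arm), flexor (-arm)
act = [0.0, 0.0]
for _ in range(300):   # 0.3 s of a constant excitation pattern
    moment, act = emg_step([0.4, 0.1], act, [3000.0, 1500.0], [0.04, -0.03])
print(f"net joint moment after 0.3 s: {moment:.1f} N*m")
```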
Smsynth: AN Imagery Synthesis System for Soil Moisture Retrieval
NASA Astrophysics Data System (ADS)
Cao, Y.; Xu, L.; Peng, J.
2018-04-01
Soil moisture (SM) is an important variable in various research areas, such as weather and climate forecasting, agriculture, drought and flood monitoring and prediction, and human health. An ongoing challenge in estimating SM via synthetic aperture radar (SAR) is the development of SM retrieval methods; in particular, empirical models need, as training samples, many measurements of SM and soil roughness parameters that are very difficult to acquire. It is therefore difficult to develop empirical models using real SAR imagery, and methods to synthesize SAR imagery are necessary. To tackle this issue, a SAR imagery synthesis system based on SM, named SMSynth, is presented, which simulates radar signals that are as realistic as possible with respect to real SAR imagery. In SMSynth, SAR backscatter coefficients for each soil type are simulated via the Oh model under a Bayesian framework, where the spatial correlation is modeled by a Markov random field (MRF) model. The backscattering coefficients, simulated from the designed soil and sensor parameters, enter the Bayesian framework through the data likelihood; the soil and sensor parameters are set as close as possible to conditions on the ground and within the validity range of the Oh model. In this way, a complete and coherent Bayesian probabilistic framework is established. Experimental results show that SMSynth is capable of generating realistic SAR images that meet the need for large numbers of training samples for empirical models.
NASA Technical Reports Server (NTRS)
Pace, Dale K.
2000-01-01
A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept of the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores the implications of conceptual model development methods for simulation management, especially relative to reuse of simulation components.
ERIC Educational Resources Information Center
Shen, Hao-Yu; Shen, Bo; Hardacre, Christopher
2013-01-01
A systematic approach to develop the teaching of instrumental analytical chemistry is discussed, as well as a conceptual framework for organizing and executing lectures and a laboratory course. Three main components are used in this course: theoretical knowledge developed in the classroom, simulations via a virtual laboratory, and practical…
The Effects of a Dynamic Spectrum Access Overlay in LTE-Advanced Networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Juan D. Deaton; Ryan E. lrwin; Luiz A. DaSilva
As early as 2014, wireless network operators' spectral capacity will be overwhelmed by a data tsunami brought on by new devices and applications. To augment spectral capacity, operators could deploy a Dynamic Spectrum Access (DSA) overlay. In light of the many planned Long Term Evolution (LTE) network deployments, the effects of a DSA overlay have not been fully considered in the existing LTE standards. Coalescing many different aspects of DSA, this paper develops the Spectrum Accountability (SA) framework. The SA framework defines specific network element functionality, protocol interfaces, and signaling flow diagrams for LTE to support service requests and enforce the rights and responsibilities of primary and secondary users, respectively. We also include a network simulation to quantify the benefits of using DSA channels to augment capacity. Based on our simulation, we show that network operators can gain up to a 40% increase in operating capacity when sharing DSA bands to augment spectral capacity. With our framework, this paper could serve as a guide in developing future LTE network standards that include DSA.
Zhou, Xiangmin; Zhang, Nan; Sha, Desong; Shen, Yunhe; Tamma, Kumar K; Sweet, Robert
2009-01-01
The inability to render realistic soft-tissue behavior in real time has remained a barrier to face and content aspects of validity for many virtual reality surgical training systems. Biophysically based models are suitable not only for training purposes but also for patient-specific clinical applications, physiological modeling, and surgical planning. Among the existing approaches to modeling soft tissue for virtual reality surgical simulation, the computer graphics-based approach lacks predictive capability; the mass-spring model (MSM) based approach lacks biophysically realistic soft-tissue dynamic behavior; and finite element method (FEM) approaches fail to meet the real-time requirement. The present development stems from the first law of thermodynamics: for a space-discrete dynamic system, it directly formulates the space-discrete but time-continuous governing equation with an embedded material constitutive relation, resulting in a discrete mechanics framework that strikes a unique balance between computational effort and physically realistic soft-tissue dynamic behavior. We describe the development of the discrete mechanics framework with focused attention on a virtual laparoscopic nephrectomy application.
Framework of passive millimeter-wave scene simulation based on material classification
NASA Astrophysics Data System (ADS)
Park, Hyuk; Kim, Sung-Hyun; Lee, Ho-Jin; Kim, Yong-Hoon; Ki, Jae-Sug; Yoon, In-Bok; Lee, Jung-Min; Park, Soon-Jun
2006-05-01
Over the past few decades, passive millimeter-wave (PMMW) sensors have emerged as useful implements in transportation and military applications such as autonomous flight-landing systems, smart weapons, and night- and all-weather vision systems. An efficient way to predict the performance of a PMMW sensor and apply it to a system is to test it in a SoftWare-In-the-Loop (SWIL) setting, and PMMW scene simulation is a key component in implementing such a simulator. However, no commercial off-the-shelf tool is available for constructing the PMMW scene simulation, and there have been only a few studies of this technology. We have studied PMMW scene simulation methods in order to develop the PMMW sensor SWIL simulator. This paper describes the framework of the PMMW scene simulation and tentative results. The purpose of the PMMW scene simulation is to generate sensor outputs (or images) from a visible image and environmental conditions. We organize it into four parts: material classification mapping, PMMW environmental setting, PMMW scene forming, and millimeter-wave (MMW) sensor modeling. The background and the objects in the scene are classified based on properties related to MMW radiation and reflectivity. The environmental setting part calculates the following PMMW phenomenology: atmospheric propagation and emission (including sky temperature), weather conditions, and physical temperature. Then, PMMW raw images are formed using the surface geometry. Finally, PMMW sensor outputs are generated from the PMMW raw images by applying sensor characteristics such as aperture size and noise level. Through the simulation process, PMMW phenomenology and sensor characteristics are simulated on the output scene. We have finished the design of the simulator framework and are working on the detailed implementation. As a tentative result, a flight observation was simulated under specific conditions. After the implementation details, we plan to increase the reliability of the simulation by collecting data with actual PMMW sensors. With a reliable PMMW scene simulator, it will be more efficient to apply PMMW sensors to various applications.
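The radiometric heart of PMMW scene forming is that an observed brightness temperature mixes surface emission with reflected sky radiation, which is why metal looks cold against grass. A minimal sketch with assumed emissivities, sky temperature, and sensor noise figure:

```python
import random

def pmmw_brightness(emissivity, t_physical, t_sky):
    """Apparent millimeter-wave brightness temperature of a surface:
    emitted term plus reflected sky illumination (specular assumption)."""
    return emissivity * t_physical + (1.0 - emissivity) * t_sky

def sensor_output(t_brightness, netd=2.0):
    """Add receiver noise via the sensor's noise-equivalent dT (kelvin)."""
    return t_brightness + random.gauss(0.0, netd)

# Metal reflects the cold sky, so it reads far colder than vegetation
t_sky = 60.0   # assumed clear-sky brightness temperature
for name, eps in [("metal", 0.05), ("grass", 0.93)]:
    tb = pmmw_brightness(eps, t_physical=290.0, t_sky=t_sky)
    print(f"{name}: T_B = {tb:.0f} K, sensor reads {sensor_output(tb):.0f} K")
```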
G. Sun; C. Li; C. Tretting; J. Lu; S.G. McNulty
2005-01-01
A modeling framework (Wetland-DNDC) that describes forested wetland ecosystem processes has been developed and validated with data from North America and Europe. The model simulates forest photosynthesis, respiration, carbon allocation and litter production; soil organic matter (SOM) turnover; trace gas emissions; and N leaching. Inputs required by Wetland-DNDC...
Ethics skills laboratory experience for surgery interns.
Moon, Margaret R; Hughes, Mark T; Chen, Jiin-Yu; Khaira, Kiran; Lipsett, Pamela; Carrese, Joseph A
2014-01-01
Ethics curricula are nearly universal in residency training programs, but the content and delivery methods are not well described, and there is still a relative paucity of literature evaluating the effect of ethics curricula. Several commentators have called for more ethics curriculum development at the postgraduate level, and specifically in surgery training. We detail our development and implementation of a clinical ethics curriculum for surgery interns. We developed curricula and simulated patient cases for 2 core clinical ethics skills--breaking bad news and obtaining informed consent. Educational sessions for each topic included (1) framework development (discussion of interns' current experience, development of a consensus framework for ethical practice, and comparison with established frameworks) and (2) practice with a simulated patient followed by peer and faculty feedback. At the beginning and end of each session, we administered a test of confidence and knowledge about the topics to assess the effect of the sessions. A total of 98 surgical interns participated in the ethics skills laboratory from Spring 2008 to Spring 2011. We identified significant improvement in confidence regarding the appropriate content of informed consent (P < 0.001) and capacity to break bad news (P < 0.001). We also identified significant improvement in overall knowledge regarding informed consent (P < 0.01), capacity assessment (P < 0.05), and breaking bad news (P = 0.001). Regarding specific components of informed consent, capacity assessment, and breaking bad news, significant improvement was shown in some areas, while we failed to improve knowledge in others. Through faculty-facilitated small group discussion, surgery interns were able to develop frameworks for ethical practice that paralleled established frameworks. Skills-based training in clinical ethics resulted in an increase in knowledge scores and self-reported confidence. Evaluation of 4 annual cohorts of surgery interns demonstrates significant successes and some areas for improvement in this educational intervention. Copyright © 2014 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
NetMOD Version 2.0 Mathematical Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merchant, Bion J.; Young, Christopher J.; Chael, Eric P.
2015-08-01
NetMOD (Network Monitoring for Optimal Detection) is a Java-based software package for conducting simulation of seismic, hydroacoustic, and infrasonic networks. Network simulations have long been used to study network resilience to station outages and to determine where additional stations are needed to reduce monitoring thresholds. NetMOD makes use of geophysical models to determine the source characteristics, the signal attenuation along the path between the source and station, and the performance and noise properties of the station. These geophysical models are combined to simulate the relative amplitudes of signal and noise that are observed at each of the stations. From these signal-to-noise ratios (SNR), the probabilities of signal detection at each station and of event detection across the network of stations can be computed given a detection threshold. The purpose of this document is to clearly and comprehensively present the mathematical framework used by NetMOD, the software package developed by Sandia National Laboratories to assess the monitoring capability of ground-based sensor networks. Many of the NetMOD equations used for simulations are inherited from the NetSim network capability assessment package developed in the late 1980s by SAIC (Sereno et al., 1990).
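The SNR-to-probability step can be illustrated generically: treat the observed log SNR as Gaussian about its predicted value, turn threshold exceedance into a per-station probability, and combine stations. The model below (Gaussian log-SNR, independent stations, k-of-N rule) is an illustrative stand-in, not NetMOD's inherited NetSim equations:

```python
import math

def station_detection_prob(log_snr, threshold, sigma=0.3):
    """P(detection) when the observed log10 SNR is Gaussian around the
    predicted value and detection requires exceeding the threshold."""
    z = (log_snr - threshold) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def network_detection_prob(p_stations, k_required):
    """P(at least k of N independent stations detect), via the standard
    dynamic program over the detection-count distribution."""
    dist = [1.0]   # dist[j] = P(exactly j detections among stations so far)
    for p in p_stations:
        new = [0.0] * (len(dist) + 1)
        for j, pj in enumerate(dist):
            new[j] += pj * (1.0 - p)   # this station misses
            new[j + 1] += pj * p       # this station detects
        dist = new
    return sum(dist[k_required:])

p = [station_detection_prob(s, threshold=0.5) for s in (0.9, 0.6, 0.4, 0.2)]
print("station probabilities:", [round(x, 2) for x in p])
print(f"P(>=3 of 4 stations detect) = {network_detection_prob(p, 3):.3f}")
```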
Mesh infrastructure for coupled multiprocess geophysical simulations
Garimella, Rao V.; Perkins, William A.; Buksas, Mike W.; ...
2014-01-01
We have developed a sophisticated mesh infrastructure capability to support large-scale multiphysics simulations such as subsurface flow and reactive contaminant transport at storage sites, as well as analysis of the effects of a warming climate on the terrestrial Arctic. These simulations involve a wide range of coupled processes including overland flow, subsurface flow, freezing and thawing of ice-rich soil, accumulation, redistribution and melting of snow, biogeochemical processes involving plant matter and, finally, microtopography evolution due to melting and degradation of ice wedges below the surface. In addition to supporting the usual topological and geometric queries about the mesh, the mesh infrastructure adds capabilities such as identifying columnar structures in the mesh, enabling deformation of the mesh subject to constraints, and enabling the simultaneous use of meshes of different dimensionality for subsurface and surface processes. The generic mesh interface is capable of using three different open source mesh frameworks (MSTK, MOAB and STKmesh) under the hood, allowing the developers to directly compare them and choose the one best suited for the application's needs. We demonstrate the results of some simulations using these capabilities as well as present a comparison of the performance of the different mesh frameworks.
The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models
NASA Technical Reports Server (NTRS)
Penn, John M.
2016-01-01
The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both the Linux and Mac OS X computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.
NASA Astrophysics Data System (ADS)
Grigoras, Costin; Carminati, Federico; Vladimirovna Datskova, Olga; Schreiner, Steffen; Lee, Sehoon; Zhu, Jianlin; Gheata, Mihaela; Gheata, Andrei; Saiz, Pablo; Betev, Latchezar; Furano, Fabrizio; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Bagnasco, Stefano; Peters, Andreas Joachim; Saiz Santos, Maria Dolores
2011-12-01
With the LHC and ALICE entering full operation and production modes, the number of simulation, RAW data processing, and end-user analysis computational tasks is increasing. The efficient management of all these tasks, which differ widely in lifecycle, amount of processed data, and methods for analyzing the end result, required the development and deployment of new tools in addition to the already existing Grid infrastructure. To facilitate the management of the large-scale simulation and raw data reconstruction tasks, ALICE has developed a production framework called the Lightweight Production Manager (LPM). The LPM automatically submits jobs to the Grid based on triggers and conditions, for example after completion of a physics run. It follows the evolution of each job and publishes the results on the web for worldwide access by ALICE physicists. This framework is tightly integrated with the ALICE Grid framework AliEn. In addition to publishing job status, LPM provides a fully authenticated interface to the AliEn Grid catalogue for browsing and downloading files, and in the near future it will provide simple types of data analysis through ROOT plugins. The framework is also being extended to allow management of end-user jobs.
NASA Astrophysics Data System (ADS)
Pattke, Marco; Martin, Manuel; Voit, Michael
2017-05-01
Tracking people with cameras in public areas is common today. However, with an increasing number of cameras it becomes harder and harder to review the data manually. In safety-critical areas especially, automatic image exploitation could help to solve this problem. Setting up such a system can, however, be difficult because of its complexity. Sensor placement is critical to ensure that people are detected and tracked reliably. We address this problem with a simulation framework that can simulate different camera setups in the desired environment, including animated characters. We combine this framework with our self-developed distributed and scalable system for people tracking to test its effectiveness, and we can show the results of the tracking system in real time in the simulated environment.
Thermostating extended Lagrangian Born-Oppenheimer molecular dynamics.
Martínez, Enrique; Cawkwell, Marc J; Voter, Arthur F; Niklasson, Anders M N
2015-04-21
Extended Lagrangian Born-Oppenheimer molecular dynamics is developed and analyzed for applications in canonical (NVT) simulations. Three different approaches are considered: the Nosé and Andersen thermostats and Langevin dynamics. We have tested the temperature distribution under different conditions of self-consistent field (SCF) convergence and time step and compared the results to analytical predictions. We find that the simulations based on the extended Lagrangian Born-Oppenheimer framework provide accurate canonical distributions even under approximate SCF convergence, often requiring only a single diagonalization per time step, whereas regular Born-Oppenheimer formulations exhibit unphysical fluctuations unless a sufficiently high degree of convergence is reached at each time step. The thermostated extended Lagrangian framework thus offers an accurate approach to sample processes in the canonical ensemble at a fraction of the computational cost of regular Born-Oppenheimer molecular dynamics simulations.
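Of the three approaches, Langevin dynamics is the simplest to sketch: add friction and fluctuation-dissipation-balanced noise to the forces. A minimal Python illustration on a thermostated harmonic oscillator; the integrator is a plain semi-implicit Euler scheme for clarity, not the extended Lagrangian propagator analyzed in the paper:

```python
import math
import random

def langevin_step(x, v, force, dt, mass=1.0, gamma=1.0, kT=1.0):
    """One Langevin step: deterministic force, friction, and Gaussian kicks
    whose variance obeys the fluctuation-dissipation relation."""
    noise = math.sqrt(2.0 * gamma * kT / (mass * dt)) * random.gauss(0.0, 1.0)
    v = v + dt * (force(x) / mass - gamma * v + noise)
    x = x + dt * v
    return x, v

# Thermostated harmonic oscillator; the long-run <v^2> should approach kT/m
random.seed(0)
x, v, acc, n = 1.0, 0.0, 0.0, 0
for step in range(200000):
    x, v = langevin_step(x, v, force=lambda q: -q, dt=0.01)
    if step >= 10000:          # discard equilibration, then accumulate
        acc += v * v
        n += 1
print(f"<v^2> = {acc / n:.2f} (target 1.0)")
```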
NASA Technical Reports Server (NTRS)
Pomerantz, M. I.; Lim, C.; Myint, S.; Woodward, G.; Balaram, J.; Kuo, C.
2012-01-01
The Jet Propulsion Laboratory's Entry, Descent and Landing (EDL) Reconstruction Task has developed a software system that provides mission operations personnel and analysts with real-time telemetry-based live display, playback, and post-EDL reconstruction capabilities, leveraging the existing high-fidelity, physics-based simulation framework and modern game-engine-derived 3D visualization system developed in the JPL Dynamics and Real Time Simulation (DARTS) Lab. Developed as a multi-mission solution, the EDL Telemetry Visualization (ETV) system has been used for a variety of projects including NASA's Mars Science Laboratory (MSL), NASA's Low Density Supersonic Decelerator (LDSD), and JPL's MoonRise Lunar sample return proposal.
Normal Brain-Skull Development with Hybrid Deformable VR Models Simulation.
Jin, Jing; De Ribaupierre, Sandrine; Eagleson, Roy
2016-01-01
This paper describes a simulation framework for a clinical application involving skull-brain co-development in infants, leading to a platform for craniosynostosis modeling. Craniosynostosis occurs when one or more sutures are fused early in life, resulting in an abnormal skull shape. Surgery is required to reopen the suture and reduce intracranial pressure, but it is difficult without a predictive model to assist surgical planning. We aim to study normal brain-skull growth by computer simulation, which requires a head model and appropriate mathematical methods for brain and skull growth, respectively. Building on our previous model, we further divided the suture model into fibrous and cartilaginous sutures and developed an algorithm for skull extension. We evaluate the resulting simulation by comparison with datasets of cases and normal growth.
Asakura, Nobuhiko; Inui, Toshio
2016-01-01
Two apparently contrasting theories have been proposed to account for the development of children's theory of mind (ToM): theory-theory and simulation theory. We present a Bayesian framework that rationally integrates both theories for false belief reasoning. This framework exploits two internal models for predicting the belief states of others: one of self and one of others. These internal models are responsible for simulation-based and theory-based reasoning, respectively. The framework further takes into account empirical studies of a developmental ToM scale (e.g., Wellman and Liu, 2004): developmental progressions of various mental state understandings leading up to false belief understanding. By representing the internal models and their interactions as a causal Bayesian network, we formalize the model of children's false belief reasoning as probabilistic computations on the Bayesian network. This model probabilistically weighs and combines the two internal models and predicts children's false belief ability as a multiplicative effect of their early-developed abilities to understand the mental concepts of diverse beliefs and knowledge access. Specifically, the model predicts that children's proportion of correct responses on a false belief task can be closely approximated as the product of their proportions correct on the diverse belief and knowledge access tasks. To validate this prediction, we illustrate that our model provides good fits to a variety of ToM scale data for preschool children. We discuss the implications and extensions of our model for a deeper understanding of developmental progressions of children's ToM abilities. PMID:28082941
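The model's headline prediction is directly computable: the proportion correct on a false belief task is approximated by the product of the proportions correct on the two earlier-developing tasks. A tiny sketch with invented proportions (not the paper's datasets):

```python
def predict_false_belief(p_diverse_beliefs, p_knowledge_access):
    """Multiplicative prediction: proportion correct on false belief is
    approximately the product of the proportions correct on the
    diverse-beliefs and knowledge-access tasks."""
    return p_diverse_beliefs * p_knowledge_access

# Illustrative proportions for two age groups (not the paper's data)
for group, (db, ka) in {"younger": (0.65, 0.55), "older": (0.95, 0.90)}.items():
    print(f"{group}: predicted false-belief proportion = "
          f"{predict_false_belief(db, ka):.2f}")
```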
System Simulation by Recursive Feedback: Coupling A Set of Stand-Alone Subsystem Simulations
NASA Technical Reports Server (NTRS)
Nixon, Douglas D.; Hanson, John M. (Technical Monitor)
2002-01-01
Recursive feedback is defined and discussed as a framework for development of specific algorithms and procedures that propagate the time-domain solution for a dynamical system simulation consisting of multiple numerically coupled self-contained stand-alone subsystem simulations. A satellite motion example containing three subsystems (orbit dynamics, attitude dynamics, and aerodynamics) has been defined and constructed using this approach. Conventional solution methods are used in the subsystem simulations. Centralized and distributed versions of coupling structure have been addressed. Numerical results are evaluated by direct comparison with a standard total-system simultaneous-solution approach.
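To make the coupling scheme concrete, here is a minimal sketch in Python (a hypothetical two-subsystem problem with illustrative dynamics and names, not the paper's satellite example): each frame is re-propagated, with each stand-alone subsystem fed the other's latest output, until the exchanged interface variables converge.

    def propagate_attitude(theta, torque, dt):
        # stand-alone "attitude dynamics" subsystem: one explicit Euler step
        return theta + dt * torque

    def aero_torque(theta):
        # stand-alone "aerodynamics" subsystem: torque from current attitude
        return -0.5 * theta

    def recursive_feedback_frame(theta, torque, dt, tol=1e-12, max_iter=50):
        # repeat the frame, feeding each subsystem the other's latest output,
        # until the exchanged interface variables stop changing
        for _ in range(max_iter):
            theta_new = propagate_attitude(theta, torque, dt)
            torque_new = aero_torque(theta_new)
            if abs(torque_new - torque) < tol:
                break
            torque = torque_new
        return theta_new, torque_new

    theta, torque = 1.0, aero_torque(1.0)
    for _ in range(1000):          # march frames forward in time
        theta, torque = recursive_feedback_frame(theta, torque, dt=0.01)
    print(theta)                   # decays toward zero, as the coupled ODE predicts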
Methodology development for evaluation of selective-fidelity rotorcraft simulation
NASA Technical Reports Server (NTRS)
Lewis, William D.; Schrage, D. P.; Prasad, J. V. R.; Wolfe, Daniel
1992-01-01
This paper addresses the initial step toward the goal of establishing performance and handling qualities acceptance criteria for real-time rotorcraft simulators through a planned research effort to quantify the system capabilities of 'selective fidelity' simulators. Within this framework the simulator is then classified based on the required task. The simulator is evaluated by separating the various subsystems (visual, motion, etc.) and applying corresponding fidelity constants based on the specific task. This methodology not only provides an assessment technique, but also provides a technique to determine the required levels of subsystem fidelity for a specific task.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neylon, J; Qi, S; Sheng, K
2014-06-15
Purpose: To develop a GPU-based framework that can generate high-resolution and patient-specific biomechanical models from a given simulation CT and contoured structures, optimized to run at interactive speeds, for addressing adaptive radiotherapy objectives. Method: A mass-spring-damping (MSD) model was generated from a given simulation CT. The model's mass elements were generated for every voxel of anatomy, and positioned in a deformation space in the GPU memory. MSD connections were established between neighboring mass elements in a dense distribution. Contoured internal structures allowed control over elastic material properties of different tissues. Once the model was initialized in GPU memory, skeletal anatomy was actuated using rigid-body transformations, while soft tissues were governed by elastic corrective forces and constraints, which included tensile forces, shear forces, and spring damping forces. The model was validated by applying a known load to a soft tissue block and comparing the observed deformation to ground truth calculations from established elastic mechanics. Results: Our analyses showed that both local and global load experiments yielded results with a correlation coefficient R² > 0.98 compared to ground truth. Models were generated for several anatomical regions. Head and neck models accurately simulated posture changes by rotating the skeletal anatomy in three dimensions. Pelvic models were developed for realistic deformations for changes in bladder volume. Thoracic models demonstrated breast deformation due to gravity when changing treatment position from supine to prone. The GPU framework performed at greater than 30 iterations per second for over 1 million mass elements with up to 26 MSD connections each. Conclusions: Realistic simulations of site-specific, complex posture and physiological changes were simulated at interactive speeds using patient data. Incorporating such a model with live patient tracking would facilitate real-time assessment of variations of the actual anatomy and delivered dose for adaptive intervention and re-planning.
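As a rough serial analogue of one such update (the 1D chain geometry, parameter values, and added velocity drag below are illustrative assumptions; the actual model is three-dimensional, voxel-dense, and GPU-resident), the tensile and spring-damping terms can be written as:

    import numpy as np

    n, rest, k, c, dt, F = 20, 1.0, 50.0, 2.0, 1e-3, 1.0
    pos = np.arange(n, dtype=float) * rest        # 1D chain of mass elements
    vel = np.zeros(n)

    for _ in range(200000):
        f = np.zeros(n)
        d = pos[1:] - pos[:-1]                    # neighbor separations
        tension = k * (d - rest)                  # tensile corrective forces
        damping = c * (vel[1:] - vel[:-1])        # spring damping forces
        f[:-1] += tension + damping
        f[1:] -= tension + damping
        f[-1] += F                                # known load on the free end
        vel += dt * f                             # unit masses assumed
        vel *= 0.999                              # numerical drag so the chain settles
        vel[0] = 0.0                              # first element pinned (constraint)
        pos += dt * vel

    stretch = pos[-1] - pos[0] - (n - 1) * rest
    print(stretch, F * (n - 1) / k)               # observed vs. ground-truth extension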
Quantifying the drivers of ocean-atmosphere CO2 fluxes
NASA Astrophysics Data System (ADS)
Lauderdale, Jonathan M.; Dutkiewicz, Stephanie; Williams, Richard G.; Follows, Michael J.
2016-07-01
A mechanistic framework for quantitatively mapping the regional drivers of air-sea CO2 fluxes at a global scale is developed. The framework evaluates the interplay between (1) surface heat and freshwater fluxes that influence the potential saturated carbon concentration, which depends on changes in sea surface temperature, salinity and alkalinity, (2) a residual, disequilibrium flux influenced by upwelling and entrainment of remineralized carbon- and nutrient-rich waters from the ocean interior, as well as rapid subduction of surface waters, (3) carbon uptake and export by biological activity as both soft tissue and carbonate, and (4) the effect on surface carbon concentrations due to freshwater precipitation or evaporation. In a steady state simulation of a coarse-resolution ocean circulation and biogeochemistry model, the sum of the individually determined components is close to the known total flux of the simulation. The leading order balance, identified in different dynamical regimes, is between the CO2 fluxes driven by surface heat fluxes and a combination of biologically driven carbon uptake and disequilibrium-driven carbon outgassing. The framework is still able to reconstruct simulated fluxes when evaluated using monthly averaged data and takes a form that can be applied consistently in models of different complexity and observations of the ocean. In this way, the framework may reveal differences in the balance of drivers acting across an ensemble of climate model simulations or be applied to an analysis and interpretation of the observed, real-world air-sea flux of CO2.
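The closure test described above, that the individually evaluated components should sum to approximately the known total flux, can be illustrated with a few lines of Python on synthetic numbers (all arrays below are made up; they only demonstrate the bookkeeping):

    import numpy as np

    rng = np.random.default_rng(0)
    heat = rng.normal(size=100)        # heat/freshwater-driven component
    diseq = rng.normal(size=100)       # disequilibrium (upwelling/subduction)
    bio = rng.normal(size=100)         # biological soft-tissue + carbonate export
    fresh = rng.normal(size=100)       # dilution/concentration by evaporation-precipitation
    total = heat + diseq + bio + fresh + rng.normal(scale=1e-3, size=100)  # "known" flux

    residual = total - (heat + diseq + bio + fresh)
    print(np.max(np.abs(residual)))    # small residual = components close the budget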
A Simulation-as-a-Service Framework Facilitating WebGIS-Based Installation Planning
NASA Astrophysics Data System (ADS)
Zheng, Z.; Chang, Z. Y.; Fei, Y. F.
2017-09-01
Installation Planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for the proper spatial deployment and functional configuration of facilities, to make them a cohesive and supportive system that meets users' operational needs. Based on requirement analysis, we propose a framework combining GIS and agent simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, agent-based simulation runs as a service on the server and exposes basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service is able to utilize various kinds of geoprocessing services in the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits, such as ease of use, on-demand access, shared understanding, and improved performance. At the end, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.
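A minimal sketch of the service shape (the class, method names, and toy geoprocessing callback are assumptions for illustration; the actual system is built on ArcGIS for Server) might look like:

    class SimulationService:
        def __init__(self, geoprocess):
            self.geoprocess, self.agents, self.tick, self.log = geoprocess, [], 0, []
        def configure_scenario(self, agent_positions):   # scenario configuration
            self.agents = list(agent_positions)
        def step(self):                                  # simulation control
            self.tick += 1
            self.agents = [self.geoprocess(p) for p in self.agents]  # spatial inference
            self.log.append((self.tick, list(self.agents)))
        def retrieve(self, since=0):                     # simulation data retrieval
            return [rec for rec in self.log if rec[0] > since]

    svc = SimulationService(geoprocess=lambda p: (p[0] + 1, p[1]))  # toy "drive east"
    svc.configure_scenario([(0, 0), (5, 5)])
    for _ in range(3):
        svc.step()
    print(svc.retrieve(since=1))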
Efficient generation of connectivity in neuronal networks from simulator-independent descriptions
Djurfeldt, Mikael; Davison, Andrew P.; Eppler, Jochen M.
2014-01-01
Simulator-independent descriptions of connectivity in neuronal networks promise greater ease of model sharing, improved reproducibility of simulation results, and reduced programming effort for computational neuroscientists. However, until now, enabling the use of such descriptions in a given simulator in a computationally efficient way has entailed considerable work for simulator developers, which must be repeated for each new connectivity-generating library that is developed. We have developed a generic connection generator interface that provides a standard way to connect a connectivity-generating library to a simulator, such that one library can easily be replaced by another, according to the modeler's needs. We have used the connection generator interface to connect C++ and Python implementations of the previously described connection-set algebra to the NEST simulator. We also demonstrate how the simulator-independent modeling framework PyNN can transparently take advantage of this, passing a connection description through to the simulator layer for rapid processing in C++ where a simulator supports the connection generator interface and falling back to slower iteration in Python otherwise. A set of benchmarks demonstrates the good performance of the interface. PMID:24795620
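The essence of the interface can be sketched as follows (illustrative class and method names, not the actual connection generator API): the simulator consumes only an iterator of (source, target) pairs, so any library producing such pairs is interchangeable.

    class ConnectionGenerator:
        def connections(self):
            raise NotImplementedError

    class AllToAll(ConnectionGenerator):
        # one example connectivity-generating "library"
        def __init__(self, sources, targets):
            self.sources, self.targets = sources, targets
        def connections(self):
            for s in self.sources:
                for t in self.targets:
                    yield s, t

    def build_network(simulator_connect, generator):
        # simulator-side driver: iterates the generator, no library-specific code
        for s, t in generator.connections():
            simulator_connect(s, t)

    synapses = []
    build_network(lambda s, t: synapses.append((s, t)), AllToAll(range(3), range(2)))
    print(synapses)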
Smith, Morgan; Warland, Jane; Smith, Colleen
2012-03-01
Online role-play has the potential to actively engage students in authentic learning experiences and help develop their clinical reasoning skills. However, evaluation of student learning for this kind of simulation focuses mainly on the content and outcome of learning, rather than on the process of learning through student engagement. This article reports on the use of a student engagement framework to evaluate an online role-play offered as part of a course in Bachelor of Nursing and Bachelor of Midwifery programs. Instruments that measure student engagement to date have targeted large numbers of students at program and institutional levels, rather than at the level of a specific learning activity. Although the framework produced some useful findings for evaluation purposes, further refinement of the questions is required to be certain that deep learning results from the engagement that occurs with course-level learning initiatives. Copyright 2012, SLACK Incorporated.
NASA Astrophysics Data System (ADS)
Hanachi, Houman; Liu, Jie; Banerjee, Avisekh; Chen, Ying
2016-05-01
Health state estimation of inaccessible components in complex systems necessitates effective state estimation techniques using the observable variables of the system. The task becomes much more complicated when the system is nonlinear/non-Gaussian and receives stochastic input. In this work, a novel sequential state estimation framework is developed based on a particle filtering (PF) scheme for state estimation of a general class of nonlinear dynamical systems with stochastic input. Performance of the developed framework is then validated with simulation on a Bivariate Non-stationary Growth Model (BNGM) as a benchmark. In the next step, three years of operating data from an industrial gas turbine engine (GTE) are utilized to verify the effectiveness of the developed framework. A comprehensive thermodynamic model for the GTE is therefore developed to formulate the relation of the observable parameters and the dominant degradation symptoms of the turbine, namely, loss of isentropic efficiency and increase of the mass flow. The results confirm the effectiveness of the developed framework for simultaneous estimation of multiple degradation symptoms in complex systems with noisy measured inputs.
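For readers unfamiliar with the approach, a bootstrap particle filter on the classic univariate nonstationary growth model (a standard benchmark; the paper's bivariate model and exact scheme may differ) fits in a few lines:

    import numpy as np

    rng = np.random.default_rng(1)
    N, T = 500, 50
    x_true, y = 0.1, []
    for k in range(1, T + 1):                       # simulate the benchmark system
        x_true = 0.5*x_true + 25*x_true/(1+x_true**2) + 8*np.cos(1.2*k) + rng.normal()
        y.append(x_true**2/20 + rng.normal())       # noisy nonlinear measurement

    particles = rng.normal(0, 2, N)
    for k, yk in enumerate(y, start=1):
        # propagate particles through the nonlinear state transition
        particles = (0.5*particles + 25*particles/(1+particles**2)
                     + 8*np.cos(1.2*k) + rng.normal(0, 1, N))
        w = np.exp(-0.5*(yk - particles**2/20)**2) + 1e-12  # Gaussian likelihood
        w /= w.sum()
        particles = rng.choice(particles, size=N, p=w)      # multinomial resampling
    print(particles.mean())                          # filtered state estimate at T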
High performance cellular level agent-based simulation with FLAME for the GPU.
Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela
2010-05-01
Driven by the availability of experimental data and the ability to simulate a biological scale which is of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular-level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template-driven framework for agent-based modelling (ABM) on parallel architectures, ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has reported massive improvement in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.
Numerical simulation of the casting process of titanium removable partial denture frameworks.
Wu, Menghuai; Wagner, Ingo; Sahm, Peter R; Augthun, Michael
2002-03-01
The objective of this work was to study the filling incompleteness and porosity defects in titanium removable partial denture frameworks by means of numerical simulation. Two frameworks, one for the lower jaw and one for the upper jaw, were chosen for simulation according to dentists' recommendations. The geometry of the frameworks was laser-digitized and imported into simulation software (MAGMASOFT). Both mold filling and solidification of the castings with different sprue designs (e.g. tree, ball, and runner-bar) were numerically calculated. The shrinkage porosity was quantitatively predicted by a feeding criterion; the potential filling defects and gas pore sensitivity were estimated based on the filling and solidification results. A satisfactory sprue design with process parameters was finally recommended for real casting trials (four replicas of each framework). All the frameworks were successfully cast. Through X-ray radiographic inspections it was found that all the castings were acceptably sound, except for one case in which gas bubbles were detected in the grasp region of the frame. It is concluded that numerical simulation helps to achieve understanding of the casting process and defect formation in titanium frameworks, and hence to minimize the risk of producing defective castings by improving the sprue design and process parameters.
Enhancing 4D PC-MRI in an aortic phantom considering numerical simulations
NASA Astrophysics Data System (ADS)
Kratzke, Jonas; Schoch, Nicolai; Weis, Christian; Müller-Eschner, Matthias; Speidel, Stefanie; Farag, Mina; Beller, Carsten J.; Heuveline, Vincent
2015-03-01
To date, cardiovascular surgery enables the treatment of a wide range of aortic pathologies. One of the current challenges in this field is given by the detection of high-risk patients for adverse aortic events, who should be treated electively. Reliable diagnostic parameters, which indicate the urge of treatment, have to be determined. Functional imaging by means of 4D phase contrast-magnetic resonance imaging (PC-MRI) enables the time-resolved measurement of blood flow velocity in 3D. Applied to aortic phantoms, three dimensional blood flow properties and their relation to adverse dynamics can be investigated in vitro. Emerging "in silico" methods of numerical simulation can supplement these measurements in computing additional information on crucial parameters. We propose a framework that complements 4D PC-MRI imaging by means of numerical simulation based on the Finite Element Method (FEM). The framework is developed on the basis of a prototypic aortic phantom and validated by 4D PC-MRI measurements of the phantom. Based on physical principles of biomechanics, the derived simulation depicts aortic blood flow properties and characteristics. The framework might help identifying factors that induce aortic pathologies such as aortic dilatation or aortic dissection. Alarming thresholds of parameters such as wall shear stress distribution can be evaluated. The combined techniques of 4D PC-MRI and numerical simulation can be used as complementary tools for risk-stratification of aortic pathology.
A Coupled Earthquake-Tsunami Simulation Framework Applied to the Sumatra 2004 Event
NASA Astrophysics Data System (ADS)
Vater, Stefan; Bader, Michael; Behrens, Jörn; van Dinther, Ylona; Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Uphoff, Carsten; Wollherr, Stephanie; van Zelst, Iris
2017-04-01
Large earthquakes along subduction zone interfaces have generated destructive tsunamis near Chile in 1960, Sumatra in 2004, and northeast Japan in 2011. In order to better understand these extreme events, we have developed tools for physics-based, coupled earthquake-tsunami simulations. This simulation framework is applied to the 2004 Indian Ocean M 9.1-9.3 earthquake and tsunami, a devastating event that resulted in the loss of more than 230,000 lives. The earthquake rupture simulation is performed using an ADER discontinuous Galerkin discretization on an unstructured tetrahedral mesh with the software SeisSol. Advantages of this approach include accurate representation of complex fault and sea floor geometries and a parallelized and efficient workflow in high-performance computing environments. Accurate and efficient representation of the tsunami evolution and inundation at the coast is achieved with an adaptive mesh discretizing the shallow water equations with a second-order Runge-Kutta discontinuous Galerkin (RKDG) scheme. With the application of the framework to this historic event, we aim to better understand the involved mechanisms between the dynamic earthquake within the earth's crust, the resulting tsunami wave within the ocean, and the final coastal inundation process. Earthquake model results are constrained by GPS surface displacements and tsunami model results are compared with buoy and inundation data. This research is part of the ASCETE Project, "Advanced Simulation of Coupled Earthquake and Tsunami Events", funded by the Volkswagen Foundation.
Simulation-Based Valuation of Transactive Energy Systems
Huang, Qiuhua; McDermott, Tom; Tang, Yingying; ...
2018-05-18
Transactive Energy (TE) has been recognized as a promising technique for integrating responsive loads and distributed energy resources as well as advancing grid modernization. To help the industry better understand the value of TE and compare different TE schemes in a systematic and transparent manner, a comprehensive simulation-based TE valuation method is developed. The method has the following salient features: 1) it formally defines the valuation scenarios, use cases, baseline and valuation metrics; 2) an open-source simulation platform for transactive energy systems has been developed by integrating transmission, distribution and building simulators, and plugin TE and non-TE agents through the Framework for Network Co-Simulation (FNCS); 3) transparency and flexibility of the valuation is enhanced through separation of simulation and valuation, base valuation metrics and final valuation metrics. In conclusion, a valuation example based on the Smart Grid Interoperability Panel (SGIP) Use Case 1 is provided to demonstrate the developed TE simulation program and the valuation method.
NASA Astrophysics Data System (ADS)
Dolan, B.; Rutledge, S. A.; Barnum, J. I.; Matsui, T.; Tao, W. K.; Iguchi, T.
2017-12-01
POLarimetric Radar Retrieval and Instrument Simulator (POLARRIS) is a framework that has been developed to simulate radar observations from cloud-resolving model (CRM) output and subject model data and observations to the same retrievals, analysis and visualization. This framework not only enables validation of bulk microphysical model-simulated properties, but also offers an opportunity to study the uncertainties associated with retrievals such as hydrometeor classification (HID). For the CSU HID, membership beta functions (MBFs) are built using a set of simulations with realistic microphysical assumptions about axis ratio, density, canting angles, and size distributions for each of ten hydrometeor species. These assumptions are tested using POLARRIS to understand their influence on the resulting simulated polarimetric data and final HID classification. Several of these parameters (density, size distributions) are set by the model microphysics, and therefore the specific assumptions of axis ratio and canting angle are carefully studied. Through these sensitivity studies, we hope to be able to provide uncertainties in retrieved polarimetric variables and HID as applied to CRM output. HID retrievals assign a classification to each point by determining the highest score, thereby identifying the dominant hydrometeor type within a volume. However, in nature, there is rarely just a single hydrometeor type at a particular point. Models allow for mixing ratios of different hydrometeors within a grid point. We use the mixing ratios from CRM output in concert with the HID scores and classifications to understand how the HID algorithm can provide information about mixtures within a volume, as well as to calculate a confidence in the classifications. We leverage the POLARRIS framework to additionally probe radar wavelength differences toward the possibility of a multi-wavelength HID which could utilize the strengths of different wavelengths to improve HID classifications. With these uncertainties and algorithm improvements, cases of convection are studied in a continental (Oklahoma) and a maritime (Darwin, Australia) regime. Observations from C-band polarimetric data in both locations are compared to CRM simulations from NU-WRF using the POLARRIS framework.
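A toy version of fuzzy-logic HID scoring conveys the mechanics (the membership functions, species, and variables below are invented for illustration and are not the CSU HID's actual MBFs): score each species, classify by the highest score, and report a confidence from the score margin.

    import numpy as np

    def membership(x, center, width):
        # smooth bump in (0, 1]; stands in for a membership beta function
        return 1.0 / (1.0 + ((x - center) / width) ** 2)

    species = {"rain":    {"Zh": (35, 10), "Zdr": (1.5, 1.0)},
               "graupel": {"Zh": (45, 10), "Zdr": (0.2, 0.5)}}

    def hid(obs):
        scores = {name: np.prod([membership(obs[var], c, w)
                                 for var, (c, w) in mbfs.items()])
                  for name, mbfs in species.items()}
        best = max(scores, key=scores.get)
        ranked = sorted(scores.values(), reverse=True)
        confidence = 1.0 - ranked[1] / ranked[0]   # margin over the runner-up
        return best, confidence, scores

    print(hid({"Zh": 42.0, "Zdr": 0.4}))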
Theoretical foundations of learning through simulation.
Zigmont, Jason J; Kappus, Liana J; Sudikoff, Stephanie N
2011-04-01
Health care simulation is a powerful educational tool to help facilitate learning for clinicians and change their practice to improve patient outcomes and safety. To promote effective life-long learning through simulation, the educator needs to consider individuals, their experiences, and their environments. Effective education of adults through simulation requires a sound understanding of both adult learning theory and experiential learning. This review article provides a framework for developing and facilitating simulation courses, founded upon empiric and theoretic research in adult and experiential learning. Specifically, this article provides a theoretic foundation for using simulation to change practice to improve patient outcomes and safety. Copyright © 2011 Elsevier Inc. All rights reserved.
Martin, Caitlin
2014-01-01
One of the major failure modes of bioprosthetic heart valves (BHVs) is noncalcific structural deterioration due to fatigue of the tissue leaflets; yet, the mechanisms of fatigue are not well understood. BHV durability is primarily assessed based on visual inspection of the leaflets following accelerated wear testing. In this study, we developed a computational framework to simulate BHV leaflet fatigue, which is both efficient and quantitative, making it an attractive alternative to traditional accelerated wear testing. We utilize a phenomenological soft tissue fatigue damage model developed previously to describe the stress softening and permanent set of the glutaraldehyde-treated bovine pericardium leaflets in BHVs subjected to cyclic loading. A parametric study was conducted to determine the effects of altered leaflet and stent elastic properties on the fatigue of the leaflets. The simulation results show that heterogeneity of the leaflet elastic properties, poor leaflet coaptation, and little stent-tip deflection may accelerate leaflet fatigue, which agrees with clinical findings. Therefore, the developed framework may be an invaluable tool for evaluating leaflet durability in new tissue valve designs, including traditional BHVs as well as new transcatheter valves. PMID:24092257
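A phenomenological softening/permanent-set update of the general kind described can be sketched as follows (the functional forms and constants are assumptions for illustration, not the authors' calibrated model):

    import numpy as np

    E0, strain = 10.0, 0.1
    for n in [1, 1e4, 1e6, 2e8]:                 # accumulated cycle counts
        D = 1.0 - np.exp(-1e-8 * n)              # damage grows with cycling
        E = E0 * (1.0 - 0.5 * D)                 # stress softening of the modulus
        permanent_set = 0.02 * D                 # residual strain when unloaded
        stress = E * max(strain - permanent_set, 0.0)
        print(f"N={n:.0e}  stress={stress:.3f}  set={permanent_set:.4f}")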
Computational and experimental investigation of free vibration and flutter of bridge decks
NASA Astrophysics Data System (ADS)
Helgedagsrud, Tore A.; Bazilevs, Yuri; Mathisen, Kjell M.; Øiseth, Ole A.
2018-06-01
A modified rigid-object formulation is developed, and employed as part of the fluid-object interaction modeling framework from Akkerman et al. (J Appl Mech 79(1):010905, 2012. https://doi.org/10.1115/1.4005072) to simulate free vibration and flutter of long-span bridges subjected to strong winds. To validate the numerical methodology, companion wind tunnel experiments have been conducted. The results show that the computational framework captures very precisely the aeroelastic behavior in terms of aerodynamic stiffness, damping and flutter characteristics. Considering its relative simplicity and accuracy, we conclude from our study that the proposed free-vibration simulation technique is a valuable tool in engineering design of long-span bridges.
Design and Performance Frameworks for Constructing Problem-Solving Simulations
Stevens, Ron; Palacio-Cayetano, Joycelin
2003-01-01
Rapid advancements in hardware, software, and connectivity are helping to shorten the times needed to develop computer simulations for science education. These advancements, however, have not been accompanied by corresponding theories of how best to design and use these technologies for teaching, learning, and testing. Such design frameworks ideally would be guided less by the strengths/limitations of the presentation media and more by cognitive analyses detailing the goals of the tasks, the needs and abilities of students, and the resulting decision outcomes needed by different audiences. This article describes a problem-solving environment and associated theoretical framework for investigating how students select and use strategies as they solve complex science problems. A framework is first described for designing on-line problem spaces that highlights issues of content, scale, cognitive complexity, and constraints. While this framework was originally designed for medical education, it has proven robust and has been successfully applied to learning environments from elementary school through medical school. Next, a similar framework is detailed for collecting student performance and progress data that can provide evidence of students' strategic thinking and that could potentially be used to accelerate student progress. Finally, experimental validation data are presented that link strategy selection and use with other metrics of scientific reasoning and student achievement. PMID:14506505
An analytical framework to assist decision makers in the use of forest ecosystem model predictions
USDA-ARS?s Scientific Manuscript database
The predictions of most terrestrial ecosystem models originate from deterministic simulations. Relatively few uncertainty evaluation exercises in model outputs are performed by either model developers or users. This issue has important consequences for decision makers who rely on models to develop n...
A Hardware-Accelerated Quantum Monte Carlo framework (HAQMC) for N-body systems
NASA Astrophysics Data System (ADS)
Gothandaraman, Akila; Peterson, Gregory D.; Warren, G. Lee; Hinde, Robert J.; Harrison, Robert J.
2009-12-01
Interest in the study of structural and energetic properties of highly quantum clusters, such as inert gas clusters, has motivated the development of a hardware-accelerated framework for Quantum Monte Carlo simulations. In the Quantum Monte Carlo method, the properties of a system of atoms, such as the ground-state energies, are averaged over a number of iterations. Our framework is aimed at accelerating the computations in each iteration of the QMC application by offloading the calculation of properties, namely energy and trial wave function, onto reconfigurable hardware. This gives a user the capability to run simulations for a large number of iterations, thereby reducing the statistical uncertainty in the properties, and for larger clusters. This framework is designed to run on the Cray XD1 high performance reconfigurable computing platform, which exploits the coarse-grained parallelism of the processor along with the fine-grained parallelism of the reconfigurable computing devices available in the form of field-programmable gate arrays. In this paper, we illustrate the functioning of the framework, which can be used to calculate the energies for a model cluster of helium atoms. In addition, we present the capabilities of the framework that allow the user to vary the chemical identities of the simulated atoms.
Program summary
Program title: Hardware Accelerated Quantum Monte Carlo (HAQMC)
Catalogue identifier: AEEP_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEP_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 691 537
No. of bytes in distributed program, including test data, etc.: 5 031 226
Distribution format: tar.gz
Programming language: C/C++ for the QMC application, VHDL and Xilinx 8.1 ISE/EDK tools for FPGA design and development
Computer: Cray XD1 consisting of a dual-core, dual-processor AMD Opteron 2.2 GHz with a Xilinx Virtex-4 (V4LX160) or Xilinx Virtex-II Pro (XC2VP50) FPGA per node. We use the compute node with the Xilinx Virtex-4 FPGA
Operating system: Red Hat Enterprise Linux OS
Has the code been vectorised or parallelized?: Yes
Classification: 6.1
Nature of problem: Quantum Monte Carlo is a practical method to solve the Schrödinger equation for large many-body systems and obtain the ground-state properties of such systems. This method involves the sampling of a number of configurations of atoms and averaging the properties of the configurations over a number of iterations. We are interested in applying the QMC method to obtain the energy and other properties of highly quantum clusters, such as inert gas clusters.
Solution method: The proposed framework provides a combined hardware-software approach, in which the QMC simulation is performed on the host processor, with the computationally intensive functions such as energy and trial wave function computations mapped onto the field-programmable gate array (FPGA) logic device attached as a co-processor to the host processor. We perform the QMC simulation for a number of iterations, as in the case of our original software QMC approach, to reduce the statistical uncertainty of the results. However, our proposed HAQMC framework accelerates each iteration of the simulation by significantly reducing the time taken to calculate the ground-state properties of the configurations of atoms, thereby accelerating the overall QMC simulation. We provide a generic interpolation framework that can be extended to study a variety of pure and doped atomic clusters, irrespective of the chemical identities of the atoms. For the FPGA implementation of the properties, we use a two-region approach for accurately computing the properties over the entire domain, and employ deep pipelines and fixed-point arithmetic for all our calculations, guaranteeing the accuracy required for our simulation.
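The sample-and-average structure that HAQMC accelerates is easy to see in a toy variational Monte Carlo calculation (a 1D harmonic oscillator with a Gaussian trial function; an illustration only, since the real framework evaluates helium-cluster trial wave functions on FPGAs):

    import numpy as np

    rng = np.random.default_rng(5)
    alpha, x, E = 0.4, 0.0, []                  # variational parameter, walker, samples
    for i in range(50000):                      # Metropolis sampling of |psi|^2
        trial = x + rng.uniform(-1.0, 1.0)
        if rng.random() < np.exp(-2*alpha*(trial**2 - x**2)):   # psi = exp(-alpha*x^2)
            x = trial
        if i > 5000:                            # discard equilibration steps
            E.append(alpha + x*x*(0.5 - 2*alpha*alpha))         # local energy
    print(np.mean(E))    # near, and above, the exact ground-state energy 0.5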
Zevin, Boris; Levy, Jeffrey S; Satava, Richard M; Grantcharov, Teodor P
2012-10-01
Simulation-based training can improve technical and nontechnical skills in surgery. To date, there is no consensus on the principles for design, validation, and implementation of a simulation-based surgical training curriculum. The aim of this study was to define such principles and formulate them into an interoperable framework using international expert consensus based on the Delphi method. Literature was reviewed, 4 international experts were queried, and a consensus conference of national and international members of surgical societies was held to identify the items for the Delphi survey. Forty-five international experts in surgical education were invited to complete the online survey by ranking each item on a Likert scale from 1 to 5. Consensus was predefined as Cronbach's α ≥ 0.80. Items that 80% of experts ranked as ≥4 were included in the final framework. Twenty-four international experts with training in general surgery (n = 11), orthopaedic surgery (n = 2), obstetrics and gynecology (n = 3), urology (n = 1), plastic surgery (n = 1), pediatric surgery (n = 1), otolaryngology (n = 1), vascular surgery (n = 1), military (n = 1), and doctorate-level educators (n = 2) completed the iterative online Delphi survey. Consensus among participants was achieved after one round of the survey (Cronbach's α = 0.91). The final framework included predevelopment analysis; cognitive, psychomotor, and team-based training; curriculum validation, evaluation, and improvement; and maintenance of training. The Delphi methodology allowed for determination of international expert consensus on the principles for design, validation, and implementation of a simulation-based surgical training curriculum. These principles were formulated into a framework that can be used internationally across surgical specialties as a step-by-step guide for the development and validation of future simulation-based training curricula. Copyright © 2012 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
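The consensus arithmetic is straightforward to reproduce (the ratings below are synthetic; the Cronbach's α formula is standard, and the 80%-of-experts inclusion rule follows the text):

    import numpy as np

    rng = np.random.default_rng(2)
    # 24 experts x 10 items, Likert ratings drawn from {3, 4, 5}
    ratings = rng.choice([3, 4, 5], size=(24, 10), p=[0.15, 0.35, 0.5])

    def cronbach_alpha(X):
        # standard formula: alpha = k/(k-1) * (1 - sum(item variances)/total variance)
        k = X.shape[1]
        item_vars = X.var(axis=0, ddof=1).sum()
        total_var = X.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    alpha = cronbach_alpha(ratings)
    keep = (ratings >= 4).mean(axis=0) >= 0.80   # 80% of experts rank the item >= 4
    print(f"alpha={alpha:.2f}, items retained: {np.flatnonzero(keep)}")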
NASA Astrophysics Data System (ADS)
Christianson, D. S.; Varadharajan, C.; Detto, M.; Faybishenko, B.; Gimenez, B.; Jardine, K.; Negron Juarez, R. I.; Pastorello, G.; Powell, T.; Warren, J.; Wolfe, B.; McDowell, N. G.; Kueppers, L. M.; Chambers, J.; Agarwal, D.
2016-12-01
The U.S. Department of Energy's (DOE) Next Generation Ecosystem Experiment (NGEE) Tropics project aims to develop a process-rich tropical forest ecosystem model that is parameterized and benchmarked by field observations. Thus, data synthesis, quality assurance and quality control (QA/QC), and data product generation of a diverse and complex set of ecohydrological observations, including sapflux, leaf surface temperature, soil water content, and leaf gas exchange from sites across the Tropics, are required to support model simulations. We have developed a metadata reporting framework, implemented in conjunction with the NGEE Tropics Data Archive tool, to enable cross-site and cross-method comparison, data interpretability, and QA/QC. We employed a modified User-Centered Design approach, which involved short development cycles based on user-identified needs, and iterative testing with data providers and users. The metadata reporting framework currently has been implemented for sensor-based observations and leverages several existing metadata protocols. The framework consists of templates that define a multi-scale measurement position hierarchy, descriptions of measurement settings, and details about data collection and data file organization. The framework also enables data providers to define data-access permission settings, provenance, and referencing to enable appropriate data usage, citation, and attribution. In addition to describing the metadata reporting framework, we discuss tradeoffs and impressions from both data providers and users during the development process, focusing on the scalability, usability, and efficiency of the framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeffrey C. Joe; Diego Mandelli; Ronald L. Boring
2015-07-01
The United States Department of Energy is sponsoring the Light Water Reactor Sustainability program, which has the overall objective of supporting the near-term and the extended operation of commercial nuclear power plants. One key research and development (R&D) area in this program is the Risk-Informed Safety Margin Characterization pathway, which combines probabilistic risk simulation with thermohydraulic simulation codes to define and manage safety margins. The R&D efforts to date, however, have not included robust simulations of human operators, and how the reliability of human performance or lack thereof (i.e., human errors) can affect risk-margins and plant performance. This paper describes current and planned research efforts to address the absence of robust human reliability simulations and thereby increase the fidelity of simulated accident scenarios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Jin, Shuangshuang; Chen, Yousu
This paper presents a faster-than-real-time dynamic simulation software package that is designed for large-size power system dynamic simulation. It was developed on the GridPACK™ high-performance computing (HPC) framework. The key features of the developed software package include (1) faster-than-real-time dynamic simulation for a WECC system (17,000 buses) with different types of detailed generator, controller, and relay dynamic models, (2) a decoupled parallel dynamic simulation algorithm with optimized computation architecture to better leverage HPC resources and technologies, (3) options for HPC-based linear and iterative solvers, (4) hidden HPC details, such as data communication and distribution, to enable development centered on mathematical models and algorithms rather than on computational details for power system researchers, and (5) easy integration of new dynamic models and related algorithms into the software package.
Sookhak Lari, Kaveh; Johnston, Colin D; Rayner, John L; Davis, Greg B
2018-03-05
Remediation of subsurface systems, including groundwater, soil and soil gas, contaminated with light non-aqueous phase liquids (LNAPLs) is challenging. Field-scale pilot trials of multi-phase remediation were undertaken at a site to determine the effectiveness of recovery options. Sequential LNAPL skimming and vacuum-enhanced skimming, with and without water table drawdown, were trialled over 78 days; in total extracting over 5 m³ of LNAPL. For the first time, a multi-component simulation framework (including the multi-phase multi-component code TMVOC-MP and processing codes) was developed and applied to simulate the broad range of multi-phase remediation and recovery methods used in the field trials. This framework was validated against the sequential pilot trials by comparing predicted and measured LNAPL mass removal rates and compositional changes. The framework was tested on both a Cray supercomputer and a cluster. Simulations mimicked trends in LNAPL recovery rates (from 0.14 to 3 mL/s) across all remediation techniques, each operating over periods of 4-14 days over the 78-day trial. The code also approximated order-of-magnitude compositional changes of hazardous chemical concentrations in extracted gas during vacuum-enhanced recovery. The verified framework enables longer term prediction of the effectiveness of remediation approaches, allowing better determination of remediation endpoints and long-term risks. Copyright © 2017 Commonwealth Scientific and Industrial Research Organisation. Published by Elsevier B.V. All rights reserved.
Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; ...
2016-04-25
Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendenhall, M.; Bowden, N.; Brodsky, J.
Electron anti-neutrino (ν̄e) detectors can support nuclear safeguards, from reactor monitoring to spent fuel characterization. In recent years, the scientific community has developed multiple detector concepts, many of which have been prototyped or deployed for specific measurements by their respective collaborations. However, the diversity of technical approaches, deployment conditions, and analysis techniques complicates direct performance comparison between designs. We have begun development of a simulation framework to compare and evaluate existing and proposed detector designs for nonproliferation applications in a uniform manner. This report demonstrates the intent and capabilities of the framework by evaluating four detector design concepts, calculating generic reactor antineutrino counting sensitivity, and capabilities in a plutonium disposition application example.
Argonne simulation framework for intelligent transportation systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewing, T.; Doss, E.; Hanebutte, U.
1996-04-01
A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a stand-alone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented 'smart' vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that include human-factors studies to support safety and operational research. Realistic modeling of variations of the posted driving speed is based on human factors studies that take into consideration weather, road conditions, the driver's personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
Fundamentals of neurosurgery: virtual reality tasks for training and evaluation of technical skills.
Choudhury, Nusrat; Gélinas-Phaneuf, Nicholas; Delorme, Sébastien; Del Maestro, Rolando
2013-11-01
Technical skills training in neurosurgery is mostly done in the operating room. New educational paradigms are encouraging the development of novel training methods for surgical skills. Simulation could answer some of these needs. This article presents the development of a conceptual training framework for use on a virtual reality neurosurgical simulator. Appropriate tasks were identified by reviewing neurosurgical oncology curricula requirements and performing cognitive task analyses of basic techniques and representative surgeries. The tasks were then elaborated into training modules by including learning objectives, instructions, levels of difficulty, and performance metrics. Surveys and interviews were iteratively conducted with subject matter experts to delimitate, review, discuss, and approve each of the development stages. Five tasks were selected as representative of basic and advanced neurosurgical skill. These tasks were: 1) ventriculostomy, 2) endoscopic nasal navigation, 3) tumor debulking, 4) hemostasis, and 5) microdissection. The complete training modules were structured into easy, intermediate, and advanced settings. Performance metrics were also integrated to provide feedback on outcome, efficiency, and errors. The subject matter experts deemed the proposed modules pertinent and useful for neurosurgical skills training. The conceptual framework presented here, the Fundamentals of Neurosurgery, represents a first attempt to develop standardized training modules for technical skills acquisition in neurosurgical oncology. The National Research Council Canada is currently developing NeuroTouch, a virtual reality simulator for cranial microneurosurgery. The simulator presently includes the five Fundamentals of Neurosurgery modules at varying stages of completion. A first pilot study has shown that neurosurgical residents obtained higher performance scores on the simulator than medical students. Further work will validate its components and use in a training curriculum. Copyright © 2013 N. Choudhury. Published by Elsevier Inc. All rights reserved.
C³ and combat simulation - a survey
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, S.A. Jr.
1983-01-04
This article looks at the overlap between C³ and combat simulation, from the point of view of the developer of combat simulations and models. In this context, there are two different questions. The first is: how and to what extent should specific models of the C³ processes be incorporated in simulations of combat? Here the key point is the assessment of impact. In which types or levels of combat does C³ play a role sufficiently intricate and closely coupled with combat performance that it would significantly affect combat results? Conversely, when is C³ a known factor or modifier which can be simply accommodated without a specific detailed model being made for it? The second question is the inverse one. In the development of future C³ systems, what role should combat simulation play? Obviously, simulation of the operation of the hardware, software and other parts of the C³ system would be useful in its design and specification, but this is not combat simulation. When is it necessary to encase the C³ simulation model in a combat model which has enough detail to be considered a simulation itself? How should this outer combat model be scoped out as to the components needed? In order to build a background for answering these questions, a two-pronged approach will be taken. First, a framework for C³ modeling will be developed, in which the various types of modeling which can be done to include or encase C³ in a combat model are organized. This framework will hopefully be useful in describing the particular assumptions made in specific models in terms of what could be done in a more general way. Then a few specific models will be described, concentrating on the C³ portion of the simulations, or what could be interpreted as the C³ assumptions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, Nina; Ko, Teresa; Shneider, Max
Seldon is an agent-based social simulation framework that uniquely integrates concepts from a variety of different research areas including psychology, social science, and agent-based modeling. Development has been taking place for a number of years, previously focusing on gang and terrorist recruitment. The toolkit consists of simple agents (individuals) and abstract agents (groups of individuals representing social/institutional concepts) that interact according to exchangeable rule sets (i.e. linear attraction, linear reinforcement). Each agent has a set of customizable attributes that get modified during the interactions. Interactions create relationships between agents, and each agent has a maximum amount of relationship energy that it can expend. As relationships evolve, they form multiple levels of social networks (i.e. acquaintances, friends, cliques) that in turn drive future interactions. Agents can also interact randomly if they are not connected through a network, mimicking the chance interactions that real people have in everyday life. We are currently integrating Seldon with the cognitive framework (also developed at Sandia). Each individual agent has a lightweight cognitive model that is created automatically from textual sources. Cognitive information is exchanged during interactions, and can also be injected into a running simulation. The entire framework has been parallelized to allow for larger simulations in an HPC environment. We have also added more detail to the agents themselves (a "Big Five" personality model) and their interactions (an enhanced relationship model) for a more realistic representation.
WavePropaGator: interactive framework for X-ray free-electron laser optics design and simulations.
Samoylova, Liubov; Buzmakov, Alexey; Chubar, Oleg; Sinn, Harald
2016-08-01
This article describes the WavePropaGator (WPG) package, a new interactive software framework for coherent and partially coherent X-ray wavefront propagation simulations. The package has been developed at European XFEL for users at the existing and emerging free-electron laser (FEL) facilities, as well as at the third-generation synchrotron sources and future diffraction-limited storage rings. The WPG addresses the needs of beamline scientists and user groups to facilitate the design, optimization and improvement of X-ray optics to meet their experimental requirements. The package uses the Synchrotron Radiation Workshop (SRW) C/C++ library and its Python binding for numerical wavefront propagation simulations. The framework runs reliably under Linux, Microsoft Windows 7 and Apple Mac OS X and is distributed under an open-source license. The available tools allow for varying source parameters and optics layouts and visualizing the results interactively. The wavefront history structure can be used for tracking changes in every particular wavefront during propagation. The batch propagation mode enables processing of multiple wavefronts in workflow mode. The paper presents a general description of the package and gives some recent application examples, including modeling of full X-ray FEL beamlines and start-to-end simulation of experiments.
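As a textbook stand-in for the kind of computation WPG/SRW perform (this is explicitly not the WPG API, whose calls differ), a single free-space Fresnel propagation step via the transfer-function method looks like:

    import numpy as np

    n, dx, wl, z = 512, 1e-6, 1e-10, 1.0              # grid, pixel, wavelength, distance
    x = (np.arange(n) - n // 2) * dx
    X, Y = np.meshgrid(x, x)
    field = np.exp(-(X**2 + Y**2) / (2 * (20e-6)**2)).astype(complex)  # Gaussian beam

    fx = np.fft.fftfreq(n, dx)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wl * z * (FX**2 + FY**2))  # Fresnel transfer function
    out = np.fft.ifft2(np.fft.fft2(field) * H)          # propagated wavefront
    print(np.abs(out).max())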
Particle Number Dependence of the N-body Simulations of Moon Formation
NASA Astrophysics Data System (ADS)
Sasaki, Takanori; Hosono, Natsuki
2018-04-01
The formation of the Moon from the circumterrestrial disk has been investigated by using N-body simulations with the number N of particles limited to between 10⁴ and 10⁵. We develop an N-body simulation code on multiple PEZY-SC processors and deploy the Framework for Developing Particle Simulators to deal with a large number of particles. We execute several high- and extra-high-resolution N-body simulations of lunar accretion from a circumterrestrial disk of debris generated by a giant impact on Earth. The number of particles is up to 10⁷, in which one particle corresponds to a 10 km sized satellitesimal. We find that the spiral structures inside the Roche limit radius differ between low-resolution simulations (N ≤ 10⁵) and high-resolution simulations (N ≥ 10⁶). Owing to this difference, the angular momentum fluxes, which determine the accretion timescale of the Moon, also depend on the numerical resolution.
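The resolution dependence reported above is easiest to appreciate against the basic structure of a direct gravitational N-body integrator. The following Python sketch is purely illustrative and is not the authors' FDPS-based code; the softening length, disk setup, and all parameter values are invented for the example.

```python
import numpy as np

def accelerations(pos, mass, eps=1e-3):
    """Direct-summation gravitational accelerations with Plummer softening.
    O(N^2) for clarity; production codes use tree methods or FDPS-style
    parallelism. Units: G = 1."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        d = pos - pos[i]                      # vectors from particle i to all others
        r2 = np.sum(d * d, axis=1) + eps**2   # softened squared distances
        r2[i] = np.inf                        # exclude self-interaction
        acc[i] = np.sum((mass / r2**1.5)[:, None] * d, axis=0)
    return acc

def leapfrog_step(pos, vel, mass, dt):
    """One kick-drift-kick step; symplectic, so energy is well behaved
    over the long integrations accretion simulations require."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel

# Toy setup: a unit-mass central body plus a thin disk of light particles.
rng = np.random.default_rng(0)
n = 1000
r = 1.0 + 0.5 * rng.random(n)
th = 2 * np.pi * rng.random(n)
disk_pos = np.column_stack([r * np.cos(th), r * np.sin(th),
                            0.01 * rng.standard_normal(n)])
v = np.sqrt(1.0 / r)                          # near-circular Keplerian speeds
disk_vel = np.column_stack([-v * np.sin(th), v * np.cos(th), np.zeros(n)])
pos = np.vstack([np.zeros(3), disk_pos])
vel = np.vstack([np.zeros(3), disk_vel])
mass = np.concatenate([[1.0], np.full(n, 1e-8)])
for _ in range(10):
    pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)
```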
Ca-Pri a Cellular Automata Phenomenological Research Investigation: Simulation Results
NASA Astrophysics Data System (ADS)
Iannone, G.; Troisi, A.
2013-05-01
Following the introduction of a phenomenological cellular automata (CA) model capable of reproducing city growth and urban sprawl, we develop a toy-model simulation in a realistic setting. The main characteristic of our approach is an evolution algorithm based on inhabitants' preferences. The control of grown cells is obtained by means of suitable functions that depend on the initial conditions of the simulation. Newborn urban settlements are achieved by means of a logistic evolution of the urban pattern, while urban sprawl is controlled by means of the population evolution function. In order to compare the model results with a realistic urban setting, we have considered, as the area of study, the island of Capri (Italy) in the Mediterranean Sea. Two different phases of the urban evolution of the island have been taken into account: an initial growth induced by geographic suitability, and the urban spread after 1943 as driven by the population evolution after that date.
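The abstract does not give the Ca-Pri update functions, so the sketch below is only a hedged illustration of the general scheme it describes: urban cells grow outward on a suitability grid, with total growth capped by a logistic population curve. All names and parameter values here are invented.

```python
import numpy as np

def logistic_capacity(t, K=2000.0, r=0.05, n0=50.0):
    """Logistic population proxy: urban cells allowed at step t."""
    return int(K / (1.0 + (K / n0 - 1.0) * np.exp(-r * t)))

def grow(urban, suitability, t):
    """Convert the most suitable cells bordering the urban pattern,
    up to the logistic cap for step t (4-neighborhood adjacency)."""
    quota = logistic_capacity(t) - int(urban.sum())
    if quota <= 0:
        return urban
    nb = np.zeros_like(urban)
    nb[1:, :] |= urban[:-1, :]
    nb[:-1, :] |= urban[1:, :]
    nb[:, 1:] |= urban[:, :-1]
    nb[:, :-1] |= urban[:, 1:]
    candidates = np.argwhere(nb & ~urban)
    if len(candidates) == 0:
        return urban
    scores = suitability[candidates[:, 0], candidates[:, 1]]
    chosen = candidates[np.argsort(scores)[::-1][:quota]]  # preference-driven
    urban[chosen[:, 0], chosen[:, 1]] = True
    return urban

rng = np.random.default_rng(1)
suitability = rng.random((100, 100))   # stand-in for geographic suitability
urban = np.zeros((100, 100), dtype=bool)
urban[50, 50] = True                   # initial settlement seed
for t in range(100):
    urban = grow(urban, suitability, t)
print(int(urban.sum()), "urban cells after 100 steps")
```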
FACE-IT. A Science Gateway for Food Security Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montella, Raffaele; Kelly, David; Xiong, Wei
Progress in sustainability science is hindered by challenges in creating and managing complex data acquisition, processing, simulation, post-processing, and intercomparison pipelines. To address these challenges, we developed the Framework to Advance Climate, Economic, and Impact Investigations with Information Technology (FACE-IT) for crop and climate impact assessments. This integrated data processing and simulation framework enables data ingest from geospatial archives; data regridding, aggregation, and other processing prior to simulation; large-scale climate impact simulations with agricultural and other models, leveraging high-performance and cloud computing; and post-processing to produce aggregated yields and ensemble variables needed for statistics, for model intercomparison, and to connect biophysical models to global and regional economic models. FACE-IT leverages the capabilities of the Globus Galaxies platform to enable the capture of workflows and outputs in well-defined, reusable, and comparable forms. We describe FACE-IT and applications within the Agricultural Model Intercomparison and Improvement Project and the Center for Robust Decision-making on Climate and Energy Policy.
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.
2017-12-01
Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.
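As a rough illustration of what a uniform query interface over tiled velocity models can look like, here is a minimal Python sketch; the class names, fallback logic, and 1D background profile are our assumptions for the example, not the actual UCVM API.

```python
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class MaterialProperties:
    vp: float       # P-wave velocity (m/s)
    vs: float       # S-wave velocity (m/s)
    density: float  # kg/m^3

class VelocityModel:
    """One registered model; returns None outside its coverage."""
    def query(self, lon: float, lat: float, depth: float) -> Optional[MaterialProperties]:
        raise NotImplementedError

class Background1D(VelocityModel):
    """Deliberately simple 1D background: properties increase with depth (m)."""
    def query(self, lon, lat, depth):
        vs = 500.0 + 0.5 * depth
        return MaterialProperties(vp=1.8 * vs, vs=vs, density=1700.0 + 0.1 * depth)

class TiledQuery:
    """Standard interface over prioritized models: the first model that
    covers the point wins, with a 1D background model as the fallback."""
    def __init__(self, models: Sequence[VelocityModel], background: VelocityModel):
        self.models = list(models)
        self.background = background

    def query(self, lon, lat, depth) -> MaterialProperties:
        for m in self.models:
            props = m.query(lon, lat, depth)
            if props is not None:
                return props
        return self.background.query(lon, lat, depth)

ucvm_like = TiledQuery(models=[], background=Background1D())
print(ucvm_like.query(-118.0, 34.0, depth=1000.0))
```

A real framework adds geographic projections, model registration, and regular-grid or unstructured-mesh extraction on top of this basic first-model-wins tiling.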
2009-01-01
controllers (currently using the Robostix+Gumstix pair). The interface between the plant simulator and the controller is 'hard real-time', and the xPC box... simulation) on aerobatic maneuver design for the STARMAC quadrotor helicopter testbed. In related work, we have developed a new optimization scheme... for scheduling hybrid systems, and have demonstrated the results on an autonomous car simulation testbed. We are focusing efforts this summer for
A stochastic hybrid systems based framework for modeling dependent failure processes
Fan, Mengfei; Zeng, Zhiguo; Zio, Enrico; Kang, Rui; Chen, Ying
2017-01-01
In this paper, we develop a framework to model and analyze systems that are subject to dependent, competing degradation processes and random shocks. The degradation processes are described by stochastic differential equations, whereas transitions between the system's discrete states are triggered by random shocks. The modeling is then based on Stochastic Hybrid Systems (SHS), whose state space comprises a continuous state determined by stochastic differential equations and a discrete state driven by stochastic transitions and reset maps. A set of differential equations is derived to characterize the conditional moments of the state variables. System reliability and its lower bounds are estimated from these conditional moments, using the First Order Second Moment (FOSM) method and the Markov inequality, respectively. The developed framework is applied to model three dependent failure processes from the literature, and a comparison is made to Monte Carlo simulations. The results demonstrate that the developed framework is able to yield an accurate estimation of reliability at lower computational cost than traditional Monte Carlo-based methods. PMID:28231313
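For orientation, the two reliability estimates named in the abstract can be written out explicitly. With failure defined as a nonnegative continuous state X(t) exceeding a threshold ℓ, the first-order second-moment (FOSM) estimate uses only the conditional mean and variance, while the Markov inequality gives a distribution-free lower bound; the notation here is ours, not the paper's.

```latex
% FOSM estimate from the first two (conditional) moments of X(t):
\beta(t) = \frac{\ell - \mathbb{E}[X(t)]}{\sqrt{\operatorname{Var}[X(t)]}},
\qquad
R_{\mathrm{FOSM}}(t) \approx \Phi\bigl(\beta(t)\bigr).

% Distribution-free lower bound via the Markov inequality (X(t) \ge 0):
R(t) = \Pr\{X(t) < \ell\} \;\ge\; 1 - \frac{\mathbb{E}[X(t)]}{\ell}.
```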
A stochastic hybrid systems based framework for modeling dependent failure processes.
Fan, Mengfei; Zeng, Zhiguo; Zio, Enrico; Kang, Rui; Chen, Ying
2017-01-01
In this paper, we develop a framework to model and analyze systems that are subject to dependent, competing degradation processes and random shocks. The degradation processes are described by stochastic differential equations, whereas transitions between the system's discrete states are triggered by random shocks. The modeling is then based on Stochastic Hybrid Systems (SHS), whose state space comprises a continuous state determined by stochastic differential equations and a discrete state driven by stochastic transitions and reset maps. A set of differential equations is derived to characterize the conditional moments of the state variables. System reliability and its lower bounds are estimated from these conditional moments, using the First Order Second Moment (FOSM) method and the Markov inequality, respectively. The developed framework is applied to model three dependent failure processes from the literature, and a comparison is made to Monte Carlo simulations. The results demonstrate that the developed framework is able to yield an accurate estimation of reliability at lower computational cost than traditional Monte Carlo-based methods.
Caccavale, Justin; Fiumara, David; Stapf, Michael; Sweitzer, Liedeke; Anderson, Hannah J; Gorky, Jonathan; Dhurjati, Prasad; Galileo, Deni S
2017-12-11
Glioblastoma multiforme (GBM) is a devastating brain cancer for which there is no known cure. Its malignancy is due to rapid cell division along with high motility and invasiveness of cells into the brain tissue. Simple 2-dimensional laboratory assays (e.g., a scratch assay) are commonly used to measure the effects of various experimental perturbations, such as treatment with chemical inhibitors. Several mathematical models have been developed to aid the understanding of the motile behavior and proliferation of GBM cells. However, many are mathematically complicated, look at multiple interdependent phenomena, and/or use modeling software not freely available to the research community. These attributes make the adoption of models and simulations of even simple 2-dimensional cell behavior an uncommon practice by cancer cell biologists. Herein, we developed an accurate, yet simple, rule-based modeling framework to describe the in vitro behavior of GBM cells that are stimulated by the L1CAM protein, using the freely available NetLogo software. In our model, L1CAM is released by cells to act through two cell surface receptors and a point of signaling convergence to increase cell motility and proliferation. A simple graphical interface is provided so that changes can be made easily to several parameters controlling cell behavior, and the behavior of the cells is viewed both pictorially and with dedicated graphs. We fully describe the hierarchical rule-based modeling framework, show simulation results under several settings, describe the accuracy compared to experimental data, and discuss the potential usefulness for predicting future experimental outcomes and for use as a teaching tool for cell biology students. It is concluded that this simple modeling framework and its simulations accurately reflect much of the GBM cell motility behavior observed experimentally in vitro in the laboratory. Our framework can be modified easily to suit the needs of investigators interested in other similar intrinsic or extrinsic stimuli that influence cancer or other cell behavior. This modeling framework of a commonly used experimental motility assay (scratch assay) should be useful to both researchers of cell motility and students in a cell biology teaching laboratory.
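To make the rule-based structure concrete, here is a heavily simplified per-tick update of the kind such a model performs. The actual model runs in NetLogo; this Python sketch and all of its rules and parameter values (motility gain, division probability) are invented for illustration.

```python
import math
import random

class GBMCell:
    def __init__(self, x, y):
        self.x, self.y = x, y

def step(cells, l1cam, rng,
         base_speed=1.0, motility_gain=0.5,
         base_div=0.02, prolif_gain=0.01):
    """One tick: L1CAM signaling scales both motility and proliferation."""
    speed = base_speed * (1 + motility_gain * l1cam)
    p_div = base_div + prolif_gain * l1cam
    newborn = []
    for c in cells:
        ang = rng.uniform(0, 2 * math.pi)       # simple random-walk motility
        c.x += speed * math.cos(ang)
        c.y += speed * math.sin(ang)
        if rng.random() < p_div:                # division adds a daughter cell
            newborn.append(GBMCell(c.x, c.y))
    cells.extend(newborn)

rng = random.Random(0)
cells = [GBMCell(0.0, 0.0) for _ in range(100)]   # seeded "scratch" edge
for tick in range(50):
    step(cells, l1cam=1.0, rng=rng)
print(len(cells), "cells after 50 ticks")
```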
NASA Astrophysics Data System (ADS)
Varghese, Julian
This research work has contributed in various ways to a better understanding of textile composites and, more generally, of materials with complex microstructures. An instrumental part of this work was the development of an object-oriented framework that makes it convenient to perform multiscale/multiphysics analyses of advanced materials with complex microstructures such as textile composites. In addition to the studies conducted in this work, this framework lays the groundwork for continued research on these materials. The framework enabled a detailed multiscale stress analysis of a woven DCB specimen that revealed the effect of the complex microstructure on the stress and strain energy release rate distributions along the crack front. In addition to implementing an oxidation model, the framework was also used to implement strategies that expedited the simulation of oxidation in textile composites so that it takes only a few hours. The simulations showed that the tow architecture plays a significant role in the oxidation behavior of textile composites. Finally, a coupled diffusion/oxidation and damage progression analysis was implemented and used to study the mechanical behavior of textile composites under mechanical loading as well as oxidation. A parametric study was performed to determine the effect of material properties and of the number of plies in the laminate on its mechanical behavior. The analyses indicated a significant effect of the tow architecture and other parameters on the damage progression in the laminates.
Ong, Carmichael F; Hicks, Jennifer L; Delp, Scott L
2016-05-01
Technologies that augment human performance are the focus of intensive research and development, driven by advances in wearable robotic systems. Success has been limited by the challenge of understanding human-robot interaction. To address this challenge, we developed an optimization framework to synthesize a realistic human standing long jump and used the framework to explore how simulated wearable robotic devices might enhance jump performance. A planar, five-segment, seven-degree-of-freedom model with physiological torque actuators, which have variable torque capacity depending on joint position and velocity, was used to represent human musculoskeletal dynamics. An active augmentation device was modeled as a torque actuator that could apply a single pulse of up to 100 Nm of extension torque. A passive design was modeled as rotational springs about each lower limb joint. Dynamic optimization searched for physiological and device actuation patterns to maximize jump distance. Optimization of the nominal case yielded a 2.27 m jump that captured salient kinematic and kinetic features of human jumps. When the active device was added to the ankle, knee, or hip, jump distance increased to between 2.49 and 2.52 m. Active augmentation of all three joints increased the jump distance to 3.10 m. The passive design increased jump distance to 3.32 m by adding torques of 135, 365, and 297 Nm to the ankle, knee, and hip, respectively. Dynamic optimization can be used to simulate a standing long jump and investigate human-robot interaction. Simulation can aid in the design of performance-enhancing technologies.
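A common way to write such a physiological torque actuator (our notation; the paper's exact functional forms are not given in the abstract) is as an activation-scaled capacity that varies with joint angle and angular velocity:

```latex
\tau(t) \;=\; a(t)\,\tau_{\max}\; f_{\theta}\!\bigl(\theta(t)\bigr)\; f_{\dot\theta}\!\bigl(\dot\theta(t)\bigr),
\qquad 0 \le a(t) \le 1,
```

where f_θ and f_θ̇ are normalized shape functions analogous to muscle force-length and force-velocity curves, and the dynamic optimization searches over the activations a(t) together with any device torques to maximize jump distance.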
A multi-model framework for simulating wildlife population response to land-use and climate change
McRae, B.H.; Schumaker, N.H.; McKane, R.B.; Busing, R.T.; Solomon, A.M.; Burdick, C.A.
2008-01-01
Reliable assessments of how human activities will affect wildlife populations are essential for making scientifically defensible resource management decisions. A principal challenge of predicting effects of proposed management, development, or conservation actions is the need to incorporate multiple biotic and abiotic factors, including land-use and climate change, that interact to affect wildlife habitat and populations through time. Here we demonstrate how models of land-use, climate change, and other dynamic factors can be integrated into a coherent framework for predicting wildlife population trends. Our framework starts with land-use and climate change models developed for a region of interest. Vegetation changes through time under alternative future scenarios are predicted using an individual-based plant community model. These predictions are combined with spatially explicit animal habitat models to map changes in the distribution and quality of wildlife habitat expected under the various scenarios. Animal population responses to habitat changes and other factors are then projected using a flexible, individual-based animal population model. As an example application, we simulated animal population trends under three future land-use scenarios and four climate change scenarios in the Cascade Range of western Oregon. We chose two birds with contrasting habitat preferences for our simulations: winter wrens (Troglodytes troglodytes), which are most abundant in mature conifer forests, and song sparrows (Melospiza melodia), which prefer more open, shrubby habitats. We used climate and land-use predictions from previously published studies, as well as previously published predictions of vegetation responses using FORCLIM, an individual-based forest dynamics simulator. Vegetation predictions were integrated with other factors in PATCH, a spatially explicit, individual-based animal population simulator. Through incorporating effects of landscape history and limited dispersal, our framework predicted population changes that typically exceeded those expected based on changes in mean habitat suitability alone. Although land-use had greater impacts on habitat quality than did climate change in our simulations, we found that small changes in vital rates resulting from climate change or other stressors can have large consequences for population trajectories. The ability to integrate bottom-up demographic processes like these with top-down constraints imposed by climate and land-use in a dynamic modeling environment is a key advantage of our approach. The resulting framework should allow researchers to synthesize existing empirical evidence, and to explore complex interactions that are difficult or impossible to capture through piecemeal modeling approaches. © 2008 Elsevier B.V.
Numerical propulsion system simulation
NASA Technical Reports Server (NTRS)
Lytle, John K.; Remaklus, David A.; Nichols, Lester D.
1990-01-01
The cost of implementing new technology in aerospace propulsion systems is becoming prohibitively expensive. One of the major contributors to the high cost is the need to perform many large-scale system tests. Extensive testing is used to capture the complex interactions among the multiple disciplines and the multiple components inherent in complex systems. The objective of the Numerical Propulsion System Simulation (NPSS) is to provide insight into these complex interactions through computational simulations. This will allow for comprehensive evaluation of new concepts early in the design phase, before a commitment to hardware is made. It will also allow for rapid assessment of field-related problems, particularly in cases where operational problems were encountered during conditions that would be difficult to simulate experimentally. The tremendous progress taking place in computational engineering and the rapid increase in computing power expected through parallel processing make this concept feasible within the near future. However, it is critical that the framework for such simulations be put in place now to serve as a focal point for the continued developments in computational engineering and computing hardware and software. The NPSS concept described here will provide that framework.
NASA Astrophysics Data System (ADS)
Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian
2018-01-01
We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
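The core of any such photon transport code is a simple per-photon random walk: sample an exponential free path, move, then either absorb or scatter. The single-threaded Python sketch below is illustrative only (the platform described above is OpenCL-based and massively parallel), and the optical coefficients are arbitrary.

```python
import numpy as np

def propagate_photon(rng, mu_a=0.1, mu_s=10.0, n_max=10_000):
    """Random-walk one photon in an infinite homogeneous medium.
    mu_a, mu_s: absorption/scattering coefficients (1/mm).
    Returns the total path length traveled before absorption."""
    mu_t = mu_a + mu_s
    direction = np.array([0.0, 0.0, 1.0])
    total_path = 0.0
    for _ in range(n_max):
        step = -np.log(rng.random()) / mu_t       # exponential free path
        total_path += step
        if rng.random() < mu_a / mu_t:            # absorbed: terminate walk
            return total_path
        # Isotropic scattering (real tissue codes use Henyey-Greenstein).
        cos_t = 2 * rng.random() - 1
        sin_t = np.sqrt(1 - cos_t**2)
        phi = 2 * np.pi * rng.random()
        direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return total_path

rng = np.random.default_rng(42)
paths = [propagate_photon(rng) for _ in range(1000)]
print(f"mean path length: {np.mean(paths):.2f} mm")  # ~1/mu_a = 10 mm expected
```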
PROTO-PLASM: parallel language for adaptive and scalable modelling of biosystems.
Bajaj, Chandrajit; DiCarlo, Antonio; Paoluzzi, Alberto
2008-09-13
This paper discusses the design goals and the first developments of PROTO-PLASM, a novel computational environment to produce libraries of executable, combinable and customizable computer models of natural and synthetic biosystems, aiming to provide a supporting framework for predictive understanding of structure and behaviour through multiscale geometric modelling and multiphysics simulations. Admittedly, the PROTO-PLASM platform is still in its infancy. Its computational framework (language, model library, integrated development environment and parallel engine) intends to provide patient-specific computational modelling and simulation of organs and biosystems, exploiting novel functionalities resulting from the symbolic combination of parametrized models of parts at various scales. PROTO-PLASM may define the model equations, but it is currently focused on the symbolic description of model geometry and on the parallel support of simulations. Conversely, CellML and SBML could be viewed as defining the behavioural functions (the model equations) to be used within a PROTO-PLASM program. Here we exemplify the basic functionalities of PROTO-PLASM by constructing a schematic heart model. We also discuss multiscale issues with reference to the geometric and physical modelling of neuromuscular junctions.
Modeling ECCD/MHD coupling using NIMROD, GENRAY, and the Integrated Plasma Simulator
NASA Astrophysics Data System (ADS)
Jenkins, Thomas G.; Schnack, D. D.; Sovinec, C. R.; Hegna, C. C.; Callen, J. D.; Ebrahimi, F.; Kruger, S. E.; Carlsson, J.; Held, E. D.; Ji, J.-Y.; Harvey, R. W.; Smirnov, A. P.; Elwasif, W. R.
2009-11-01
We summarize ongoing theoretical/numerical work relevant to the development of a self-consistent framework for the inclusion of RF effects in fluid simulations; specifically, we consider the stabilization of resistive tearing modes in tokamak geometry by electron cyclotron current drive. In the fluid equations, ad hoc models for the RF-induced currents have previously been shown to shrink or altogether suppress the nonlinearly saturated magnetic islands generated by tearing modes; progress toward a self-consistent model is reported. The interfacing of the NIMROD [1] code with the GENRAY/CQL3D [2] codes (which calculate RF propagation and energy/momentum deposition) via the Integrated Plasma Simulator (IPS) framework [3] is explained, RF-induced rational surface motion and the equilibration of RF-induced currents over plasma flux surfaces are investigated, and the efficient reduction of saturated island widths through time modulation and spatial localization of the ECCD is explored. [1] Sovinec et al., JCP 195, 355 (2004) [2] www.compxco.com [3] Both the IPS development and the research presented here are part of the SWIM project. Funded by U.S. DoE.
Modeling of RF/MHD coupling using NIMROD, GENRAY, and the Integrated Plasma Simulator
NASA Astrophysics Data System (ADS)
Jenkins, Thomas; Schnack, D. D.; Sovinec, C. R.; Hegna, C. C.; Callen, J. D.; Ebrahimi, F.; Kruger, S. E.; Carlsson, J.; Held, E. D.; Ji, J.-Y.; Harvey, R. W.; Smirnov, A. P.
2009-05-01
We summarize ongoing theoretical/numerical work relevant to the development of a self-consistent framework for the inclusion of RF effects in fluid simulations, specifically considering resistive tearing mode stabilization in tokamak (DIII-D-like) geometry via ECCD. Relatively simple (though non-self-consistent) models for the RF-induced currents are incorporated into the fluid equations, markedly reducing the width of the nonlinearly saturated magnetic islands generated by tearing modes. We report our progress toward the self-consistent modeling of these RF-induced currents. The initial interfacing of the NIMROD* code with the GENRAY/CQL3D** codes (calculating RF propagation and energy/momentum deposition) via the Integrated Plasma Simulator (IPS) framework*** is explained, equilibration of RF-induced currents over the plasma flux surfaces is investigated, and studies exploring the efficient reduction of saturated island widths through time modulation and spatial localization of the ECCD are presented. *[Sovinec et al., JCP 195, 355 (2004)] **[www.compxco.com] ***[This research and the IPS development are both part of the SWIM project. Funded by U.S. DoE.]
M3MS-16OR0401086 – Report on NEAMS Workbench Support for MOOSE Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lefebvre, Robert A.; Langley, Brandon R.; Thompson, Adam B.
This report summarizes the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Workbench from Oak Ridge National Laboratory (ORNL) and the integration of the MOOSE framework, marking the completion of NEAMS milestone M3MS-16OR0401086. The report documents the infrastructure developed to support MOOSE framework applications, the applications' results, the visualization status, the collaboration that facilitated this progress, and future considerations.
"Nuclear Deterrence" as an Adaptive Game Frame for Crisis Decision-Making.
ERIC Educational Resources Information Center
Sorenson, David S.
1981-01-01
Describes the simulation game "Nuclear Deterrence," which was developed to model an international relations crisis situation involving a bargaining framework potentially applicable to crisis modeling in other disciplines. Eight references are listed. (Author/LLS)
NASA Technical Reports Server (NTRS)
Rudraraju, Siva Shankar; Garikipati, Krishna; Waas, Anthony M.; Bednarcyk, Brett A.
2013-01-01
The phenomenon of crack propagation is among the predominant modes of failure in many natural and engineering structures, often leading to severe loss of structural integrity and catastrophic failure. Thus, the ability to understand and a priori simulate the evolution of this failure mode has been one of the cornerstones of applied mechanics and structural engineering and is broadly referred to as "fracture mechanics." The work reported herein focuses on extending this understanding, in the context of through-thickness crack propagation in cohesive materials, through the development of a continuum-level multiscale numerical framework, which represents cracks as displacement discontinuities across a surface of zero measure. This report presents the relevant theory, mathematical framework, numerical modeling, and experimental investigations of through-thickness crack propagation in fiber-reinforced composites using the Variational Multiscale Cohesive Method (VMCM) developed by the authors.
Simulation of bright-field microscopy images depicting pap-smear specimen
Malm, Patrik; Brun, Anders; Bengtsson, Ewert
2015-01-01
As digital imaging is becoming a fundamental part of medical and biomedical research, the demand for computer-based evaluation using advanced image analysis is becoming an integral part of many research projects. A common problem when developing new image analysis algorithms is the need for large datasets with ground truth on which the algorithms can be tested and optimized. Generating such datasets is often tedious and introduces subjectivity and interindividual and intraindividual variations. An alternative to manually created ground-truth data is to generate synthetic images where the ground truth is known. The challenge then is to make the images sufficiently similar to the real ones to be useful in algorithm development. One of the first and most widely studied medical image analysis tasks is to automate screening for cervical cancer through Pap-smear analysis. As part of an effort to develop a new generation cervical cancer screening system, we have developed a framework for the creation of realistic synthetic bright-field microscopy images that can be used for algorithm development and benchmarking. The resulting framework has been assessed through a visual evaluation by experts with extensive experience of Pap-smear images. The results show that images produced using our described methods are realistic enough to be mistaken for real microscopy images. The developed simulation framework is very flexible and can be modified to mimic many other types of bright-field microscopy images. © 2015 The Authors. Published by Wiley Periodicals, Inc. on behalf of ISAC. PMID:25573002
Modeling Complex Biological Flows in Multi-Scale Systems using the APDEC Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trebotich, D
We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short-range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.
Modeling complex biological flows in multi-scale systems using the APDEC framework
NASA Astrophysics Data System (ADS)
Trebotich, David
2006-09-01
We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short-range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.
Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)
NASA Technical Reports Server (NTRS)
Gray, Justin S.; Briggs, Jeffery L.
2008-01-01
The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as Design of Experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and the object structure are presented here in enough detail to implement ROSE in any object-oriented language or modeling tool.
A QoS Framework with Traffic Request in Wireless Mesh Network
NASA Astrophysics Data System (ADS)
Fu, Bo; Huang, Hejiao
In this paper, we consider major issues in ensuring greater Quality-of-Service (QoS) in Wireless Mesh Networks (WMNs), specifically with regard to reliability and delay. To this end, we use traffic requests to record the QoS requirements of data flows. In order to achieve the required QoS for all data flows efficiently and with high portability, we develop a Network State Update Algorithm. All assumptions, definitions, and algorithms are made exclusively with WMNs in mind, guaranteeing the portability of our framework to various WMN environments. The simulation results confirm that our framework is correct.
Development of the CELSS Emulator at NASA JSC
NASA Technical Reports Server (NTRS)
Cullingford, Hatice S.
1989-01-01
The Controlled Ecological Life Support System (CELSS) Emulator is under development at the NASA Johnson Space Center (JSC) to investigate computer simulations of integrated CELSS operations involving humans, plants, and process machinery. This paper describes Version 1.0 of the CELSS Emulator, initiated in 1988 on the JSC Multi Purpose Applications Console Test Bed as the simulation framework. The run module of the simulation system now contains a CELSS model called BLSS. The CELSS Emulator makes it possible to generate model data sets, store libraries of results for further analysis, and display plots of model variables as a function of time. The progress of the project is presented with sample test runs and simulation display pages.
An integrated computational tool for precipitation simulation
NASA Astrophysics Data System (ADS)
Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.
2011-07-01
Computer-aided materials design is of increasing interest because the conventional approach, relying solely on experimentation, is no longer viable within the constraints of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus in accelerating their development. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain the accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.
NASA Astrophysics Data System (ADS)
Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C. J.
2016-11-01
Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
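The ensemble Kalman analysis step at the heart of such a framework can be summarized in standard (not paper-specific) notation: each forecast ensemble member x_j^f of the Reynolds-stress discrepancy parameters is nudged toward the (perturbed) observations y_j.

```latex
x_j^{a} \;=\; x_j^{f} + K\bigl(y_j - H x_j^{f}\bigr),
\qquad
K \;=\; P^{f} H^{\mathsf T}\bigl(H P^{f} H^{\mathsf T} + R\bigr)^{-1},
```

where P^f is the sample covariance of the forecast ensemble, H maps parameters to the observed velocities, and R is the observation-error covariance; iterating this update with forward model evaluations yields the posterior distributions of velocities and other QoIs referred to above.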
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolly, S; Mutic, S; Anastasio, M
Purpose: Traditionally, image quality in radiation therapy is assessed subjectively or by utilizing physically-based metrics. Some model observers exist for task-based medical image quality assessment, but almost exclusively for diagnostic imaging tasks. As opposed to disease diagnosis, the task for image observers in radiation therapy is to utilize the available images to design and deliver a radiation dose which maximizes patient disease control while minimizing normal tissue damage. The purpose of this study was to design and implement a new computer simulation model observer to enable task-based image quality assessment in radiation therapy. Methods: A modular computer simulation framework was developed to resemble the radiotherapy observer by simulating an end-to-end radiation therapy treatment. Given images and the ground-truth organ boundaries from a numerical phantom as inputs, the framework simulates an external beam radiation therapy treatment and quantifies patient treatment outcomes using the previously defined therapeutic operating characteristic (TOC) curve. As a preliminary demonstration, TOC curves were calculated for various CT acquisition and reconstruction parameters, with the goal of assessing and optimizing simulation CT image quality for radiation therapy. Sources of randomness and bias within the system were analyzed. Results: The relationship between CT imaging dose and patient treatment outcome was objectively quantified in terms of a singular value, the area under the TOC (AUTOC) curve. The AUTOC decreases more rapidly for low-dose imaging protocols. AUTOC variation introduced by the dose optimization algorithm was approximately 0.02%, at the 95% confidence interval. Conclusion: A model observer has been developed and implemented to assess image quality based on radiation therapy treatment efficacy. It enables objective determination of appropriate imaging parameter values (e.g. imaging dose). Framework flexibility allows for incorporation of additional modules to include any aspect of the treatment process, and therefore has great potential for both assessment and optimization within radiation therapy.
A Framework for the Optimization of Discrete-Event Simulation Models
NASA Technical Reports Server (NTRS)
Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.
1996-01-01
With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Some unique aspects have to be addressed while optimizing via stochastic simulation models. The optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for the optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
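In generic form (our notation, not the paper's), a chance-constraint formulation requires a stochastic simulation response to meet its target with prescribed probability:

```latex
\min_{x} \;\; \mathbb{E}\bigl[f(x,\omega)\bigr]
\quad \text{subject to} \quad
\Pr\bigl\{\, g(x,\omega) \le b \,\bigr\} \;\ge\; 1 - \alpha,
```

where ω captures the randomness of a terminating simulation run, the probability is estimated statistically from replicated runs, and α is the allowed risk of violating the constraint.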
Faunus: An object oriented framework for molecular simulation
Lund, Mikael; Trulsson, Martin; Persson, Björn
2008-01-01
Background We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented by extensive use of code comments that are subsequently collected to automatically build a web-based manual. Results We show how an object-oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high-level coding abstraction. Conclusion C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained. PMID:18241331
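Faunus itself is C++; as a language-neutral illustration of the Metropolis Monte Carlo loop underlying such a library (e.g., for sampling protonation states), here is a Python sketch with an invented two-parameter energy function. Nothing below reflects the actual Faunus API.

```python
import math
import random

def metropolis(energy, state, propose, beta=1.0, n_steps=10_000, rng=None):
    """Generic Metropolis MC: accept a move with probability min(1, exp(-beta*dE))."""
    rng = rng or random.Random(0)
    e = energy(state)
    for _ in range(n_steps):
        trial = propose(state, rng)
        e_trial = energy(trial)
        if e_trial <= e or rng.random() < math.exp(-beta * (e_trial - e)):
            state, e = trial, e_trial
    return state, e

# Toy system: n titratable sites; protonated sites (1) pay an intrinsic
# free-energy cost mu and a pairwise repulsion J (invented units of kT).
def energy(state):
    mu, J = 1.0, 0.2
    n_on = sum(state)
    return mu * n_on + J * n_on * (n_on - 1) / 2

def propose(state, rng):
    s = list(state)
    i = rng.randrange(len(s))
    s[i] = 1 - s[i]            # flip one site's protonation state
    return s

final_state, final_energy = metropolis(energy, [0] * 10, propose)
print(final_state, final_energy)
```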
Thermostating extended Lagrangian Born-Oppenheimer molecular dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martínez, Enrique; Cawkwell, Marc J.; Voter, Arthur F.
Here, Extended Lagrangian Born-Oppenheimer molecular dynamics is developed and analyzed for applications in canonical (NVT) simulations. Three different approaches are considered: the Nosé and Andersen thermostats and Langevin dynamics. We have tested the temperature distribution under different conditions of self-consistent field (SCF) convergence and time step and compared the results to analytical predictions. We find that the simulations based on the extended Lagrangian Born-Oppenheimer framework provide accurate canonical distributions even under approximate SCF convergence, often requiring only a single diagonalization per time step, whereas regular Born-Oppenheimer formulations exhibit unphysical fluctuations unless a sufficiently high degree of convergence is reached at each time step. The thermostated extended Lagrangian framework thus offers an accurate approach to sample processes in the canonical ensemble at a fraction of the computational cost of regular Born-Oppenheimer molecular dynamics simulations.
Thermostating extended Lagrangian Born-Oppenheimer molecular dynamics
Martínez, Enrique; Cawkwell, Marc J.; Voter, Arthur F.; ...
2015-04-21
Here, Extended Lagrangian Born-Oppenheimer molecular dynamics is developed and analyzed for applications in canonical (NVT) simulations. Three different approaches are considered: the Nosé and Andersen thermostats and Langevin dynamics. We have tested the temperature distribution under different conditions of self-consistent field (SCF) convergence and time step and compared the results to analytical predictions. We find that the simulations based on the extended Lagrangian Born-Oppenheimer framework provide accurate canonical distributions even under approximate SCF convergence, often requiring only a single diagonalization per time step, whereas regular Born-Oppenheimer formulations exhibit unphysical fluctuations unless a sufficiently high degree of convergence is reached at each time step. The thermostated extended Lagrangian framework thus offers an accurate approach to sample processes in the canonical ensemble at a fraction of the computational cost of regular Born-Oppenheimer molecular dynamics simulations.
eScience for molecular-scale simulations and the eMinerals project.
Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H
2009-03-13
We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include the integration of compute and data systems, the development of collaborative frameworks, and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting and metadata management. A number of diverse science applications illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data and collaboration is essential.
NASA Astrophysics Data System (ADS)
Curtis, Christopher; Lenzo, Matthew; McClure, Matthew; Preiss, Bruce
2010-04-01
In order to anticipate the constantly changing landscape of global warfare, the United States Air Force must acquire new capabilities in the field of Intelligence, Surveillance, and Reconnaissance (ISR). To meet this challenge, the Air Force Research Laboratory (AFRL) is developing a unifying construct of "Layered Sensing" which will provide military decision-makers at all levels with the timely, actionable, and trusted information necessary for complete battlespace awareness. Layered Sensing is characterized by the appropriate combination of sensors and platforms (including those for persistent sensing), infrastructure, and exploitation capabilities to enable this synergistic awareness. To achieve the Layered Sensing vision, AFRL is pursuing a Modeling & Simulation (M&S) strategy through the Layered Sensing Operations Center (LSOC). An experimental ISR system-of-systems test-bed, the LSOC integrates DoD standard simulation tools with commercial, off-the-shelf video game technology for rapid scenario development and visualization. These tools will help facilitate sensor management performance characterization, system development, and operator behavioral analysis. Flexible and cost-effective, the LSOC will implement a non-proprietary, open-architecture framework with well-defined interfaces. This framework will incentivize the transition of current ISR performance models to service-oriented software design for maximum re-use and consistency. This paper will present the LSOC's development and implementation thus far as well as a summary of lessons learned and future plans for the LSOC.
A continuum treatment of sliding in Eulerian simulations of solid-solid and solid-fluid interfaces
NASA Astrophysics Data System (ADS)
Subramaniam, Akshay; Ghaisas, Niranjan; Lele, Sanjiva
2017-11-01
A novel treatment of sliding is developed for use in an Eulerian framework for simulating elastic-plastic deformations of solids coupled with fluids. In this method, embedded interfacial boundary conditions for perfect sliding are imposed by enforcing the interface normal to be a principal direction of the Cauchy stress, and appropriate consistency conditions ensure correct transmission and reflection of waves at the interface. This sliding treatment may be used either to simulate a solid-solid sliding interface or to incorporate an internal slip boundary condition at a solid-fluid interface. Sliding laws like the Coulomb friction law can also be incorporated with relative ease into this framework. Simulations of sliding interfaces are conducted using a 10th-order compact finite difference scheme and a Localized Artificial Diffusivity (LAD) scheme for shock and interface capturing. 1D and 2D simulations are used to assess the accuracy of the sliding treatment. The Richtmyer-Meshkov instability between copper and aluminum is simulated with this sliding treatment as a demonstration test case. Support for this work was provided through Grant B612155 from the Lawrence Livermore National Laboratory, US Department of Energy.
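The perfect-sliding condition described above has a compact statement (our notation): with Cauchy stress σ and interface normal n, requiring n to be a principal direction is equivalent to a vanishing tangential traction at the interface,

```latex
\sigma\, n \;=\; \bigl(n^{\mathsf T} \sigma\, n\bigr)\, n
\quad\Longleftrightarrow\quad
\bigl(I - n\, n^{\mathsf T}\bigr)\, \sigma\, n \;=\; 0,
```

so only normal stress is transmitted across the interface, while tangential velocities on the two sides may differ freely.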
Miller, Brian W.; Morisette, Jeffrey T.
2014-01-01
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
ReSTART: A Novel Framework for Resource-Based Triage in Mass-Casualty Events.
Mills, Alex F; Argon, Nilay T; Ziya, Serhan; Hiestand, Brian; Winslow, James
2014-01-01
Current guidelines for mass-casualty triage do not explicitly use information about resource availability. Even though this limitation has been widely recognized, how it should be addressed remains largely unexplored. The authors present a novel framework developed using operations research methods to account for resource limitations when determining priorities for transportation of critically injured patients. To illustrate how this framework can be used, they also develop two specific example methods, named ReSTART and Simple-ReSTART, both of which extend the widely adopted triage protocol Simple Triage and Rapid Treatment (START) by using a simple calculation to determine priorities based on the relative scarcity of transportation resources. The framework is supported by three techniques from operations research: mathematical analysis, optimization, and discrete-event simulation. The authors' algorithms were developed using mathematical analysis and optimization and then extensively tested using 9,000 discrete-event simulations on three distributions of patient severity (representing low, random, and high acuity). For each incident, the expected number of survivors was calculated under START, ReSTART, and Simple-ReSTART. A web-based decision support tool was constructed to help providers make prioritization decisions in the aftermath of mass-casualty incidents based on ReSTART. In simulations, ReSTART resulted in significantly lower mortality than START regardless of which severity distribution was used (paired t test, p < .01). Mean decrease in critical mortality, the percentage of immediate and delayed patients who die, was 8.5% for the low-acuity distribution (range −2.2% to 21.1%), 9.3% for the random distribution (range −0.2% to 21.2%), and 9.1% for the high-acuity distribution (range −0.7% to 21.1%). Although the critical mortality improvement due to ReSTART was different for each of the three severity distributions, the variation was less than 1 percentage point, indicating that the ReSTART policy is relatively robust to different severity distributions. Taking resource limitations into account in mass-casualty triage has the potential to increase the expected number of survivors. Further validation is required before field implementation; however, the framework proposed here can serve as the foundation for future work in this area.
NASA Astrophysics Data System (ADS)
Johnston, J. M.
2013-12-01
Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease, in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM), which automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES), which manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE), which provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis. Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water quality Analysis and Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.
Monte Carlo simulation of inverse geometry x-ray fluoroscopy using a modified MC-GPU framework
Dunkerley, David A. P.; Tomkowiak, Michael T.; Slagowski, Jordan M.; McCabe, Bradley P.; Funk, Tobias; Speidel, Michael A.
2015-01-01
Scanning-Beam Digital X-ray (SBDX) is a technology for low-dose fluoroscopy that employs inverse geometry x-ray beam scanning. To assist with rapid modeling of inverse geometry x-ray systems, we have developed a Monte Carlo (MC) simulation tool based on the MC-GPU framework. MC-GPU version 1.3 was modified to implement a 2D array of focal spot positions on a plane, with individually adjustable x-ray outputs, each producing a narrow x-ray beam directed toward a stationary photon-counting detector array. Geometric accuracy and blurring behavior in tomosynthesis reconstructions were evaluated from simulated images of a 3D arrangement of spheres. The artifact spread function from simulation agreed with experiment to within 1.6% (rRMSD). Detected x-ray scatter fraction was simulated for two SBDX detector geometries and compared to experiments. For the current SBDX prototype (10.6 cm wide by 5.3 cm tall detector), x-ray scatter fraction measured 2.8–6.4% (18.6–31.5 cm acrylic, 100 kV), versus 2.1–4.5% in MC simulation. Experimental trends in scatter versus detector size and phantom thickness were observed in simulation. For dose evaluation, an anthropomorphic phantom was imaged using regular and regional adaptive exposure (RAE) scanning. The reduction in kerma-area-product resulting from RAE scanning was 45% in radiochromic film measurements, versus 46% in simulation. The integral kerma calculated from TLD measurement points within the phantom was 57% lower when using RAE, versus 61% lower in simulation. This MC tool may be used to estimate tomographic blur, detected scatter, and dose distributions when developing inverse geometry x-ray systems. PMID:26113765
Monte Carlo simulation of inverse geometry x-ray fluoroscopy using a modified MC-GPU framework.
Dunkerley, David A P; Tomkowiak, Michael T; Slagowski, Jordan M; McCabe, Bradley P; Funk, Tobias; Speidel, Michael A
2015-02-21
Scanning-Beam Digital X-ray (SBDX) is a technology for low-dose fluoroscopy that employs inverse geometry x-ray beam scanning. To assist with rapid modeling of inverse geometry x-ray systems, we have developed a Monte Carlo (MC) simulation tool based on the MC-GPU framework. MC-GPU version 1.3 was modified to implement a 2D array of focal spot positions on a plane, with individually adjustable x-ray outputs, each producing a narrow x-ray beam directed toward a stationary photon-counting detector array. Geometric accuracy and blurring behavior in tomosynthesis reconstructions were evaluated from simulated images of a 3D arrangement of spheres. The artifact spread function from simulation agreed with experiment to within 1.6% (rRMSD). Detected x-ray scatter fraction was simulated for two SBDX detector geometries and compared to experiments. For the current SBDX prototype (10.6 cm wide by 5.3 cm tall detector), x-ray scatter fraction measured 2.8-6.4% (18.6-31.5 cm acrylic, 100 kV), versus 2.1-4.5% in MC simulation. Experimental trends in scatter versus detector size and phantom thickness were observed in simulation. For dose evaluation, an anthropomorphic phantom was imaged using regular and regional adaptive exposure (RAE) scanning. The reduction in kerma-area-product resulting from RAE scanning was 45% in radiochromic film measurements, versus 46% in simulation. The integral kerma calculated from TLD measurement points within the phantom was 57% lower when using RAE, versus 61% lower in simulation. This MC tool may be used to estimate tomographic blur, detected scatter, and dose distributions when developing inverse geometry x-ray systems.
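As a quick numerical companion to the quantities reported above, the sketch below shows how a detected scatter fraction and a kerma-area-product (KAP) reduction are computed from tallies; the tally values are invented stand-ins chosen only to land near the reported ranges.

```python
def scatter_fraction(scatter_counts: float, primary_counts: float) -> float:
    """SF = S / (S + P): the fraction of detected events that scattered."""
    return scatter_counts / (scatter_counts + primary_counts)

def percent_reduction(baseline: float, modified: float) -> float:
    """Percent reduction of a dose metric (e.g., KAP) versus a baseline scan."""
    return 100.0 * (baseline - modified) / baseline

# Invented tallies for illustration:
print(f"SF = {scatter_fraction(4.5, 95.5):.1%}")          # ~4.5%, mid-range
print(f"KAP cut = {percent_reduction(10.0, 5.4):.0f}%")   # ~46%, as with RAE
```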
Parallelizing Timed Petri Net simulations
NASA Technical Reports Server (NTRS)
Nicol, David M.
1993-01-01
The possibility of using parallel processing to accelerate the simulation of Timed Petri Nets (TPNs) was studied. It was recognized that complex system development tools often transform system descriptions into TPNs or TPN-like models, which are then simulated to obtain information about system behavior. Viewed this way, it is important that the parallelization of TPNs be as automatic as possible, so that the parallelization can be embedded in the system design tool. Later years of the grant were devoted to the problem of joint performance and reliability analysis, exploring whether both types of analysis could be accomplished within a single framework. This final report summarizes the results of our studies. We believe that the problem of automatically parallelizing TPNs for MIMD architectures has been almost completely solved for a large and important class of problems. Our initial investigations into joint performance/reliability analysis are two-fold: we showed that Monte Carlo simulation with importance sampling offers promise of joint analysis in the context of a single tool, and we developed methods for the parallel simulation of general Continuous Time Markov Chains, a model framework within which joint performance/reliability models can be cast. However, much more work is needed to determine the scope and generality of these approaches. The results obtained in our two studies, future directions for this type of work, and a list of publications are included.
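To make the object of study concrete: a timed Petri net associates a firing delay with each transition, and simulating one amounts to event-driven bookkeeping of token counts. The following is a minimal sequential sketch (the report concerns automatic parallelization for MIMD machines, which this does not attempt); the example net, delays, and list-order conflict resolution are illustrative assumptions:

    import heapq

    # Each transition consumes one token from each input place when it fires,
    # holds them for a fixed delay, then deposits one token into each output
    # place. Conflicts are resolved in list order.
    class TimedPetriNet:
        def __init__(self, marking, transitions):
            self.marking = dict(marking)          # place -> token count
            self.transitions = transitions        # (name, inputs, outputs, delay)
            self.events = []                      # (completion_time, outputs)

        def enabled(self, inputs):
            return all(self.marking.get(p, 0) > 0 for p in inputs)

        def run(self, horizon):
            time = 0.0
            while time < horizon:
                # Start every transition that is currently enabled.
                for name, inputs, outputs, delay in self.transitions:
                    while self.enabled(inputs):
                        for p in inputs:
                            self.marking[p] -= 1
                        heapq.heappush(self.events, (time + delay, outputs))
                if not self.events:
                    break                         # deadlock: nothing in flight
                time, outputs = heapq.heappop(self.events)
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1
            return time, self.marking

    # Two-place cycle: a token alternates between 'idle' and 'busy'.
    net = TimedPetriNet(
        marking={"idle": 1, "busy": 0},
        transitions=[("start", ["idle"], ["busy"], 2.0),
                     ("finish", ["busy"], ["idle"], 3.0)],
    )
    print(net.run(horizon=20.0))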
Framework for multi-resolution analyses of advanced traffic management strategies.
DOT National Transportation Integrated Search
2016-11-01
Demand forecasting models and simulation models have been developed, calibrated, and used in isolation of each other. However, the advancement of transportation system technologies and strategies, the increase in the availability of data, and the unc...
Liang, Albert K.; Koniczek, Martin; Antonuk, Larry E.; El-Mohri, Youcef; Zhao, Qihua; Street, Robert A.; Lu, Jeng Ping
2017-01-01
Photon counting arrays (PCAs), defined as pixelated imagers which measure the absorbed energy of x-ray photons individually and record this information digitally, are of increasing clinical interest. A number of PCA prototypes with a 1 mm pixel-to-pixel pitch have recently been fabricated with polycrystalline silicon (poly-Si) — a thin-film technology capable of creating monolithic imagers of a size commensurate with human anatomy. In this study, analog and digital simulation frameworks were developed to provide insight into the influence of individual poly-Si transistors on pixel circuit performance — information that is not readily available through empirical means. The simulation frameworks were used to characterize the circuit designs employed in the prototypes. The analog framework, which determines the noise produced by individual transistors, was used to estimate energy resolution, as well as to identify which transistors contribute the most noise. The digital framework, which analyzes how well circuits function in the presence of significant variations in transistor properties, was used to estimate how fast a circuit can produce an output (referred to as output count rate). In addition, an algorithm was developed and used to estimate the minimum pixel pitch that could be achieved for the pixel circuits of the current prototypes. The simulation frameworks predict that the analog component of the PCA prototypes could have energy resolution as low as 8.9% FWHM at 70 keV; and the digital components should work well even in the presence of significant TFT variations, with the fastest component having output count rates as high as 3 MHz. Finally, based on conceivable improvements in the underlying fabrication process, the algorithm predicts that the 1 mm pitch of the current PCA prototypes could be reduced significantly, potentially to between ~240 and 290 μm. PMID:26878107
Liang, Albert K; Koniczek, Martin; Antonuk, Larry E; El-Mohri, Youcef; Zhao, Qihua; Street, Robert A; Lu, Jeng Ping
2016-03-07
Photon counting arrays (PCAs), defined as pixelated imagers which measure the absorbed energy of x-ray photons individually and record this information digitally, are of increasing clinical interest. A number of PCA prototypes with a 1 mm pixel-to-pixel pitch have recently been fabricated with polycrystalline silicon (poly-Si), a thin-film technology capable of creating monolithic imagers of a size commensurate with human anatomy. In this study, analog and digital simulation frameworks were developed to provide insight into the influence of individual poly-Si transistors on pixel circuit performance, information that is not readily available through empirical means. The simulation frameworks were used to characterize the circuit designs employed in the prototypes. The analog framework, which determines the noise produced by individual transistors, was used to estimate energy resolution, as well as to identify which transistors contribute the most noise. The digital framework, which analyzes how well circuits function in the presence of significant variations in transistor properties, was used to estimate how fast a circuit can produce an output (referred to as output count rate). In addition, an algorithm was developed and used to estimate the minimum pixel pitch that could be achieved for the pixel circuits of the current prototypes. The simulation frameworks predict that the analog component of the PCA prototypes could have energy resolution as low as 8.9% full width at half maximum (FWHM) at 70 keV; and the digital components should work well even in the presence of significant thin-film transistor (TFT) variations, with the fastest component having output count rates as high as 3 MHz. Finally, based on conceivable improvements in the underlying fabrication process, the algorithm predicts that the 1 mm pitch of the current PCA prototypes could be reduced significantly, potentially to between ~240 and 290 μm.
A Generalized Decision Framework Using Multi-objective Optimization for Water Resources Planning
NASA Astrophysics Data System (ADS)
Basdekas, L.; Stewart, N.; Triana, E.
2013-12-01
Colorado Springs Utilities (CSU) is currently engaged in an Integrated Water Resource Plan (IWRP) to address the complex planning scenarios, across multiple time scales, currently faced by CSU. The modeling framework developed for the IWRP uses a flexible data-centered Decision Support System (DSS) with a MODSIM-based modeling system to represent the operation of the current CSU raw water system, coupled with a state-of-the-art multi-objective optimization algorithm. Three basic components are required for the framework, which can be implemented for planning horizons ranging from seasonal to interdecadal. First, a water resources system model is required that is capable of reasonable system simulation to resolve performance metrics at the appropriate temporal and spatial scales of interest. The system model should be an existing simulation model, or one developed during the planning process with stakeholders, so that 'buy-in' has already been achieved. Second, a hydrologic scenario tool(s) capable of generating a range of plausible inflows for the planning period of interest is required. This may include paleo-informed or climate-change-informed sequences. Third, a multi-objective optimization model that can be wrapped around the system simulation model is required. The new generation of multi-objective optimization models does not require parameterization, which greatly reduces problem complexity. Bridging the gap between research and practice will be evident as we use a case study from CSU's planning process to demonstrate this framework with specific competing water management objectives. Careful formulation of objective functions, choice of decision variables, and system constraints will be discussed. Rather than treating results as theoretically Pareto optimal in a planning process, we use the powerful multi-objective optimization models as tools to more efficiently and effectively move out of the inferior decision space. The use of this framework will help CSU evaluate tradeoffs in a continually changing world.
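The phrase "move out of the inferior decision space" rests on Pareto dominance. A minimal sketch of a non-dominated filter for minimization objectives follows; the (cost, expected shortage) pairs are hypothetical, not CSU portfolio data:

    def dominates(a, b):
        # True if solution a is at least as good as b in every (minimized)
        # objective and strictly better in at least one.
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated(solutions):
        # Filter a list of objective vectors down to the Pareto front.
        return [s for s in solutions
                if not any(dominates(other, s) for other in solutions if other != s)]

    # Hypothetical (cost, expected shortage) pairs for candidate portfolios.
    candidates = [(100, 8.0), (120, 5.0), (150, 5.0), (90, 12.0), (130, 4.0)]
    print(non_dominated(candidates))
    # -> [(100, 8.0), (120, 5.0), (90, 12.0), (130, 4.0)]

Everything the filter removes, such as the (150, 5.0) portfolio above, is inferior to some retained alternative in every objective at once.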
NASA Astrophysics Data System (ADS)
Bechtold, S.; Höfle, B.
2016-06-01
In many technical domains of modern society, there is a growing demand for fast, precise and automatic acquisition of digital 3D models of a wide variety of physical objects and environments. Laser scanning is a popular and widely used technology to cover this demand, but it is also expensive and complex to use to its full potential. However, there might exist scenarios where the operation of a real laser scanner could be replaced by a computer simulation, in order to save time and costs. This includes scenarios like teaching and training of laser scanning, development of new scanner hardware and scanning methods, or generation of artificial scan data sets to support the development of point cloud processing and analysis algorithms. To test the feasibility of this idea, we have developed a highly flexible laser scanning simulation framework named Heidelberg LiDAR Operations Simulator (HELIOS). HELIOS is implemented as a Java library and split up into a core component and multiple extension modules. Extensible Markup Language (XML) is used to define scanner, platform and scene models and to configure the behaviour of modules. Modules were developed and implemented for (1) loading of simulation assets and configuration (i.e. 3D scene models, scanner definitions, survey descriptions etc.), (2) playback of XML survey descriptions, (3) TLS survey planning (i.e. automatic computation of recommended scanning positions) and (4) interactive real-time 3D visualization of simulated surveys. As a proof of concept, we show the results of two experiments: First, a survey planning test in a scene that was specifically created to evaluate the quality of the survey planning algorithm. Second, a simulated TLS scan of a crop field in a precision farming scenario. The results show that HELIOS fulfills its design goals.
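To illustrate the XML-driven configuration style described above, the sketch below parses a toy survey description with Python's standard library. The tag and attribute names are invented for illustration and are not the actual HELIOS schema:

    import xml.etree.ElementTree as ET

    # Illustrative survey description; element and attribute names are
    # hypothetical stand-ins for a scanner survey configuration.
    survey_xml = """
    <survey name="cropFieldDemo" scene="cropField.xml" scanner="tls_generic">
      <leg x="0.0" y="0.0" z="1.7" verticalRes="0.05" horizontalRes="0.05"/>
      <leg x="25.0" y="10.0" z="1.7" verticalRes="0.05" horizontalRes="0.05"/>
    </survey>
    """

    root = ET.fromstring(survey_xml)
    print(f"survey: {root.get('name')}, scene: {root.get('scene')}")
    for i, leg in enumerate(root.findall("leg")):
        pos = tuple(float(leg.get(k)) for k in ("x", "y", "z"))
        print(f"  scan position {i}: {pos}, "
              f"angular resolution {leg.get('verticalRes')} deg")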
Li, Tingting; Cheng, Zhengguo; Zhang, Le
2017-01-01
Since they provide a natural and flexible description of the nonlinear dynamic behavior of complex systems, agent-based models (ABMs) have been commonly used for immune system simulation. However, it is crucial for an ABM to obtain an appropriate estimation of the key model parameters by incorporating experimental data. In this paper, a systematic procedure for immune system simulation, integrating the ABM and a regression method under the framework of history matching, is developed. A novel parameter estimation method that incorporates the experimental data for the ABM simulator during the procedure is proposed. First, we employ the ABM as a simulator of the immune system. Then, a dimension-reduced generalized additive model (GAM) is trained as a statistical regression model on the input and output data of the ABM, playing the role of an emulator during history matching. Next, we reduce the input parameter space by introducing an implausibility measure to discard implausible input values. Finally, the model parameters are estimated with the particle swarm optimization (PSO) algorithm by fitting the experimental data among the non-implausible input values. A real Influenza A Virus (IAV) data set is employed to demonstrate the performance of the proposed method, and the results show that it not only has good fitting and predictive accuracy but also favorable computational efficiency. PMID:29194393
Li, Tingting; Cheng, Zhengguo; Zhang, Le
2017-12-01
Since they provide a natural and flexible description of the nonlinear dynamic behavior of complex systems, agent-based models (ABMs) have been commonly used for immune system simulation. However, it is crucial for an ABM to obtain an appropriate estimation of the key model parameters by incorporating experimental data. In this paper, a systematic procedure for immune system simulation, integrating the ABM and a regression method under the framework of history matching, is developed. A novel parameter estimation method that incorporates the experimental data for the ABM simulator during the procedure is proposed. First, we employ the ABM as a simulator of the immune system. Then, a dimension-reduced generalized additive model (GAM) is trained as a statistical regression model on the input and output data of the ABM, playing the role of an emulator during history matching. Next, we reduce the input parameter space by introducing an implausibility measure to discard implausible input values. Finally, the model parameters are estimated with the particle swarm optimization (PSO) algorithm by fitting the experimental data among the non-implausible input values. A real Influenza A Virus (IAV) data set is employed to demonstrate the performance of the proposed method, and the results show that it not only has good fitting and predictive accuracy but also favorable computational efficiency.
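A compressed sketch of this history-matching pipeline follows: a cheap surrogate is trained on simulator runs, an implausibility measure prunes the input space, and an optimizer fits the surviving region to data. The toy simulator, the polynomial surrogate standing in for the GAM emulator, and the 3-sigma implausibility cutoff are all illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(0)

    def simulator(theta):
        # Stand-in for an expensive ABM run: returns a summary output.
        return np.sin(3 * theta) + 0.5 * theta

    # Step 1: train a cheap emulator on a few simulator runs.
    design = np.linspace(0, 2, 15)
    runs = simulator(design)
    coef = np.polyfit(design, runs, deg=5)           # surrogate for the GAM
    def emulate(t):
        return np.polyval(coef, t)
    emu_var = np.mean((emulate(design) - runs) ** 2)  # crude emulator variance

    # Step 2: discard implausible inputs via an implausibility measure.
    z, obs_var = simulator(1.3) + 0.01, 0.01 ** 2     # 'experimental' datum
    grid = np.linspace(0, 2, 401)
    implaus = np.abs(z - emulate(grid)) / np.sqrt(emu_var + obs_var)
    plausible = grid[implaus < 3.0]                   # conventional 3-sigma cutoff

    # Step 3: particle swarm optimization restricted to plausible inputs.
    lo, hi = plausible.min(), plausible.max()
    pos, vel = rng.uniform(lo, hi, 20), np.zeros(20)
    pbest, pbest_f = pos.copy(), np.abs(z - emulate(pos))
    gbest = pbest[np.argmin(pbest_f)]
    for _ in range(100):
        r1, r2 = rng.random(20), rng.random(20)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        f = np.abs(z - emulate(pos))
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[np.argmin(pbest_f)]
    print(f"estimated parameter: {gbest:.3f} (truth near 1.3)")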
Computational discovery of metal-organic frameworks with high gas deliverable capacity
NASA Astrophysics Data System (ADS)
Bao, Yi
Metal-organic frameworks (MOFs) are a rapidly emerging class of nanoporous materials with largely tunable chemistry and diverse applications in gas storage, gas purification, catalysis, sensing, and drug delivery. Efforts have been made for decades to develop new MOFs with desirable properties, both experimentally and computationally. To guide experimental synthesis, we here develop a computational methodology to explore MOFs with high gas deliverable capacity. This de novo design procedure applies known chemical reactions, considers synthesizability and geometric requirements of organic linkers, and efficiently evolves a population of MOFs to optimize a desirable property. We identify 48 MOFs in nine networks with higher methane deliverable capacity at a 65-5.8 bar loading-delivery condition than the MOF-5 reference. In a more comprehensive work, we predict two sets of MOFs with high methane deliverable capacity at a 65-5.8 bar or a 35-5.8 bar loading-delivery condition. We also optimize a set of MOFs with high methane-accessible internal surface area to investigate the relationship between deliverable capacity and internal surface area. This methodology can be extended to MOFs with multiple types of linkers and multiple SBUs. Flexible MOFs may allow for sophisticated heat management strategies and may also provide higher gas deliverable capacity than rigid frameworks. We investigate flexible MOFs, such as the MIL-53 families and the Fe(bdp) and Co(bdp) analogs, to understand the structural phase transition of the frameworks and its influence on the heat of adsorption. Challenges of simulating a system with a flexible host structure and incoming guest molecules are discussed. Preliminary results from isotherm simulation using the hybrid MC/MD simulation scheme on MIL-53(Cr) are presented. Suggestions for proceeding to understand the free energy profile of flexible MOFs are provided.
Imposing a Lagrangian Particle Framework on an Eulerian Hydrodynamics Infrastructure in Flash
NASA Technical Reports Server (NTRS)
Dubey, A.; Daley, C.; ZuHone, J.; Ricker, P. M.; Weide, K.; Graziani, C.
2012-01-01
In many astrophysical simulations, both Eulerian and Lagrangian quantities are of interest. For example, in a galaxy cluster merger simulation, the intracluster gas can have Eulerian discretization, while dark matter can be modeled using particles. FLASH, a component-based scientific simulation code, superimposes a Lagrangian framework atop an adaptive mesh refinement Eulerian framework to enable such simulations. The discretization of the field variables is Eulerian, while the Lagrangian entities occur in many different forms including tracer particles, massive particles, charged particles in particle-in-cell mode, and Lagrangian markers to model fluid structure interactions. These widely varying roles for Lagrangian entities are possible because of the highly modular, flexible, and extensible architecture of the Lagrangian framework. In this paper, we describe the Lagrangian framework in FLASH in the context of two very different applications, Type Ia supernovae and galaxy cluster mergers, which use the Lagrangian entities in fundamentally different ways.
Imposing a Lagrangian Particle Framework on an Eulerian Hydrodynamics Infrastructure in FLASH
NASA Astrophysics Data System (ADS)
Dubey, A.; Daley, C.; ZuHone, J.; Ricker, P. M.; Weide, K.; Graziani, C.
2012-08-01
In many astrophysical simulations, both Eulerian and Lagrangian quantities are of interest. For example, in a galaxy cluster merger simulation, the intracluster gas can have Eulerian discretization, while dark matter can be modeled using particles. FLASH, a component-based scientific simulation code, superimposes a Lagrangian framework atop an adaptive mesh refinement Eulerian framework to enable such simulations. The discretization of the field variables is Eulerian, while the Lagrangian entities occur in many different forms including tracer particles, massive particles, charged particles in particle-in-cell mode, and Lagrangian markers to model fluid-structure interactions. These widely varying roles for Lagrangian entities are possible because of the highly modular, flexible, and extensible architecture of the Lagrangian framework. In this paper, we describe the Lagrangian framework in FLASH in the context of two very different applications, Type Ia supernovae and galaxy cluster mergers, which use the Lagrangian entities in fundamentally different ways.
2014-09-18
Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification. DISSERTATION. Jeffrey D. Morris. Presented to the Faculty, Department of Systems…
Understanding Islamist political violence through computational social simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watkins, Jennifer H; Mackerrow, Edward P; Patelli, Paolo G
Understanding the process that enables political violence is of great value in reducing the future demand for and support of violent opposition groups. Methods are needed that allow alternative scenarios and counterfactuals to be scientifically researched. Computational social simulation shows promise in developing 'computer experiments' that would be unfeasible or unethical in the real world. Additionally, the process of modeling and simulation reveals and challenges assumptions that may not be noted in theories, exposes areas where data is not available, and provides a rigorous, repeatable, and transparent framework for analyzing the complex dynamics of political violence. This paper demonstrates the computational modeling process using two simulation techniques: system dynamics and agent-based modeling. The benefits and drawbacks of both techniques are discussed. In developing these social simulations, we discovered that the social science concepts and theories needed to accurately simulate the associated psychological and social phenomena were lacking.
A generic framework to simulate realistic lung, liver and renal pathologies in CT imaging
NASA Astrophysics Data System (ADS)
Solomon, Justin; Samei, Ehsan
2014-11-01
Realistic three-dimensional (3D) mathematical models of subtle lesions are essential for many computed tomography (CT) studies focused on performance evaluation and optimization. In this paper, we develop a generic mathematical framework that describes the 3D size, shape, contrast, and contrast-profile characteristics of a lesion, as well as a method to create lesion models based on CT data of real lesions. Further, we implemented a technique to insert the lesion models into CT images in order to create hybrid CT datasets. This framework was used to create a library of realistic lesion models and corresponding hybrid CT images. The goodness of fit of the models was assessed using the coefficient of determination (R2) and the visual appearance of the hybrid images was assessed with an observer study using images of both real and simulated lesions and receiver operator characteristic (ROC) analysis. The average R2 of the lesion models was 0.80, implying that the models provide a good fit to real lesion data. The area under the ROC curve was 0.55, implying that the observers could not readily distinguish between real and simulated lesions. Therefore, we conclude that the lesion-modeling framework presented in this paper can be used to create realistic lesion models and hybrid CT images. These models could be instrumental in performance evaluation and optimization of novel CT systems.
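For reference, the two figures of merit used above can be computed as in this sketch: the coefficient of determination for goodness of fit, and the ROC area via the rank-sum statistic (assuming no tied scores). The observer labels and confidence scores below are hypothetical:

    import numpy as np

    def r_squared(y, y_fit):
        # Coefficient of determination of a model fit.
        ss_res = np.sum((y - y_fit) ** 2)
        ss_tot = np.sum((y - np.mean(y)) ** 2)
        return 1.0 - ss_res / ss_tot

    def auc(labels, scores):
        # Area under the ROC curve via the rank-sum (Mann-Whitney) statistic.
        labels, scores = np.asarray(labels), np.asarray(scores)
        ranks = scores.argsort().argsort() + 1        # 1-based ranks, no ties
        n_pos, n_neg = labels.sum(), (1 - labels).sum()
        return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

    y = np.array([1.0, 2.1, 2.9, 4.2])                # hypothetical lesion profile
    y_fit = np.array([1.1, 2.0, 3.0, 4.0])            # hypothetical model fit
    print(f"R^2 = {r_squared(y, y_fit):.2f}")

    # Hypothetical observer study: 1 = image contained a real lesion.
    labels = np.array([1, 1, 0, 1, 0, 0, 1, 0])
    scores = np.array([0.62, 0.41, 0.55, 0.70, 0.52, 0.45, 0.50, 0.66])
    print(f"AUC = {auc(labels, scores):.2f}")  # near 0.5: indistinguishable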
A probabilistic model framework for evaluating year-to-year variation in crop productivity
NASA Astrophysics Data System (ADS)
Yokozawa, M.; Iizumi, T.; Tao, F.
2008-12-01
Most models describing the relation between crop productivity and weather conditions have so far focused on mean changes in crop yield. To maintain a stable food supply in the face of abnormal weather as well as climate change, evaluating the year-to-year variation in crop productivity, rather than mean changes, is more essential. We here propose a new probabilistic model framework based on Bayesian inference and Monte Carlo simulation. As an example, we first introduce a model of paddy rice production in Japan, called PRYSBI (Process-based Regional rice Yield Simulator with Bayesian Inference; Iizumi et al., 2008). The model structure is the same as that of SIMRIW, which was developed and is used widely in Japan. The model includes three sub-models describing phenological development, biomass accumulation, and maturing of the rice crop, formulated to capture the response of the rice plant to weather conditions. The model was originally developed to predict rice growth and yield at the scale of a single paddy plot; we applied it to evaluate large-scale rice production while keeping the same model structure. To do so, we treated the parameters as stochastic variables. In order to let the model match actual yields at the larger scale, model parameters were determined from the agricultural statistical data of each prefecture of Japan together with weather data averaged over the region. The posterior probability distribution functions (PDFs) of the model parameters were obtained using Bayesian inference, with an MCMC (Markov Chain Monte Carlo) algorithm used to numerically solve Bayes' theorem. To evaluate year-to-year changes in rice growth and yield under this framework, we iterate simulations with sets of parameter values sampled from the estimated posterior PDF of each parameter and then take the ensemble mean weighted with the posterior PDFs. We will also present another example for maize productivity in China. The framework proposed here provides information on uncertainties, as well as on the possibilities and limitations of future improvements in crop models.
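A minimal sketch of this Bayesian machinery follows, with a toy yield model in place of PRYSBI and random-walk Metropolis in place of a production MCMC sampler; the weather data, observation noise, and flat prior are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(1)

    def crop_model(theta, weather):
        # Toy stand-in for a process-based yield model: yield vs. temperature.
        return theta * np.exp(-0.5 * ((weather - 22.0) / 5.0) ** 2)

    # Hypothetical regional statistics: observed yields and mean temperatures.
    temps = np.array([19.0, 21.5, 23.0, 24.5, 20.0])
    obs = crop_model(5.0, temps) + rng.normal(0, 0.1, temps.size)

    def log_post(theta, sigma=0.1):
        if theta <= 0:
            return -np.inf                        # flat prior on theta > 0
        resid = obs - crop_model(theta, temps)
        return -0.5 * np.sum((resid / sigma) ** 2)

    # Random-walk Metropolis sampling of the posterior PDF.
    theta, lp, samples = 1.0, log_post(1.0), []
    for _ in range(5000):
        prop = theta + rng.normal(0, 0.2)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    samples = np.array(samples[1000:])            # drop burn-in

    # Year-to-year variation: posterior-weighted ensemble of simulated yields.
    ensemble = crop_model(samples[:, None], temps[None, :])
    print("posterior mean of theta:", samples.mean().round(3))
    print("ensemble-mean yields:", ensemble.mean(axis=0).round(2))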
Simulation-based optimization framework for reuse of agricultural drainage water in irrigation.
Allam, A; Tawfik, A; Yoshimura, C; Fleifle, A
2016-05-01
A simulation-based optimization framework for agricultural drainage water (ADW) reuse has been developed through the integration of a water quality model (QUAL2Kw) and a genetic algorithm. This framework was applied to the Gharbia drain in the Nile Delta, Egypt, in summer and winter 2012. First, the water quantity and quality of the drain was simulated using the QUAL2Kw model. Second, uncertainty analysis and sensitivity analysis based on Monte Carlo simulation were performed to assess QUAL2Kw's performance and to identify the most critical variables for determination of water quality, respectively. Finally, a genetic algorithm was applied to maximize the total reuse quantity from seven reuse locations with the condition not to violate the standards for using mixed water in irrigation. The water quality simulations showed that organic matter concentrations are critical management variables in the Gharbia drain. The uncertainty analysis showed the reliability of QUAL2Kw to simulate water quality and quantity along the drain. Furthermore, the sensitivity analysis showed that the 5-day biochemical oxygen demand, chemical oxygen demand, total dissolved solids, total nitrogen and total phosphorous are highly sensitive to point source flow and quality. Additionally, the optimization results revealed that the reuse quantities of ADW can reach 36.3% and 40.4% of the available ADW in the drain during summer and winter, respectively. These quantities meet 30.8% and 29.1% of the drainage basin requirements for fresh irrigation water in the respective seasons. Copyright © 2016 Elsevier Ltd. All rights reserved.
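A toy version of the constrained maximization described above can be written as a real-coded genetic algorithm with a penalty for violating the mixed-water standard. The flows, BOD concentrations, blending mass balance, and penalty weight below are illustrative assumptions, not the Gharbia drain data or the QUAL2Kw model:

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical data for 7 reuse locations: available drainage flow (m3/s),
    # its BOD (mg/L), fresh flow available for blending, and the standard the
    # blended irrigation water must meet.
    q_max = np.array([2.0, 1.5, 3.0, 1.0, 2.5, 1.8, 2.2])
    c_drain = np.array([40.0, 55.0, 35.0, 60.0, 45.0, 50.0, 38.0])
    q_fresh, c_fresh, c_std = 4.0, 8.0, 30.0

    def fitness(q):
        # Simple mixing mass balance at each site, penalized when the blend
        # exceeds the irrigation standard.
        mixed = (c_drain * q + c_fresh * q_fresh) / (q + q_fresh)
        penalty = np.maximum(mixed - c_std, 0.0).sum()
        return q.sum() - 50.0 * penalty

    pop = rng.uniform(0, q_max, size=(60, 7))      # initial random population
    for gen in range(200):
        fit = np.array([fitness(ind) for ind in pop])
        new = []
        for _ in range(60):
            i, j = rng.integers(0, 60, 2)          # binary tournament, parent a
            a = pop[i] if fit[i] > fit[j] else pop[j]
            i, j = rng.integers(0, 60, 2)          # binary tournament, parent b
            b = pop[i] if fit[i] > fit[j] else pop[j]
            w = rng.random()                       # arithmetic crossover + mutation
            child = w * a + (1 - w) * b + rng.normal(0, 0.05, 7)
            new.append(np.clip(child, 0, q_max))
        pop = np.array(new)
    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print("reuse flows:", best.round(2), "| total:", round(best.sum(), 2))

With all constraints satisfiable in this toy setup, the GA pushes each flow toward its availability limit; the penalty weight trades off constraint violation against total reuse.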
A Systems Biology Framework for Modeling Metabolic Enzyme Inhibition of Mycobacterium Tuberculosis
2009-09-15
…opportunities for therapeutic intervention. Results: We developed a mathematical framework to simulate the effects on the growth of a pathogen when enzymes in… on the growth of M. tuberculosis in a medium whose carbon source was restricted to fatty acids, and that of the 5'-O-(N-salicylsulfamoyl) adenosine…
Arcade: A Web-Java Based Framework for Distributed Computing
NASA Technical Reports Server (NTRS)
Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)
2000-01-01
Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.
Drag Reduction of an Airfoil Using Deep Learning
NASA Astrophysics Data System (ADS)
Jiang, Chiyu; Sun, Anzhu; Marcus, Philip
2017-11-01
We reduced the drag of a 2D airfoil using deep learning methods, starting from a NACA-0012 airfoil. We created a database consisting of simulations of 2D external flow over randomly generated shapes, and then developed a machine learning framework to infer the external flow field for a given input shape. Past work utilizing machine learning in Computational Fluid Dynamics focused on estimating specific flow parameters; this work is novel in inferring entire flow fields. We further showed that learned flow patterns are transferable to cases that share certain similarities. This study illustrates the prospects of deeper integration of data-based modeling into current CFD simulation frameworks for faster flow inference and more accurate flow modeling.
Abiotic/biotic coupling in the rhizosphere: a reactive transport modeling analysis
Lawrence, Corey R.; Steefel, Carl; Maher, Kate
2014-01-01
A new generation of models is needed to adequately simulate patterns of soil biogeochemical cycling in response to changing global environmental drivers. For example, predicting the influence of climate change on soil organic matter storage and stability requires models capable of addressing the complex biotic/abiotic interactions of rhizosphere and weathering processes. Reactive transport modeling provides a powerful framework for simulating these interactions and their resulting influence on soil physical and chemical characteristics. Incorporation of organic reactions into an existing reactive transport model framework has yielded novel insights into soil weathering and development, but much more work is required to adequately capture root and microbial dynamics in the rhizosphere. This endeavor provides many advantages over traditional soil biogeochemical models but also many challenges.
DESCQA: An Automated Validation Framework for Synthetic Sky Catalogs
NASA Astrophysics Data System (ADS)
Mao, Yao-Yuan; Kovacs, Eve; Heitmann, Katrin; Uram, Thomas D.; Benson, Andrew J.; Campbell, Duncan; Cora, Sofía A.; DeRose, Joseph; Di Matteo, Tiziana; Habib, Salman; Hearin, Andrew P.; Bryce Kalmbach, J.; Krughoff, K. Simon; Lanusse, François; Lukić, Zarija; Mandelbaum, Rachel; Newman, Jeffrey A.; Padilla, Nelson; Paillas, Enrique; Pope, Adrian; Ricker, Paul M.; Ruiz, Andrés N.; Tenneti, Ananth; Vega-Martínez, Cristian A.; Wechsler, Risa H.; Zhou, Rongpu; Zu, Ying; The LSST Dark Energy Science Collaboration
2018-02-01
The use of high-quality simulated sky catalogs is essential for the success of cosmological surveys. The catalogs have diverse applications, such as investigating signatures of fundamental physics in cosmological observables, understanding the effect of systematic uncertainties on measured signals and testing mitigation strategies for reducing these uncertainties, aiding analysis pipeline development and testing, and survey strategy optimization. The list of applications is growing with improvements in the quality of the catalogs and the details that they can provide. Given the importance of simulated catalogs, it is critical to provide rigorous validation protocols that enable both catalog providers and users to assess the quality of the catalogs in a straightforward and comprehensive way. For this purpose, we have developed the DESCQA framework for the Large Synoptic Survey Telescope Dark Energy Science Collaboration as well as for the broader community. The goal of DESCQA is to enable the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. In this paper, we present the design concept and first implementation of DESCQA. In order to establish and demonstrate its full functionality we use a set of interim catalogs and validation tests. We highlight several important aspects, both technical and scientific, that require thoughtful consideration when designing a validation framework, including validation metrics and how these metrics impose requirements on the synthetic sky catalogs.
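The "common interface within an automated framework" idea can be sketched as follows: every validation test implements one method, and a driver runs all tests over all catalogs. Class and method names here are illustrative, not the actual DESCQA API:

    # Every validation test exposes run_on_single_catalog(); a driver loops
    # tests over an inhomogeneous set of catalogs and collects verdicts.
    class ValidationTest:
        name = "base"
        def run_on_single_catalog(self, catalog):
            raise NotImplementedError

    class StellarMassRangeTest(ValidationTest):
        name = "stellar_mass_range"
        def run_on_single_catalog(self, catalog):
            masses = catalog["stellar_mass"]
            ok = all(1e5 < m < 1e13 for m in masses)   # crude sanity bounds
            return "PASSED" if ok else "FAILED"

    def run_suite(catalogs, tests):
        results = {}
        for cat_name, catalog in catalogs.items():
            for test in tests:
                results[(cat_name, test.name)] = test.run_on_single_catalog(catalog)
        return results

    # Two toy 'synthetic catalogs' as plain dicts of columns.
    catalogs = {
        "mockA": {"stellar_mass": [2e9, 5e10, 7e11]},
        "mockB": {"stellar_mass": [3e4, 1e10]},        # out-of-range entry
    }
    for key, verdict in run_suite(catalogs, [StellarMassRangeTest()]).items():
        print(key, verdict)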
Simulating Sand Behavior through Terrain Subdivision and Particle Refinement
NASA Astrophysics Data System (ADS)
Clothier, M.
2013-12-01
Advances in computer graphics, GPUs, and parallel processing hardware have provided researchers with new methods to visualize scientific data. In fact, these advances have spurred new research opportunities between computer graphics and other disciplines, such as the Earth sciences. Through collaboration, Earth and planetary scientists have benefited by using these advances in hardware technology to process large amounts of data for visualization and analysis. At Oregon State University, we are collaborating with the Oregon Space Grant and IGERT Ecosystem Informatics programs to investigate techniques for simulating the behavior of sand. We have also been collaborating with the Jet Propulsion Laboratory's DARTS Lab to exchange ideas on our research. The DARTS Lab specializes in the simulation of planetary vehicles, such as the Mars rovers; one aspect of their work is testing these vehicles in a virtual 'sand box' to evaluate their performance in different environments. Our research builds upon this idea to create a sand simulation framework that allows for more complex and diverse environments. As a basis for our framework, we have focused on planetary environments, such as the harsh, sandy regions of Mars. To evaluate the framework, we have used simulated planetary vehicles, such as a rover, to gain insight into the performance and interaction between the surface sand and the vehicle. Simulating the vast number of individual sand particles and their interactions with each other has historically been a computationally complex problem. However, through the use of high-performance computing, we have developed a technique to subdivide physically active terrain regions across a large landscape. To achieve this, we only subdivide terrain regions where sand particles are actively interacting with another object or force, such as a rover wheel. This is similar to a Level of Detail (LOD) technique, except that the density of subdivisions is determined by proximity to the object or force interacting with the sand. For example, as a rover wheel moves forward and approaches a particular sand region, that region will continue to subdivide until individual sand particles are represented; conversely, if the rover wheel moves away, previously subdivided sand regions will recombine. Thus, individual sand particles are available when an interacting force is present but stored away when there is not. This technique allows many particles to be represented without the full computational complexity. We have also generalized these subdivision regions in our sand framework into any volumetric area suitable for use in the simulation, which allows for more compact subdivision regions and fine-tunes our framework so that more emphasis can be placed on regions of actively participating sand. We feel that this increases the framework's usefulness across scientific applications and can provide other research opportunities within the Earth and planetary sciences. Through continued collaboration with our academic partners, we continue to build upon our sand simulation framework and look for other opportunities to utilize this research.
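A minimal sketch of the proximity-driven subdivision idea (a quadtree here for brevity): cells keep splitting only while they are large relative to their distance from the interacting object, so particle-level detail exists only near, say, a rover wheel. The split criterion and dimensions are illustrative assumptions:

    def subdivide(cx, cy, size, wheel, min_size):
        # Return leaf cells (x, y, size), finer near the wheel position.
        dist = ((cx - wheel[0]) ** 2 + (cy - wheel[1]) ** 2) ** 0.5
        # Split while the cell is large relative to its distance from the wheel.
        if size > min_size and dist < 2.0 * size:
            half = size / 2.0
            leaves = []
            for dx in (-half / 2, half / 2):
                for dy in (-half / 2, half / 2):
                    leaves += subdivide(cx + dx, cy + dy, half, wheel, min_size)
            return leaves
        return [(cx, cy, size)]

    cells = subdivide(0.0, 0.0, 16.0, wheel=(3.0, 1.0), min_size=0.25)
    print(len(cells), "cells; finest:", min(c[2] for c in cells))

Moving the wheel and re-running the subdivision naturally recombines regions the wheel has left, since coarse cells far from it no longer meet the split criterion.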
Martin, Angel; Whiteman, C.D.
1999-01-01
Existing data on water levels, water use, water quality, and aquifer properties were used to construct a multilayer digital model to simulate flow in the aquifer system. The report describes the geohydrologic framework of the aquifer system, and the development, calibration, and sensitivity analysis of the ground-water-flow model, but it is primarily focused on the results of the simulations that show the natural flow of ground water throughout the regional aquifer system and the changes from the natural flow caused by development of ground-water supplies.
Quantitative modeling of soil genesis processes
NASA Technical Reports Server (NTRS)
Levine, E. R.; Knox, R. G.; Kerber, A. G.
1992-01-01
For fine spatial scale simulation, a model is being developed to predict changes in properties over short-, meso-, and long-term time scales within horizons of a given soil profile. Processes that control these changes can be grouped into five major process clusters: (1) abiotic chemical reactions; (2) activities of organisms; (3) energy balance and water phase transitions; (4) hydrologic flows; and (5) particle redistribution. Landscape modeling of soil development is possible using digitized soil maps associated with quantitative soil attribute data in a geographic information system (GIS) framework to which simulation models are applied.
Status of the MIND simulation and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cervera Villanueva, A.; Martin-Albo, J.; Laing, A.
2010-03-30
A realistic simulation of the Neutrino Factory detectors is required in order to fully understand the sensitivity of such a facility to the remaining parameters and degeneracies of the neutrino mixing matrix. Here described is the status of a modular software framework being developed to accommodate such a study. The results of initial studies of the reconstruction software and expected efficiency curves in the context of the golden channel are given.
Scalable High-order Methods for Multi-Scale Problems: Analysis, Algorithms and Application
2016-02-26
Keywords: simulation, domain decomposition, CFD, gappy data, estimation theory, and gap-tooth algorithm. The objective of this project was to develop a general CFD framework for multifidelity simulations to target multiscale problems but also resilience in…
NASA Astrophysics Data System (ADS)
Brekke, L. D.; Scott, J.; Ferguson, I. M.; Arnold, J.; Raff, D. A.; Webb, R. S.
2012-12-01
Water managers need to understand the applicability of climate projection information available for decision-support at the scale of their applications. Applicability depends on information reliability and relevance. This need to understand applicability stems from expectations that entities rationalize adaptation investments or decisions to delay investment. It is also occurring at a time when new global climate projections are being released through the World Climate Research Programme Coupled Model Intercomparison Project phase 5 (CMIP5), which introduces new information opportunities and interpretation challenges. This project involves an interagency collaboration to evaluate the applicability of CMIP5 projections for use in water and environmental resources planning. The overarching goal is to develop and demonstrate a framework that involves dual evaluations of relevance and reliability informing an ultimate discussion and judgment of applicability, which is expected to vary with decision-making context. The framework is being developed and demonstrated within the context of reservoir systems management in California's Sacramento and San Joaquin River basins. The relevance evaluation focuses on identifying the climate variables and statistical measures relevant to long-term management questions, which may depend on satisfying multiple objectives. Past studies' results are being considered in this evaluation, along with new results from system sensitivity analyses conducted through this effort. The reliability evaluation focuses on the CMIP5 climate models' ability to simulate past conditions relative to observed references. The evaluation is being conducted across the global domain using a large menu of climate variables and statistical measures, leveraging lessons learned from similar evaluations of CMIP3 climate models. The global focus addresses a broader project goal of producing a web resource that can serve reliability information to applicability discussions around the world, with evaluation results being served through a web-portal similar to that developed by NOAA/CIRES to serve CMIP3 information on future climate extremes (http://www.esrl.noaa.gov/psd/ipcc/extremes/). The framework concludes with an applicability discussion informed by relevance and reliability results. The goal is to observe the discussion process and identify features, choice points, and challenges that might be summarized and shared with other resource management groups facing applicability questions. This presentation will discuss the project framework and preliminary results. In addition to considering CMIP5 21st century projection information, the framework is being developed to support evaluation of CMIP5 decadal predictability experiment simulations and reconcile those simulations with 21st century projections. The presentation will also discuss implications of considering the applicability of bias-corrected and downscaled information within this framework.
A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations
NASA Astrophysics Data System (ADS)
Demir, I.; Agliamzanov, R.
2014-12-01
Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of millions of computers on the Internet, and to use them for running large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily allow visitors to volunteer their computing resources to contribute to running advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational pieces. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment.
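One building block mentioned above, relational queue management for volunteer nodes, can be sketched with SQLite as follows (in Python rather than the platform's JavaScript). The schema and column names are illustrative, not the platform's actual design:

    import sqlite3

    # Tasks are claimed inside a transaction so two volunteer nodes do not
    # receive the same piece of the simulation.
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE tasks (
        id INTEGER PRIMARY KEY, payload TEXT,
        status TEXT DEFAULT 'pending', worker TEXT, result TEXT)""")
    db.executemany("INSERT INTO tasks (payload) VALUES (?)",
                   [(f"subbasin-{i}",) for i in range(5)])

    def claim_task(worker_id):
        # Hand one pending task to a volunteer node and mark it running.
        with db:                                   # implicit transaction
            row = db.execute("SELECT id, payload FROM tasks "
                             "WHERE status = 'pending' LIMIT 1").fetchone()
            if row is None:
                return None
            db.execute("UPDATE tasks SET status = 'running', worker = ? "
                       "WHERE id = ?", (worker_id, row[0]))
            return row

    def submit_result(task_id, result):
        with db:
            db.execute("UPDATE tasks SET status = 'done', result = ? "
                       "WHERE id = ?", (result, task_id))

    task = claim_task("browser-42")
    submit_result(task[0], "peak_flow=812.5")      # hypothetical model output
    print(db.execute("SELECT id, status, worker, result FROM tasks").fetchall())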
NASA Astrophysics Data System (ADS)
Saunders, Vance M.
1999-06-01
The downsizing of the Department of Defense (DoD) and the associated reduction in budgets have re-emphasized the need for commonality, reuse, and standards with respect to the way DoD does business. DoD has implemented significant changes in how it buys weapon systems. The new emphasis is on concurrent engineering with Integrated Product and Process Development and collaboration with Integrated Product Teams. The new DoD vision includes Simulation Based Acquisition (SBA), a process supported by robust, collaborative use of simulation technology that is integrated across acquisition phases and programs. This paper discusses the Air Force Research Laboratory's efforts to use Modeling and Simulation (M&S) resources within a Collaborative Enterprise Environment to support SBA and other Collaborative Enterprise and Virtual Prototyping (CEVP) applications. The paper will discuss four technology areas: (1) a Processing Ontology that defines a hierarchically nested set of collaboration contexts needed to organize and support multi-disciplinary collaboration using M&S, (2) a partial taxonomy of intelligent agents needed to manage different M&S resource contributions to advancing the state of product development, (3) an agent-based process for interfacing disparate M&S resources into a CEVP framework, and (4) a Model-View-Controller-based approach to defining 'a new way of doing business' for users of CEVP frameworks/systems.
plasmaFoam: An OpenFOAM framework for computational plasma physics and chemistry
NASA Astrophysics Data System (ADS)
Venkattraman, Ayyaswamy; Verma, Abhishek Kumar
2016-09-01
As emphasized in the 2012 Roadmap for low temperature plasmas (LTP), scientific computing has emerged as an essential tool for the investigation and prediction of the fundamental physical and chemical processes associated with these systems. While several in-house and commercial codes exist, each with its own advantages and disadvantages, a common framework that can be developed by researchers from all over the world will likely accelerate the impact of computational studies on advances in low-temperature plasma physics and chemistry. In this regard, we present a finite volume computational toolbox to perform high-fidelity simulations of LTP systems. This framework, primarily based on the OpenFOAM solver suite, allows us to enhance our understanding of multiscale plasma phenomena by performing massively parallel, three-dimensional simulations on unstructured meshes using well-established high performance computing tools that are widely used in the computational fluid dynamics community. In this talk, we will present preliminary results obtained using the OpenFOAM-based solver suite with benchmark three-dimensional simulations of microplasma devices including both dielectric and plasma regions. We will also discuss the future outlook for the solver suite.
Haron, Zaiton; Bakar, Suhaimi Abu; Dimon, Mohamad Ngasri
2015-01-01
Strategic noise mapping provides important information for noise impact assessment and noise abatement. However, producing reliable strategic noise maps in a dynamic, complex working environment is difficult. This study proposes the implementation of the random walk approach as a new stochastic technique to simulate noise mapping and to predict the noise exposure level in a workplace. A stochastic simulation framework and software, namely RW-eNMS, were developed to facilitate the random walk approach in noise mapping prediction. This framework considers the randomness and complexity of machinery operation and noise emission levels. It also assesses the impact of noise on the workers and the surrounding environment. For validation, three case studies were conducted to check the accuracy of the predictions and to determine the efficiency and effectiveness of this approach. The results showed high prediction accuracy, with most absolute differences less than 2 dBA; the predicted noise doses were also mostly within the range of the measurements. Therefore, the random walk approach was effective in dealing with environmental noise. It can predict strategic noise maps to facilitate noise monitoring and noise control in workplaces. PMID:25875019
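A minimal sketch of the random-walk idea: a simulated worker wanders a gridded shop floor and accumulates sound exposure from point sources under simple geometric attenuation. The source layout, levels, step size, and attenuation law are illustrative assumptions, not the RW-eNMS model:

    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical point sources: (position, sound level in dB at 1 m).
    sources = [((2.0, 3.0), 95.0), ((8.0, 7.0), 88.0)]

    def level_at(p):
        # Energy-sum of all sources with 20*log10(r) geometric attenuation;
        # distances are clamped to 0.5 m to avoid the singularity at r -> 0.
        energies = [10 ** ((lw - 20 * np.log10(max(np.hypot(p[0] - s[0],
                    p[1] - s[1]), 0.5))) / 10) for (s, lw) in sources]
        return 10 * np.log10(sum(energies))

    pos = np.array([5.0, 5.0])
    levels = []
    for _ in range(2000):                        # one simulated shift
        pos = np.clip(pos + rng.choice([-0.5, 0.5], size=2), 0.0, 10.0)
        levels.append(level_at(pos))
    leq = 10 * np.log10(np.mean(10 ** (np.array(levels) / 10)))
    print(f"equivalent exposure Leq ~ {leq:.1f} dBA")

Averaging many such walks, or binning the sampled levels onto the grid, yields the predicted noise map and dose distribution.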
Decision Manifold Approximation for Physics-Based Simulations
NASA Technical Reports Server (NTRS)
Wong, Jay Ming; Samareh, Jamshid A.
2016-01-01
With the recent surge of success in big-data-driven deep learning, many frameworks focus on architecture design and the use of massive databases. However, in some scenarios massive sets of data may be difficult, and in some cases infeasible, to acquire. In this paper we discuss a trajectory-based framework that quickly learns the underlying decision manifold of binary simulation classifications while judiciously selecting exploratory target states to minimize the number of required simulations. We draw particular attention to the simulation prediction application, idealized to the case where failures in simulations can be predicted and avoided, providing machine intelligence to novice analysts. We demonstrate this framework on various forms of simulations and discuss its efficacy.
NASA Astrophysics Data System (ADS)
Pickl, Kristina; Pande, Jayant; Köstler, Harald; Rüde, Ulrich; Smith, Ana-Sunčana
2017-03-01
Propulsion at low Reynolds numbers is often studied by defining artificial microswimmers which exhibit a particular stroke. The disadvantage of such an approach is that the stroke does not adjust to the environment, in particular the fluid flow, which can diminish the effect of hydrodynamic interactions. To overcome this limitation, we simulate a microswimmer consisting of three beads connected by springs and dampers, using the self-developed waLBerla and pe framework based on the lattice Boltzmann method and the discrete element method. In our approach, the swimming stroke of a swimmer emerges as a balance of the drag, driving, and elastic internal forces. We validate the simulations by comparing the obtained swimming velocity to the velocity found analytically using a perturbative method in which the bead oscillations are taken to be small. Including higher-order terms in the hydrodynamic interactions between the beads improves the agreement with the simulations in parts of the parameter space. Encouraged by the agreement between theory and simulation, and aided by the massively parallel capabilities of the waLBerla-pe framework, we simulate more than ten thousand such swimmers together, thus presenting the first fully resolved simulations of large swarms with active responsive components.
A Hybrid Multiscale Framework for Subsurface Flow and Transport Simulations
Scheibe, Timothy D.; Yang, Xiaofan; Chen, Xingyuan; ...
2015-06-01
Extensive research efforts have been invested in reducing model errors to improve the predictive ability of biogeochemical earth and environmental system simulators, with applications ranging from contaminant transport and remediation to impacts of biogeochemical elemental cycling (e.g., carbon and nitrogen) on local ecosystems and regional to global climate. While the bulk of this research has focused on improving model parameterizations in the face of observational limitations, the more challenging type of model error/uncertainty to identify and quantify is model structural error, which arises from incorrect mathematical representations of (or failure to consider) important physical, chemical, or biological processes, properties, or system states in model formulations. While improved process understanding can be achieved through scientific study, such understanding is usually developed at small scales. Process-based numerical models are typically designed for a particular characteristic length and time scale. For application-relevant scales, it is generally necessary to introduce approximations and empirical parameterizations to describe complex systems or processes. This single-scale approach has been the best available to date because of limited understanding of process coupling combined with practical limitations on system characterization and computation. While computational power is increasing significantly and our understanding of biological and environmental processes at fundamental scales is accelerating, using this information to advance our knowledge of the larger system behavior requires the development of multiscale simulators. Accordingly there has been much recent interest in novel multiscale methods in which microscale and macroscale models are explicitly coupled in a single hybrid multiscale simulation. A limited number of hybrid multiscale simulations have been developed for biogeochemical earth systems, but they mostly utilize application-specific and sometimes ad hoc approaches for model coupling. We are developing a generalized approach to hierarchical model coupling designed for high-performance computational systems, based on the Swift computing workflow framework. In this presentation we will describe the generalized approach and provide two use cases: 1) simulation of a mixing-controlled biogeochemical reaction coupling pore- and continuum-scale models, and 2) simulation of biogeochemical impacts of groundwater-river water interactions coupling fine- and coarse-grid model representations. This generalized framework can be customized for use with any pair of linked models (microscale and macroscale) with minimal intrusiveness to the at-scale simulators. It combines a set of python scripts with the Swift workflow environment to execute a complex multiscale simulation utilizing an approach similar to the well-known Heterogeneous Multiscale Method. User customization is facilitated through user-provided input and output file templates and processing function scripts, and execution within a high-performance computing environment is handled by Swift, such that minimal to no user modification of at-scale codes is required.
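A stripped-down sketch of this hybrid coupling pattern, in the spirit of the Heterogeneous Multiscale Method: a coarse transport solver queries a 'microscale model' for its local reaction closure at every step. Both models here are toys standing in for the actual pore- and continuum-scale simulators, and the workflow orchestration (Swift, file templates, scripts) is omitted:

    import numpy as np

    def micro_reaction_rate(c):
        # Stand-in for a pore-scale simulation returning an effective local
        # rate, e.g. mixing-limited kinetics.
        return 0.8 * c / (0.3 + c)

    nx, dx, dt = 50, 1.0, 0.2
    c = np.zeros(nx)
    c[0] = 1.0                                     # inlet concentration
    v, D = 0.5, 0.1                                # advection and dispersion
    for step in range(200):
        adv = -v * np.diff(c, prepend=c[0]) / dx   # upwind advection
        dif = D * np.diff(c, n=2, prepend=c[0], append=c[-1]) / dx ** 2
        # Macro solver asks the micro model for the closure cell by cell.
        rxn = np.array([micro_reaction_rate(ci) for ci in c])
        c = np.clip(c + dt * (adv + dif - rxn), 0.0, None)
        c[0] = 1.0                                 # fixed inlet boundary
    print("outlet concentration:", round(float(c[-1]), 4))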
MCdevelop - a universal framework for Stochastic Simulations
NASA Astrophysics Data System (ADS)
Slawinska, M.; Jadach, S.
2011-03-01
We present MCdevelop, a universal computer framework for developing and exploiting the wide class of Stochastic Simulations (SS) software. This powerful universal SS software development tool has been derived from a series of scientific projects for precision calculations in high energy physics (HEP), which feature a wide range of functionality in the SS software needed for advanced precision Quantum Field Theory calculations for the past LEP experiments and for the ongoing LHC experiments at CERN, Geneva. MCdevelop is a "spin-off" product of HEP to be exploited in other areas, while it will still serve to develop new SS software for HEP experiments. Typically SS involve independent generation of large sets of random "events", often requiring considerable CPU power. Since SS jobs usually do not share memory, they are easy to parallelize. The efficient development, testing and running in parallel of SS software requires a convenient framework to develop software source code, deploy and monitor batch jobs, and merge and analyse results from multiple parallel jobs, even before the production runs are terminated. Throughout the years of development of stochastic simulations for HEP, a sophisticated framework featuring all the above mentioned functionality has been implemented. MCdevelop represents its latest version, written mostly in C++ (GNU compiler gcc). It uses Autotools to build binaries (optionally managed within the KDevelop 3.5.3 Integrated Development Environment (IDE)). It uses the open-source ROOT package for histogramming, graphics and the persistency mechanism for C++ objects. MCdevelop helps to run multiple parallel jobs on any computer cluster with an NQS-type batch system. Program summary: Program title: MCdevelop. Catalogue identifier: AEHW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 48 136. No. of bytes in distributed program, including test data, etc.: 355 698. Distribution format: tar.gz. Programming language: ANSI C++. Computer: Any computer system or cluster with a C++ compiler and UNIX-like operating system. Operating system: Most UNIX systems, Linux. The application programs were thoroughly tested under Ubuntu 7.04, 8.04 and CERN Scientific Linux 5. Has the code been vectorised or parallelised?: Tools (scripts) for optional parallelisation on a PC farm are included. RAM: 500 bytes. Classification: 11.3. External routines: ROOT package version 5.0 or higher (http://root.cern.ch/drupal/). Nature of problem: Developing any type of stochastic simulation program for high energy physics and other areas. Solution method: Object Oriented programming in C++ with added persistency mechanism, batch scripts for running on PC farms, and Autotools.
High Spatial Resolution Multi-Organ Finite Element Modeling of Ventricular-Arterial Coupling
Shavik, Sheikh Mohammad; Jiang, Zhenxiang; Baek, Seungik; Lee, Lik Chuan
2018-01-01
While it has long been recognized that bi-directional interaction between the heart and the vasculature plays a critical role in the proper functioning of the cardiovascular system, a comprehensive study of this interaction has largely been hampered by a lack of a modeling framework capable of simultaneously accommodating high-resolution models of the heart and vasculature. Here, we address this issue and present a computational modeling framework that couples finite element (FE) models of the left ventricle (LV) and aorta to elucidate ventricular-arterial coupling in the systemic circulation. We show in a baseline simulation that the framework predictions of (1) the LV pressure-volume loop, (2) the aorta pressure-diameter relationship, and (3) the pressure waveforms of the aorta, LV, and left atrium (LA) over the cardiac cycle are consistent with physiological measurements in healthy humans. To develop insights into ventricular-arterial interactions, the framework was then used to simulate how alterations in the geometrical or material parameter(s) of the aorta affect the LV and vice versa. We show that changing the geometry and microstructure of the aorta model in the framework led to changes in the functional behaviors of both the LV and aorta that are consistent with experimental observations. On the other hand, changing the contractility and passive stiffness of the LV model in the framework also produced changes in both the LV and aorta functional behaviors that are consistent with physiological principles. PMID:29551977
Common modeling system for digital simulation
NASA Technical Reports Server (NTRS)
Painter, Rick
1994-01-01
The Joint Modeling and Simulation System (J-MASS) is a tri-service investigation into a common modeling framework for the development of digital models. The basis for the success of this framework is an X-window-based, open-systems architecture and an object-based/oriented, standard-interface approach to digital model construction, configuration, execution, and post-processing. For years, Department of Defense (DOD) agencies have produced various weapon systems/technologies and, typically, digital representations of those systems/technologies. These digital representations (models) have also been developed for other reasons, such as studies and analysis, Cost and Operational Effectiveness Analysis (COEA) tradeoffs, etc. Unfortunately, there have been no Modeling and Simulation (M&S) standards, guidelines, or efforts towards commonality in DOD M&S. The typical scenario is that an organization hires a contractor to build hardware, and in doing so a digital model may be constructed. Until recently, this model was not even obtained by the organization. Even if it was procured, it was on a unique platform, in a unique language, with unique interfaces, with the result being unique maintenance requirements. Additionally, the constructors of the model expended more effort in writing the 'infrastructure' of the model/simulation (e.g. user interface, database/database management system, data journalizing/archiving, graphical presentations, environment characteristics, other components in the simulation, etc.) than in producing the model of the desired system. Other side effects include duplication of effort, varying assumptions, lack of credibility/validation, and decentralization in policy and execution. J-MASS provides the infrastructure, standards, toolset, and architecture to permit M&S developers and analysts to concentrate on their area of interest.
DOT National Transportation Integrated Search
2011-08-01
The Backing crash Countermeasures project, part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program, developed a basic methodological framework and computerbased simulation model to estimate the effectiv...
NASA Astrophysics Data System (ADS)
El-Wardany, Tahany; Lynch, Mathew; Gu, Wenjiong; Hsu, Arthur; Klecka, Michael; Nardi, Aaron; Viens, Daniel
This paper proposes an optimization framework enabling the integration of multi-scale/multi-physics simulation codes to perform structural design optimization for additively manufactured components. Cold spray was selected as the additive manufacturing (AM) process, and its constraints were identified and included in the optimization scheme. The developed framework first utilizes topology optimization to maximize stiffness for the conceptual design. The subsequent step applies shape optimization to refine the design for stress-life fatigue. The component weight was reduced by 20% while stresses were reduced by 75% and the rigidity was improved by 37%. The framework and analysis codes were implemented using Altair software as well as an in-house loading code. The optimized design was subsequently produced by the cold spray process.
Semantics-enabled service discovery framework in the SIMDAT pharma grid.
Qu, Cangtao; Zimmermann, Falk; Kumpf, Kai; Kamuzinzi, Richard; Ledent, Valérie; Herzog, Robert
2008-03-01
We present the design and implementation of a semantics-enabled service discovery framework in the SIMDAT (data Grids for process and product development using numerical simulation and knowledge discovery) Pharma Grid, an industry-oriented Grid environment for integrating thousands of Grid-enabled biological data services and analysis services. The framework consists of three major components: a Web ontology language (OWL)-description logic (DL)-based biological domain ontology, OWL Web service ontology (OWL-S)-based service annotation, and a semantic matchmaker based on ontology reasoning. Built upon the framework, workflow technologies are extensively exploited in SIMDAT to assist biologists in (semi)automatically performing in silico experiments. We present a typical usage scenario through the case study of a biological workflow: IXodus.
Integrated health monitoring and controls for rocket engines
NASA Technical Reports Server (NTRS)
Merrill, W. C.; Musgrave, J. L.; Guo, T. H.
1992-01-01
Current research in intelligent control systems at the Lewis Research Center is described in the context of a functional framework. The framework is applicable to a variety of reusable space propulsion systems for existing and future launch vehicles. It provides a 'road map' for technology development to enable enhanced engine performance with increased reliability, durability, and maintainability. The framework hierarchy consists of a mission coordination level, a propulsion system coordination level, and an engine control level. Each level is described in the context of the Space Shuttle Main Engine. The concept of integrating diagnostics with control is discussed within the context of the functional framework. A distributed real-time simulation testbed is used to realize and evaluate the functionalities in closed loop.
Zhang, X; Duan, J; Kesisoglou, F; Novakovic, J; Amidon, G L; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R
2017-08-01
On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled "Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation." The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating systems, active pharmaceutical ingredient (API), and drug product information into a systemic mathematical whole-body framework. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
Environments for online maritime simulators with cloud computing capabilities
NASA Astrophysics Data System (ADS)
Raicu, Gabriel; Raicu, Alexandra
2016-12-01
This paper presents cloud computing environments, network principles, and methods for graphical development in realistic naval simulation, naval robotics, and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near-real-time frameworks with enhanced computing capabilities for distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We must deal with a multiprocessing situation using advanced technologies and distributed applications based on remote ship scenarios and the automation of ship operations.
A Framework for Simulating Turbine-Based Combined-Cycle Inlet Mode-Transition
NASA Technical Reports Server (NTRS)
Le, Dzu K.; Vrnak, Daniel R.; Slater, John W.; Hessel, Emil O.
2012-01-01
A simulation framework based on the Memory-Mapped-Files (MMF) technique was created to operate multiple numerical processes in locked time-steps and send I/O data synchronously to one another to simulate system dynamics. This simulation scheme is currently used to study the complex interactions between inlet flow dynamics, variable-geometry actuation mechanisms, and flow controls in the transition from supersonic to hypersonic conditions and vice versa. A study of mode-transition control for a high-speed inlet wind-tunnel model with this MMF-based framework is presented to illustrate the scheme and demonstrate its usefulness in simulating supersonic and hypersonic inlet dynamics and controls, as well as other types of complex systems.
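The locked time-step exchange through a memory-mapped file can be illustrated with a small sketch. This is a minimal toy, not the NASA framework: the file layout, slot sizes, process names, and the coupled update rule are all invented, and the real framework exchanges full I/O vectors between independently built simulation executables.

```python
import mmap
import os
import struct
from multiprocessing import Barrier, Process

PATH = "shared_io.bin"   # hypothetical shared exchange file
N_STEPS = 5

def solver(name, off_write, off_read, barrier):
    # Each solver maps the same file; writes land in its own slot and reads
    # come from the other solver's slot, once per locked time-step.
    with open(PATH, "r+b") as f:
        mm = mmap.mmap(f.fileno(), 16)
        state = 0.0
        for _ in range(N_STEPS):
            mm[off_write:off_write + 8] = struct.pack("d", state)
            barrier.wait()                       # all solvers finish writing
            other = struct.unpack("d", mm[off_read:off_read + 8])[0]
            state = 0.5 * (state + other) + 1.0  # toy coupled update rule
            barrier.wait()                       # all solvers finish reading
        mm.close()
        print(f"{name}: final state {state:.3f}")

if __name__ == "__main__":
    with open(PATH, "wb") as f:
        f.write(b"\x00" * 16)                    # room for two 8-byte doubles
    barrier = Barrier(2)
    procs = [Process(target=solver, args=("flow", 0, 8, barrier)),
             Process(target=solver, args=("controls", 8, 0, barrier))]
    for p in procs: p.start()
    for p in procs: p.join()
    os.remove(PATH)
```

The two barrier waits per step are what "locked time-steps" amounts to here: no process advances until every process has both published its outputs and consumed the others' outputs for the current step.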
Cyber Security Research Frameworks For Coevolutionary Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rush, George D.; Tauritz, Daniel Remy
Several architectures have been created for developing and testing systems used in network security, but most are meant to provide a platform for running cyber security experiments as opposed to automating experiment processes. In the first paper, we propose a framework termed Distributed Cyber Security Automation Framework for Experiments (DCAFE) that enables experiment automation and control in a distributed environment. Predictive analysis of adversaries is another thorny issue in cyber security. Game theory can be used to mathematically analyze adversary models, but its scalability limitations restrict its use. Computational game theory allows us to scale classical game theory to larger, more complex systems. In the second paper, we propose a framework termed Coevolutionary Agent-based Network Defense Lightweight Event System (CANDLES) that can coevolve attacker and defender agent strategies and capabilities and evaluate potential solutions with a custom network defense simulation. The third paper is a continuation of the CANDLES project in which we rewrote key parts of the framework. Attackers and defenders have been redesigned to evolve pure strategy, and a new network security simulation is devised which specifies network architecture and adds a temporal aspect. We also add a hill climber algorithm to evaluate the search space and justify the use of a coevolutionary algorithm.
Towards a Theoretical Framework for Educational Simulations.
ERIC Educational Resources Information Center
Winer, Laura R.; Vazquez-Abad, Jesus
1981-01-01
Discusses the need for a sustained and systematic effort toward establishing a theoretical framework for educational simulations, proposes the adaptation of models borrowed from the natural and applied sciences, and describes three simulations based on such a model adapted using Brunerian learning theory. Sixteen references are listed. (LLS)
Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale
Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv
2015-01-01
X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner. PMID:26169570
Competency-Based Training and Simulation: Making a "Valid" Argument.
Noureldin, Yasser A; Lee, Jason Y; McDougall, Elspeth M; Sweet, Robert M
2018-02-01
The use of simulation as an assessment tool is much more controversial than is its utility as an educational tool. However, without valid simulation-based assessment tools, the ability to objectively assess technical skill competencies in a competency-based medical education framework will remain challenging. The current literature in urologic simulation-based training and assessment uses a definition and framework of validity that is now outdated. This is probably due to an absence of awareness rather than an absence of comprehension. The following review article provides the urologic community with an updated taxonomy on validity theory as it relates to simulation-based training and assessment, and translates our simulation literature to date into this framework. While the old taxonomy considered validity as distinct subcategories and focused on the simulator itself, the modern taxonomy, into which we translate the literature evidence, considers validity as a unitary construct with a focus on interpretation of simulator data/scores.
Improving Fidelity of Launch Vehicle Liftoff Acoustic Simulations
NASA Technical Reports Server (NTRS)
Liever, Peter; West, Jeff
2016-01-01
Launch vehicles experience high acoustic loads during ignition and liftoff, driven by the interaction of rocket-plume-generated acoustic waves with launch pad structures. Application of highly parallelized Computational Fluid Dynamics (CFD) analysis tools optimized for the NAS computer systems, such as the Loci/CHEM program, now enables simulation of time-accurate, turbulent, multi-species plume formation and interaction with launch pad geometry, capturing the generation of acoustic noise at the source regions in the plume shear layers and impingement regions. These CFD solvers are robust in capturing the acoustic fluctuations, but they are too dissipative to accurately resolve the propagation of the acoustic waves throughout the launch environment domain along the vehicle. A hybrid Computational Fluid Dynamics and Computational Aero-Acoustics (CFD/CAA) modeling framework has been developed to improve such liftoff acoustic environment predictions. The framework combines the existing highly scalable NASA production CFD code, Loci/CHEM, with a high-order accurate discontinuous Galerkin (DG) solver, Loci/THRUST, developed in the same computational framework. Loci/THRUST employs a low-dissipation, high-order, unstructured DG method to accurately propagate acoustic waves away from the source regions across large distances. The DG solver is currently capable of solving up to 4th-order solutions for non-linear, conservative acoustic field propagation. Higher-order boundary conditions are implemented to accurately model the reflection and refraction of acoustic waves on launch pad components. The DG solver accepts generalized unstructured meshes, enabling efficient application of common mesh generation tools for CHEM and THRUST simulations. The DG solution is coupled with the CFD solution at interface boundaries placed near the CFD acoustic source regions. Both simulations are executed simultaneously with coordinated boundary condition data exchange.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Xiaofeng; Schimel, Joshua; Thornton, Peter E
2014-01-01
Microbial assimilation of soil organic carbon is one of the fundamental processes of global carbon cycling, and it determines the magnitude of microbial biomass in soils. Mechanistic understanding of microbial assimilation of soil organic carbon and its controls is important for improving the ability of Earth system models to simulate carbon-climate feedbacks. Although microbial assimilation of soil organic carbon is broadly considered to be an important parameter, it really comprises two separate physiological processes: a one-time assimilation efficiency and a time-dependent microbial maintenance energy. Representing these two mechanisms is crucial to more accurately simulate carbon cycling in soils. In this study, a simple modeling framework was developed to evaluate the substrate and environmental controls on microbial assimilation of soil organic carbon using a new term: the microbial annual active period (the length of time microbes remain active in one year). Substrate quality has a positive effect on microbial assimilation of soil organic carbon: higher substrate quality (lower C:N ratio) leads to a higher ratio of microbial carbon to soil organic carbon, and vice versa. Increases in the microbial annual active period from zero stimulate microbial assimilation of soil organic carbon; however, when the microbial annual active period is longer than an optimal threshold, increasing this period decreases microbial biomass. The simulated ratios of soil microbial biomass to soil organic carbon are reasonably consistent with a recently compiled global dataset at the biome level. The modeling framework for microbial assimilation of soil organic carbon and its controls developed in this study offers an applicable way to incorporate microbial contributions to carbon cycling into Earth system models for simulating carbon-climate feedbacks and to explain global patterns of microbial biomass.
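The two processes the abstract separates, a one-time assimilation efficiency and a time-dependent maintenance cost, can be sketched with a toy discrete model. All numbers below are invented (this is not the paper's model); the point is only to reproduce the qualitative, non-monotonic effect of the annual active period on final microbial biomass.

```python
# Toy model: while active, microbes assimilate soil organic carbon (SOC)
# with a one-time efficiency `eps`, and pay a maintenance cost per day of
# activity. Beyond an optimal active period, maintenance dominates because
# the remaining SOC is depleted, so final biomass declines.

def final_biomass(active_days, eps=0.4, uptake=0.02, maintenance=0.008):
    B, SOC = 1.0, 100.0                 # initial microbial biomass and SOC
    for _ in range(active_days):
        growth = eps * uptake * SOC     # assimilated C after efficiency losses
        B += growth - maintenance * B   # growth minus maintenance energy
        SOC -= growth
    return B

for days in (0, 60, 120, 180, 240, 365):
    print(f"active {days:3d} d -> final biomass {final_biomass(days):6.2f}")
```

With these invented rates, final biomass peaks at an active period of roughly 120 days and declines for longer periods, mirroring the optimal-threshold behavior described in the abstract.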
Ong, Carmichael F.; Hicks, Jennifer L.; Delp, Scott L.
2017-01-01
Goal Technologies that augment human performance are the focus of intensive research and development, driven by advances in wearable robotic systems. Success has been limited by the challenge of understanding human–robot interaction. To address this challenge, we developed an optimization framework to synthesize a realistic human standing long jump and used the framework to explore how simulated wearable robotic devices might enhance jump performance. Methods A planar, five-segment, seven-degree-of-freedom model with physiological torque actuators, which have variable torque capacity depending on joint position and velocity, was used to represent human musculoskeletal dynamics. An active augmentation device was modeled as a torque actuator that could apply a single pulse of up to 100 Nm of extension torque. A passive design was modeled as rotational springs about each lower limb joint. Dynamic optimization searched for physiological and device actuation patterns to maximize jump distance. Results Optimization of the nominal case yielded a 2.27 m jump that captured salient kinematic and kinetic features of human jumps. When the active device was added to the ankle, knee, or hip, jump distance increased to between 2.49 and 2.52 m. Active augmentation of all three joints increased the jump distance to 3.10 m. The passive design increased jump distance to 3.32 m by adding torques of 135 Nm, 365 Nm, and 297 Nm to the ankle, knee, and hip, respectively. Conclusion Dynamic optimization can be used to simulate a standing long jump and investigate human-robot interaction. Significance Simulation can aid in the design of performance-enhancing technologies. PMID:26258930
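In schematic form, the dynamic optimization described above solves a problem of the following shape (notation mine, not the authors'):

```latex
\max_{\boldsymbol{\tau}(t)} \; d_{\mathrm{jump}}
\quad \text{s.t.} \quad
\dot{\mathbf{x}} = \mathbf{f}\big(\mathbf{x}(t), \boldsymbol{\tau}(t)\big),
\qquad
|\tau_j(t)| \le \tau_j^{\max}\big(q_j(t), \dot{q}_j(t)\big), \quad j = 1, \dots, 7,
```

where x is the state of the planar five-segment model, the dynamics f encode the seven-degree-of-freedom musculoskeletal model, and the torque capacity of each joint j depends on its position and velocity; the simulated device adds its pulse or spring torque to the corresponding joint(s).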
Framework for Architecture Trade Study Using MBSE and Performance Simulation
NASA Technical Reports Server (NTRS)
Ryan, Jessica; Sarkani, Shahram; Mazzuchi, Thomas
2012-01-01
Increasing complexity in modern systems, as well as cost and schedule constraints, require a new paradigm of systems engineering to fulfill stakeholder needs. Challenges facing efficient trade studies include poor tool interoperability, lack of simulation coordination (design parameters), and requirements flowdown. A recent trend toward Model Based System Engineering (MBSE) includes flexible architecture definition, program documentation, requirements traceability, and systems engineering reuse. As a new domain, MBSE still lacks governing standards and commonly accepted frameworks. This paper proposes a framework for efficient architecture definition using MBSE in conjunction with domain-specific simulation to evaluate trade studies. A general framework is provided, followed by a specific example including a method for designing a trade study, defining candidate architectures, planning simulations to fulfill requirements, and finally a weighted decision analysis to optimize system objectives.
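The weighted decision analysis step can be illustrated with a small sketch. The criteria, weights, and per-architecture scores below are invented; in the approach the abstract describes, the scores would come from the domain-specific simulations and the MBSE architecture models.

```python
# Toy weighted-sum trade study: combine per-criterion scores (assumed to be
# normalized to [0, 1], higher is better) with stakeholder weights to rank
# candidate architectures.

CRITERIA = {"performance": 0.5, "cost": 0.3, "risk": 0.2}   # weights sum to 1

CANDIDATES = {
    "arch-A": {"performance": 0.9, "cost": 0.4, "risk": 0.6},
    "arch-B": {"performance": 0.7, "cost": 0.8, "risk": 0.7},
    "arch-C": {"performance": 0.6, "cost": 0.9, "risk": 0.9},
}

def weighted_score(scores):
    return sum(CRITERIA[c] * scores[c] for c in CRITERIA)

ranking = sorted(CANDIDATES, key=lambda a: weighted_score(CANDIDATES[a]),
                 reverse=True)
for arch in ranking:
    print(f"{arch}: {weighted_score(CANDIDATES[arch]):.2f}")
```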
Wilmoth, Jared L.; Doak, Peter W.; Timm, Andrea; Halsted, Michelle; Anderson, John D.; Ginovart, Marta; Prats, Clara; Portell, Xavier; Retterer, Scott T.; Fuentes-Cabrera, Miguel
2018-01-01
The factors leading to changes in the organization of microbial assemblages at fine spatial scales are not well characterized or understood. However, they are expected to guide the succession of community development and function toward specific outcomes that could impact human health and the environment. In this study, we put forward a combined experimental and agent-based modeling framework and use it to interpret unique spatial organization patterns of H1-Type VI secretion system (T6SS) mutants of P. aeruginosa under spatial confinement. We find that key parameters, such as T6SS-mediated cell contact and lysis, spatial localization, relative species abundance, cell density and local concentrations of growth substrates and metabolites are influenced by spatial confinement. The model, written in the accessible programming language NetLogo, can be adapted to a variety of biological systems of interest and used to simulate experiments across a broad parameter space. It was implemented and run in a high-throughput mode by deploying it across multiple CPUs, with each simulation representing an individual well within a high-throughput microwell array experimental platform. The microfluidics and agent-based modeling framework we present in this paper provides an effective means by which to connect experimental studies in microbiology to model development. The work demonstrates progress in coupling experimental results to simulation while also highlighting potential sources of discrepancies between real-world experiments and idealized models. PMID:29467721
Anatomically realistic multiscale models of normal and abnormal gastrointestinal electrical activity
Cheng, Leo K; Komuro, Rie; Austin, Travis M; Buist, Martin L; Pullan, Andrew J
2007-01-01
One of the major aims of the International Union of Physiological Sciences (IUPS) Physiome Project is to develop multiscale mathematical and computer models that can be used to help understand human health. We present here a small facet of this broad plan that applies to the gastrointestinal system. Specifically, we present an anatomically and physiologically based modelling framework that is capable of simulating normal and pathological electrical activity within the stomach and small intestine. The continuum models used within this framework have been created using anatomical information derived from common medical imaging modalities and data from the Visible Human Project. These models explicitly incorporate the various smooth muscle layers and networks of interstitial cells of Cajal (ICC) that are known to exist within the walls of the stomach and small bowel. Electrical activity within individual ICCs and smooth muscle cells is simulated using a previously published simplified representation of the cell level electrical activity. This simulated cell level activity is incorporated into a bidomain representation of the tissue, allowing electrical activity of the entire stomach or intestine to be simulated in the anatomically derived models. This electrical modelling framework successfully replicates many of the qualitative features of the slow wave activity within the stomach and intestine and has also been used to investigate activity associated with functional uncoupling of the stomach. PMID:17457969
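The bidomain representation mentioned above is commonly written as the following pair of coupled equations (standard textbook form; the paper's exact formulation may differ):

```latex
\nabla \cdot \big( \sigma_i \nabla V_m \big)
  + \nabla \cdot \big( \sigma_i \nabla \phi_e \big)
  = \beta \left( C_m \frac{\partial V_m}{\partial t} + I_{\mathrm{ion}} \right),
\qquad
\nabla \cdot \big( \sigma_i \nabla V_m \big)
  + \nabla \cdot \big( (\sigma_i + \sigma_e) \nabla \phi_e \big) = 0,
```

where sigma_i and sigma_e are the intracellular and extracellular conductivity tensors, V_m the transmembrane potential, phi_e the extracellular potential, beta the membrane surface-to-volume ratio, C_m the membrane capacitance, and I_ion the ionic current supplied by the cell-level model (here, the simplified ICC and smooth muscle cell representations).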
An overview of the ENEA activities in the field of coupled codes NPP simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parisi, C.; Negrenti, E.; Sepielli, M.
2012-07-01
In the framework of nuclear research activities in the fields of safety, training, and education, ENEA (the Italian National Agency for New Technologies, Energy and Sustainable Economic Development) is in charge of defining and pursuing all the necessary steps for the development of an NPP engineering simulator at the 'Casaccia' Research Center near Rome. A summary of the activities in the field of nuclear power plant simulation by coupled codes is presented here, together with the long-term strategy for the engineering simulator development. Specifically, results from participation in international benchmarking activities, such as the OECD/NEA 'Kalinin-3' benchmark and the 'AER-DYN-002' benchmark, together with simulations of relevant events such as the Fukushima accident, are reported. The ultimate goal of these activities, performed using state-of-the-art technology, is the re-establishment of top-level competencies in the NPP simulation field in order to facilitate the development of Enhanced Engineering Simulators and to upgrade competencies for supporting national energy strategy decisions, the national nuclear safety authority, and R&D activities on NPP designs. (authors)
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2006-01-01
A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis - Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.
Documentation for the MODFLOW 6 framework
Hughes, Joseph D.; Langevin, Christian D.; Banta, Edward R.
2017-08-10
MODFLOW is a popular open-source groundwater flow model distributed by the U.S. Geological Survey. Growing interest in surface water and groundwater interactions, local refinement with nested and unstructured grids, karst groundwater flow, solute transport, and saltwater intrusion has led to the development of numerous MODFLOW versions. Oftentimes, there are incompatibilities between these different MODFLOW versions. This report describes a new MODFLOW framework called MODFLOW 6 that is designed to support multiple models and multiple types of models. The framework is written in Fortran using a modular object-oriented design. The primary framework components include the Simulation (or main program), the Timing Module, Solutions, Models, Exchanges, and Utilities. The first version of the framework focuses on numerical solutions, numerical models, and numerical exchanges. This focus on numerical models allows multiple numerical models to be tightly coupled at the matrix level.
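As a rough illustration of the component composition the report describes: the class names below mirror the report's component list, but the sketch is Python rather than the framework's object-oriented Fortran, and all behavior (the print statements and the two-model setup) is invented.

```python
# Toy mirror of the MODFLOW 6 component hierarchy: a Simulation owns a
# Timing Module and Solutions; each Solution owns Models and Exchanges and
# assembles them into one coupled matrix solve per time step.

class TimingModule:
    def __init__(self, nsteps):
        self.nsteps = nsteps

class NumericalModel:
    def __init__(self, name):
        self.name = name
    def formulate(self):
        print(f"  {self.name}: add coefficients to the solution matrix")

class NumericalExchange:
    def __init__(self, a, b):
        self.a, self.b = a, b
    def formulate(self):
        print(f"  exchange {self.a.name}<->{self.b.name}: add coupling terms")

class NumericalSolution:
    # tight coupling: all attached models share a single matrix solve
    def __init__(self, models, exchanges):
        self.models, self.exchanges = models, exchanges
    def solve(self):
        for m in self.models:
            m.formulate()
        for e in self.exchanges:
            e.formulate()
        print("  solve coupled matrix")

class Simulation:
    def __init__(self, timing, solutions):
        self.timing, self.solutions = timing, solutions
    def run(self):
        for step in range(self.timing.nsteps):
            print(f"time step {step}")
            for s in self.solutions:
                s.solve()

parent, child = NumericalModel("gwf-parent"), NumericalModel("gwf-child")
sim = Simulation(TimingModule(nsteps=2),
                 [NumericalSolution([parent, child],
                                    [NumericalExchange(parent, child)])])
sim.run()
```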
A computational fluid dynamics simulation framework for ventricular catheter design optimization.
Weisenberg, Sofy H; TerMaath, Stephanie C; Barbier, Charlotte N; Hill, Judith C; Killeffer, James A
2017-11-10
OBJECTIVE Cerebrospinal fluid (CSF) shunts are the primary treatment for patients suffering from hydrocephalus. While proven effective in symptom relief, these shunt systems are plagued by high failure rates and often require repeated revision surgeries to replace malfunctioning components. One of the leading causes of CSF shunt failure is obstruction of the ventricular catheter by aggregations of cells, proteins, blood clots, or fronds of choroid plexus that occlude the catheter's small inlet holes or even the full internal catheter lumen. Such obstructions can disrupt CSF diversion out of the ventricular system or impede it entirely. Previous studies have suggested that altering the catheter's fluid dynamics may help to reduce the likelihood of complete ventricular catheter failure caused by obstruction. However, systematic correlation between a ventricular catheter's design parameters and its performance, specifically its likelihood to become occluded, still remains unknown. Therefore, an automated, open-source computational fluid dynamics (CFD) simulation framework was developed for use in the medical community to determine optimized ventricular catheter designs and to rapidly explore parameter influence for a given flow objective. METHODS The computational framework was developed by coupling a 3D CFD solver and an iterative optimization algorithm and was implemented in a high-performance computing environment. The capabilities of the framework were demonstrated by computing an optimized ventricular catheter design that provides uniform flow rates through the catheter's inlet holes, a common design objective in the literature. The baseline computational model was validated using 3D nuclear imaging to provide flow velocities at the inlet holes and through the catheter. RESULTS The optimized catheter design achieved through use of the automated simulation framework improved significantly on previous attempts to reach a uniform inlet flow rate distribution using the standard catheter hole configuration as a baseline. While the standard ventricular catheter design featuring uniform inlet hole diameters and hole spacing has a standard deviation of 14.27% for the inlet flow rates, the optimized design has a standard deviation of 0.30%. CONCLUSIONS This customizable framework, paired with high-performance computing, provides a rapid method of design testing to solve complex flow problems. While a relatively simplified ventricular catheter model was used to demonstrate the framework, the computational approach is applicable to any baseline catheter model, and it is easily adapted to optimize catheters for the unique needs of different patients as well as for other fluid-based medical devices.
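The optimization loop the abstract describes (a flow solver driven by an iterative optimizer toward uniform per-hole inlet flow) can be sketched with a crude stand-in. Everything here is invented for illustration: the 3D CFD solver is replaced by a 1D Poiseuille-style resistance network, the optimizer by a random local search, and all coefficients are arbitrary.

```python
import random
import numpy as np

N = 8          # number of inlet holes, tip (index 0) toward the outlet end
R_SEG = 0.05   # lumen resistance between adjacent holes (arbitrary units)
P_EXT = 10.0   # external CSF pressure; internal outlet held at 0

def hole_flows(diams):
    # Hole conductance ~ d^4 (Poiseuille-like); internal pressure at hole i
    # accumulates the segment drops downstream of it, giving the linear
    # system (I + D A) q = D * P_EXT, solved directly.
    A = np.array([[R_SEG * (N - max(i, j)) for j in range(N)] for i in range(N)])
    D = np.diag(np.asarray(diams) ** 4)
    return np.linalg.solve(np.eye(N) + D @ A, D @ np.full(N, P_EXT))

def objective(diams):
    q = hole_flows(diams)
    return float(np.std(q) / np.mean(q))   # relative spread of inlet flows

diams = [1.0] * N                          # uniform baseline design
best = objective(diams)
for _ in range(2000):                      # random local search on diameters
    trial = [max(0.3, d * random.uniform(0.97, 1.03)) for d in diams]
    s = objective(trial)
    if s < best:
        diams, best = trial, s
print("relative flow spread:", round(best, 4))
print("optimized hole diameters (tip -> outlet):", [round(d, 3) for d in diams])
```

Even this toy reproduces the qualitative finding: with uniform diameters the holes nearest the outlet draw disproportionate flow, and the search enlarges the tip holes to equalize the distribution.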
Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...
NASA Astrophysics Data System (ADS)
Kalmykov, N. N.; Ostapchenko, S. S.; Werner, K.
An extensive air shower (EAS) calculation scheme based on cascade equations, together with some EAS characteristics for energies of 10^14-10^17 eV, is presented. The universal hadronic interaction model NEXUS is employed to provide the necessary data concerning hadron-air collisions. The influence of model assumptions on the longitudinal EAS development is discussed in the framework of the NEXUS and QGSJET models. Applied to EAS simulations, the perspectives of combined Monte Carlo and numerical methods are considered.
Selected Topics in Overset Technology Development and Applications At NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Chan, William M.; Kwak, Dochan (Technical Monitor)
2002-01-01
This paper presents a general overview of overset technology development and applications at NASA Ames Research Center. The topics include: 1) Overview of overset activities at NASA Ames; 2) Recent developments in Chimera Grid Tools; 3) A general framework for multiple component dynamics; 4) A general script module for automating liquid rocket sub-systems simulations; and 5) Critical future work.
Going DEEP: guidelines for building simulation-based team assessments.
Grand, James A; Pearce, Marina; Rench, Tara A; Chao, Georgia T; Fernandez, Rosemarie; Kozlowski, Steve W J
2013-05-01
Whether for team training, research or evaluation, making effective use of simulation-based technologies requires robust, reliable and accurate assessment tools. Extant literature on simulation-based assessment practices has primarily focused on scenario and instructional design; however, relatively little direct guidance has been provided regarding the challenging decisions and fundamental principles related to assessment development and implementation. The objective of this manuscript is to introduce a generalisable assessment framework supplemented by specific guidance on how to construct and ensure valid and reliable simulation-based team assessment tools. The recommendations reflect best practices in assessment and are designed to empower healthcare educators, professionals and researchers with the knowledge to design and employ valid and reliable simulation-based team assessments. Information and actionable recommendations associated with creating assessments of team processes (non-technical 'teamwork' activities) and performance (demonstration of technical proficiency) are presented which provide direct guidance on how to Distinguish the underlying competencies one aims to assess, Elaborate the measures used to capture team member behaviours during simulation activities, Establish the content validity of these measures and Proceduralise the measurement tools in a way that is systematically aligned with the goals of the simulation activity while maintaining methodological rigour (DEEP). The DEEP framework targets fundamental principles and critical activities that are important for effective assessment, and should benefit healthcare educators, professionals and researchers seeking to design or enhance any simulation-based assessment effort.
DOT National Transportation Integrated Search
2012-05-01
This report describes the modeling, calibration, and validation of a VISSIM traffic-flow simulation of the San Jos, California, downtown network and examines various evacuation scenarios and first-responder routings to assess strategies that would ...
Developing Economic Literacy: A Challenge for Business Education.
ERIC Educational Resources Information Center
Ristau, Robert A.
1985-01-01
This article describes the framework and the methodology necessary to instill sound principles of economic understanding in business education students. Basic economic concepts are listed and discussed, as well as effective educational delivery systems such as games and simulations (examples are included). (CT)
Hierarchical Decentralized Control Strategy for Demand-Side Primary Frequency Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lian, Jianming; Hansen, Jacob; Marinovici, Laurentiu D.
The Grid Friendly™ Appliance (GFA) controller, developed at Pacific Northwest National Laboratory, was designed for the purpose of autonomously switching off appliances by detecting under-frequency events. In this paper, a new frequency-responsive load (FRL) controller is first proposed by extending the functionality of the original GFA controller. The proposed FRL controller can autonomously switch on (or off) end-use loads by detecting over-frequency (or under-frequency) events through local frequency measurement. Then, a hierarchical decentralized control framework is developed for engaging the end-use loads to provide primary frequency response with the proposed FRL controller. The developed framework has several important features that are desirable in terms of providing primary frequency control. It not only exclusively maintains the autonomous operation of the end-use loads, but also effectively overcomes the stability issue associated with high penetration of FRLs. The simulation results illustrate the effectiveness of the developed hierarchical control framework for providing primary frequency response with the proposed FRL controller.
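A minimal sketch of the autonomous switching logic follows. The thresholds, deadband, and frequency trace are invented; the actual GFA/FRL controllers add further logic, and the paper's hierarchical layer coordinates many such loads to address the stability issue mentioned above.

```python
# Each frequency-responsive load (FRL) watches its local frequency and sheds
# itself below an under-frequency trigger or switches on above an
# over-frequency trigger, holding state inside the deadband to avoid chatter.

class FRLController:
    def __init__(self, f_under=59.95, f_over=60.05):
        self.f_under, self.f_over = f_under, f_over
        self.load_on = True

    def update(self, freq):
        if freq < self.f_under:
            self.load_on = False   # shed load on an under-frequency event
        elif freq > self.f_over:
            self.load_on = True    # switch load on during an over-frequency event
        return self.load_on        # within the deadband: keep current state

trace = [60.00, 59.98, 59.93, 59.90, 59.97, 60.02, 60.07, 60.01]  # Hz
ctrl = FRLController()
for f in trace:
    print(f"{f:.2f} Hz -> load {'ON' if ctrl.update(f) else 'OFF'}")
```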
Simulating motivated cognition
NASA Technical Reports Server (NTRS)
Gevarter, William B.
1991-01-01
A research effort to develop a sophisticated computer model of human behavior is described. A computer framework of motivated cognition was developed. Motivated cognition focuses on the motivations or affects that provide the context and drive in human cognition and decision making. A conceptual architecture of the human decision-making approach, from the perspective of information processing in the human brain, is developed in diagrammatic form. A preliminary version of such a diagram is presented. This architecture is then used as a vehicle for successfully constructing a computer program simulating Dweck and Leggett's findings, which relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior.
van den Beukel, Arie P; van der Voort, Mascha C
2017-03-01
The introduction of partially automated driving systems changes the driving task into supervising the automation, with an occasional need to intervene. To develop interface solutions that adequately support drivers in this new role, this study proposes and evaluates an assessment framework that allows designers to evaluate driver support within relevant real-world scenarios. The aspects identified as requiring assessment in terms of driver support within the proposed framework are Accident Avoidance, gained Situation Awareness (SA), and Concept Acceptance. The measurement techniques selected to operationalise these aspects and the associated framework were pilot-tested with twenty-four participants in a driving simulator experiment. The objective of the test was to determine the reliability of the applied measurements for the assessment of the framework and whether the proposed framework is effective in predicting the level of support offered by the concepts. Based on the congruency between measurement scores produced in the test and scores with predefined differences in concept support, this study demonstrates the framework's reliability. A remaining concern is the framework's weak sensitivity to small differences in offered support. The article concludes that applying the framework is especially advantageous for evaluating early design phases and can successfully contribute to the efficient development of safe means of operating partially automated vehicles that keep drivers in control. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Haakensen, Erik Edward
1998-01-01
The desire for low-cost reliable computing is increasing. Most current fault tolerant computing solutions are not very flexible, i.e., they cannot adapt to the reliability requirements of newly emerging applications in business, commerce, and manufacturing. It is important that users have a flexible, reliable platform to support both critical and noncritical applications. Chameleon, under development at the Center for Reliable and High-Performance Computing at the University of Illinois, is a software framework for supporting cost-effective, adaptable, networked fault tolerant service. This thesis details a simulation of fault injection, detection, and recovery in Chameleon. The simulation was written in C++ using the DEPEND simulation library. The results obtained from the simulation included the amount of overhead incurred by the fault detection and recovery mechanisms supported by Chameleon. In addition, information about fault scenarios from which Chameleon cannot recover was gained. The results of the simulation showed that both critical and noncritical applications can be executed in the Chameleon environment with a fairly small amount of overhead. No single point of failure from which Chameleon could not recover was found. Chameleon was also found to be capable of recovering from several multiple-failure scenarios.
NASA Astrophysics Data System (ADS)
Develaki, Maria
2017-11-01
Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.
Representing the work of medical protocols for organizational simulation.
Fridsma, D. B.
1998-01-01
Developing and implementing patient care protocols within a specific organizational setting requires knowledge of the protocol, the organization, and the way in which the organization does its work. Computer-based simulation tools have been used in many industries to provide managers with prospective insight into problems of work-process and organization-design mismatch. Many of these simulation tools are designed for well-understood routine work processes in which there are few contingent tasks. In this paper, we describe the theoretical foundations that make it possible to simulate medical protocols using an information-processing theory framework. These simulations will allow medical administrators to test different protocol and organizational designs before actually using them within a particular clinical setting. PMID:9929231
Gathering Validity Evidence for Surgical Simulation: A Systematic Review.
Borgersen, Nanna Jo; Naur, Therese M H; Sørensen, Stine M D; Bjerrum, Flemming; Konge, Lars; Subhi, Yousif; Thomsen, Ann Sofia S
2018-06-01
To identify current trends in the use of validity frameworks in surgical simulation, to provide an overview of the evidence behind the assessment of technical skills in all surgical specialties, and to present recommendations and guidelines for future validity studies. Validity evidence for assessment tools used in the evaluation of surgical performance is of paramount importance to ensure valid and reliable assessment of skills. We systematically reviewed the literature by searching 5 databases (PubMed, EMBASE, Web of Science, PsycINFO, and the Cochrane Library) for studies published from January 1, 2008, to July 10, 2017. We included original studies evaluating simulation-based assessments of health professionals in surgical specialties and extracted data on surgical specialty, simulator modality, participant characteristics, and the validity framework used. Data were synthesized qualitatively. We identified 498 studies with a total of 18,312 participants. Publications involving validity assessments in surgical simulation more than doubled from 2008 to 2010 (∼30 studies/year) to 2014 to 2016 (∼70 to 90 studies/year). Only 6.6% of the studies used the recommended contemporary validity framework (Messick). The majority of studies used outdated frameworks such as face validity. Significant differences were identified across surgical specialties. The evaluated assessment tools were mostly inanimate or virtual reality simulation models. An increasing number of studies have gathered validity evidence for simulation-based assessments in surgical specialties, but the use of outdated frameworks remains common. To address the current practice, this paper presents guidelines on how to use the contemporary validity framework when designing validity studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com; Khamehchi, Ehsan
Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for the integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.
Uncertainty analyses of CO2 plume expansion subsequent to wellbore CO2 leakage into aquifers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hou, Zhangshuan; Bacon, Diana H.; Engel, David W.
2014-08-01
In this study, we apply an uncertainty quantification (UQ) framework to CO2 sequestration problems. In one scenario, we look at the risk of wellbore leakage of CO2 into a shallow unconfined aquifer in an urban area; in another scenario, we study the effects of reservoir heterogeneity on CO2 migration. We combine various sampling approaches (quasi-Monte Carlo, probabilistic collocation, and adaptive sampling) in order to reduce the number of forward calculations while still fully exploring the input parameter space and quantifying the input uncertainty. The CO2 migration is simulated using the PNNL-developed simulator STOMP-CO2e (the water-salt-CO2 module). For computationally demanding simulations with 3D heterogeneity fields, we combined the framework with the scalable version, eSTOMP, as the forward modeling simulator. We built response curves and response surfaces of model outputs with respect to input parameters to look at individual and combined effects, and to identify and rank the significance of the input parameters.
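The sampling-plus-response-surface workflow can be illustrated compactly. The toy "forward model," the two parameters, and the quadratic surrogate below are invented; in the study itself the forward model is the STOMP-CO2e/eSTOMP simulator and the inputs are the reservoir and leakage parameters.

```python
import numpy as np

# Quasi-random (Halton) sampling of a 2D input space, a cheap stand-in
# forward model, and a quadratic response surface fit by least squares.

def halton(n, base):
    seq = []
    for i in range(1, n + 1):
        f, r, x = 1.0, 0.0, i
        while x > 0:
            f /= base
            r += f * (x % base)
            x //= base
        seq.append(r)
    return np.array(seq)

n = 64
perm = halton(n, 2)   # e.g. permeability, scaled to [0, 1] (assumed)
poro = halton(n, 3)   # e.g. porosity, scaled to [0, 1] (assumed)

def forward_model(k, phi):
    # stand-in for one run of the CO2 migration simulator
    return 2.0 * k + 0.5 * phi + 1.5 * k * phi + 0.1 * k**2

y = forward_model(perm, poro)

# response surface: y ~ c0 + c1*k + c2*phi + c3*k*phi + c4*k^2 + c5*phi^2
X = np.column_stack([np.ones(n), perm, poro, perm * poro, perm**2, poro**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted response-surface coefficients:", np.round(coef, 3))
```

The magnitudes of the fitted coefficients then serve the ranking purpose described in the abstract: they indicate which individual inputs and which interactions dominate the output.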
NASA Astrophysics Data System (ADS)
Freniere, Cole; Pathak, Ashish; Raessi, Mehdi
2016-11-01
Ocean Wave Energy Converters (WECs) are devices that convert energy from ocean waves into electricity. To aid in the design of WECs, an advanced computational framework has been developed which has advantages over conventional methods. The computational framework simulates the performance of WECs in a virtual wave tank by solving the full Navier-Stokes equations in 3D, capturing the fluid-structure interaction, nonlinear and viscous effects. In this work, we present simulations of the performance of pitching cylinder-type WECs and compare against experimental data. WECs are simulated at both model and full scales. The results are used to determine the role of the Keulegan-Carpenter (KC) number. The KC number is representative of viscous drag behavior on a bluff body in an oscillating flow, and is considered an important indicator of the dynamics of a WEC. Studying the effects of the KC number is important for determining the validity of the Froude scaling and the inviscid potential flow theory, which are heavily relied on in the conventional approaches to modeling WECs. Support from the National Science Foundation is gratefully acknowledged.
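For reference, the Keulegan-Carpenter number discussed above is conventionally defined as follows (standard definition, not taken from this abstract):

```latex
KC = \frac{U_m T}{D},
```

where U_m is the amplitude of the oscillatory flow velocity, T the oscillation (wave) period, and D the characteristic diameter of the body. Large KC indicates drag-dominated, viscous behavior; small KC indicates inertia-dominated behavior, which is one reason the KC regime bears on the validity of Froude scaling and inviscid potential flow assumptions.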
Interaction of light with hematite hierarchical structures: Experiments and simulations
NASA Astrophysics Data System (ADS)
Distaso, Monica; Zhuromskyy, Oleksander; Seemann, Benjamin; Pflug, Lukas; Mačković, Mirza; Encina, Ezequiel; Taylor, Robin Klupp; Müller, Rolf; Leugering, Günter; Spiecker, Erdmann; Peschel, Ulf; Peukert, Wolfgang
2017-03-01
Mesocrystalline particles have been recognized as a class of multifunctional materials with potential applications in different fields. However, the internal organization of nanocomposite mesocrystals and its influence on their final properties have not yet been investigated. In this paper, a novel strategy based on electrodynamic simulations is developed to shed light on how the internal structure of mesocrystals influences their optical properties. First, a unified design protocol is reported for the fabrication of hematite/PVP particles with different morphologies, such as pseudo-cubic, rod-like, and apple-like structures, and with controlled particle size distributions. The optical properties of hematite/PVP mesocrystals are effectively simulated by taking their aggregate and nanocomposite structure into consideration. The superposition T-matrix approach accounts for the aggregate nature of mesocrystalline particles and validates the effective medium approximation used in the framework of Mie theory and electromagnetic simulations such as the Finite Element Method. The approach described in our paper provides a framework to understand and predict the optical properties of mesocrystals and, more generally, of hierarchical nanostructured particles.
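One common effective-medium choice for such two-phase composite particles is the Maxwell Garnett mixing rule, shown here for context; the abstract does not state that this exact rule was used.

```latex
\varepsilon_{\mathrm{eff}}
= \varepsilon_m \,
\frac{\varepsilon_i + 2\varepsilon_m + 2 f (\varepsilon_i - \varepsilon_m)}
     {\varepsilon_i + 2\varepsilon_m - f (\varepsilon_i - \varepsilon_m)},
```

where epsilon_i and epsilon_m are the permittivities of the inclusions (here, hematite nanocrystals) and the matrix (here, PVP), and f is the inclusion volume fraction. The homogenized permittivity then feeds Mie or finite element calculations of the composite particle's scattering and absorption.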
Locating Anomalies in Complex Data Sets Using Visualization and Simulation
NASA Technical Reports Server (NTRS)
Panetta, Karen
2001-01-01
The research goals are to create a simulation framework that can accept any combination of models written at the gate or behavioral level. The framework provides the ability to fault-simulate and create scenarios of experiments using concurrent simulation. In order to meet these goals we have had to fulfill the following requirements: the ability to accept models written in VHDL, Verilog, or C; the ability to propagate faults through any model type; the ability to create experiment scenarios efficiently without generating every possible combination of variables; and the ability to accept a diversity of fault models beyond the single stuck-at model. Major development has been done on a parser that can accept models written in various languages. This work has generated considerable attention from other universities and industry for its flexibility and usefulness. The parser uses LEX and YACC to parse Verilog and C. We have also utilized our industrial partnership with Alternative Systems Inc. to import VHDL into our simulator. For multilevel simulation, we needed to modify the simulator architecture to accept models that contain multiple outputs. This enabled us to accept behavioral components. The next major accomplishment was the addition of "functional fault models". Functional fault models change the behavior of a gate or model. For example, a bridging fault can make an OR gate behave like an AND gate. This has applications beyond fault simulation. This modeling flexibility makes the simulator more useful for verification and model comparison. For instance, two or more versions of an ALU can be comparatively simulated in a single execution. The results will show where and how the models differed so that the performance and correctness of the models may be evaluated. A considerable amount of time has been dedicated to validating the simulator performance on larger models provided by industry and other universities.
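The functional fault model idea (e.g., a bridging fault turning an OR gate into an AND gate) can be shown with a tiny sketch. The netlist, net names, and fault below are invented for illustration; the actual simulator handles mixed-language, multi-output models concurrently.

```python
from itertools import product

# Gate library and a two-gate toy netlist: net name -> (gate type, inputs).
# Primary inputs are 'a', 'b', 'c'; the netlist is listed in evaluation order.
GATES = {
    "AND": lambda a, b: a & b,
    "OR":  lambda a, b: a | b,
    "XOR": lambda a, b: a ^ b,
}
NETLIST = {
    "n1":  ("OR",  ("a", "b")),
    "out": ("XOR", ("n1", "c")),
}

def simulate(inputs, fault=None):
    values = dict(inputs)
    for net, (gtype, ins) in NETLIST.items():
        if fault and fault[0] == net:
            gtype = fault[1]          # functional fault: substitute behavior
        values[net] = GATES[gtype](values[ins[0]], values[ins[1]])
    return values["out"]

fault = ("n1", "AND")                 # bridging fault: the OR gate acts as AND
for a, b, c in product((0, 1), repeat=3):
    good = simulate({"a": a, "b": b, "c": c})
    bad = simulate({"a": a, "b": b, "c": c}, fault)
    if good != bad:
        print(f"inputs a={a} b={b} c={c}: fault observable (good={good}, faulty={bad})")
```

A concurrent fault simulator does the equivalent of this comparison for many faults in a single pass, sharing the fault-free computation and tracking only the divergent values.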
"No-Go Considerations" for In Situ Simulation Safety.
Bajaj, Komal; Minors, Anjoinette; Walker, Katie; Meguerdichian, Michael; Patterson, Mary
2018-06-01
In situ simulation is the practice of simulation in the actual clinical environment and has demonstrated utility in the assessment of system processes, identification of latent safety threats, and improvement of teamwork and communication. Nonetheless, performing simulated events in a real patient care setting poses potential risks to patient and staff safety. One integral aspect of a comprehensive approach to ensuring the safety of in situ simulation is the identification and establishment of "no-go considerations," that is, key decision-making considerations under which in situ simulations should be canceled, postponed, moved to another area, or rescheduled. These considerations should be modified and adjusted for specific clinical units. This article provides a framework of key essentials for developing no-go considerations.
A Probabilistic Framework for the Validation and Certification of Computer Simulations
NASA Technical Reports Server (NTRS)
Ghanem, Roger; Knio, Omar
2000-01-01
The paper presents a methodology for quantifying, propagating, and managing the uncertainty in the data required to initialize computer simulations of complex phenomena. The purpose of the methodology is to permit the quantitative assessment of a certification level to be associated with the predictions from the simulations, as well as the design of a data acquisition strategy to achieve a target level of certification. The value of a methodology that can address the above issues is obvious, especially in light of the trends in the availability of computational resources and in sensor technology. These two trends make it possible to probe physical phenomena, both with physical sensors and with complex models, at previously inconceivable levels. With these new abilities arises the need to develop the knowledge to integrate the information from sensors and computer simulations. This is achieved in the present work by tracing both activities back to a level of abstraction that highlights their commonalities, thus allowing them to be manipulated in a mathematically consistent fashion. In particular, the mathematical theory underlying computer simulations has long been associated with partial differential equations and functional analysis concepts such as Hilbert spaces and orthogonal projections. By relying on a probabilistic framework for the modeling of data, a Hilbert space framework emerges that permits the modeling of coefficients in the governing equations as random variables or, equivalently, as elements in a Hilbert space. This permits the development of an approximation theory for probabilistic problems that parallels deterministic approximation theory. According to this formalism, the solution of the problem is identified by its projection onto a basis in the Hilbert space of random variables, as opposed to more traditional techniques where the solution is approximated by its first- or second-order statistics. The present representation, in addition to capturing significantly more information than the traditional approach, facilitates the linkage between different interacting stochastic systems, as is typically observed in real-life situations.
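The Hilbert-space projection alluded to above is, in this line of work, commonly realized as a polynomial chaos expansion; a schematic statement (notation mine, not quoted from the paper) is:

```latex
u(\xi) \;=\; \sum_{i=0}^{\infty} u_i \,\Psi_i(\xi),
\qquad
u_i \;=\; \frac{\langle u, \Psi_i \rangle}{\langle \Psi_i^2 \rangle}
\;=\; \frac{\mathbb{E}\!\left[\, u(\xi)\,\Psi_i(\xi) \,\right]}
            {\mathbb{E}\!\left[\, \Psi_i(\xi)^2 \,\right]},
```

where the Psi_i are polynomials orthogonal with respect to the distribution of the random variables xi, and the solution u is identified by its coefficients u_i, its projection onto this basis, rather than by its first- or second-order statistics alone.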
Undermining and Strengthening Social Networks through Network Modification
Mellon, Jonathan; Yoder, Jordan; Evans, Daniel
2016-01-01
Social networks have well-documented effects at the individual and aggregate level. Consequently, it is often useful to understand how an attempt to influence a network will change its structure and consequently achieve other goals. We develop a framework for network modification that allows for arbitrary objective functions, types of modification (e.g., edge weight addition, edge weight removal, node removal, and covariate value change), and recovery mechanisms (i.e., how a network responds to interventions). The framework outlined in this paper both helps to situate the existing work on network interventions and opens up many new possibilities for intervening in networks. In particular, we use two case studies to highlight the potential impact of empirically calibrating the objective function and network recovery mechanisms, as well as showing how interventions beyond node removal can be optimised. First, we simulate an optimal removal of nodes from the Noordin terrorist network in order to reduce the expected number of attacks (based on empirically predicting the terrorist collaboration network from multiple types of network ties). Second, we simulate optimally strengthening ties within entrepreneurial ecosystems in six developing countries. In both cases we estimate ERGM models to simulate how a network will endogenously evolve after intervention. PMID:27703198
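A hedged sketch of the node-removal intervention idea follows. The objective used here (minimize the size of the largest connected component, a crude proxy for a network's capacity to coordinate) and the example graph are invented; the paper instead optimizes an empirically calibrated objective over ERGM-simulated network evolution.

```python
import networkx as nx

# Greedy node removal: at each step, remove the node whose deletion most
# improves the (assumed) objective, up to a fixed intervention budget.

def greedy_node_removal(G, budget, objective):
    removed = []
    H = G.copy()
    for _ in range(budget):
        best = min(H.nodes,
                   key=lambda v: objective(nx.restricted_view(H, [v], [])))
        H.remove_node(best)
        removed.append(best)
    return removed, objective(H)

def largest_component(G):
    return max((len(c) for c in nx.connected_components(G)), default=0)

G = nx.karate_club_graph()   # stand-in for an empirically observed network
removed, score = greedy_node_removal(G, budget=3, objective=largest_component)
print("remove nodes:", removed, "-> largest component size:", score)
```

In the paper's framework the objective function and the network's recovery after each removal would be supplied by fitted empirical models rather than fixed graph statistics, but the outer search has this same shape.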