Scott, J; Botsis, T; Ball, R
2014-01-01
Spontaneous Reporting Systems (SRS) are critical tools in the post-licensure evaluation of medical product safety. Regulatory authorities use a variety of data mining techniques to detect potential safety signals in SRS databases. Assessing the performance of such signal detection procedures requires simulated SRS databases, but the simulation strategies proposed to date each have limitations. We sought to develop a novel SRS simulation strategy based on plausible mechanisms for the growth of databases over time. We developed a simulation strategy based on the network principle of preferential attachment. We demonstrated how this strategy can be used to create simulations based on specific databases of interest, and provided an example of using such simulations to compare signal detection thresholds for a popular data mining algorithm. The preferential attachment simulations were generally structurally similar to our targeted SRS database, although they had fewer nodes of very high degree. The approach was able to generate signal-free SRS simulations as well as to mimic specific known true signals. Explorations of different reporting thresholds for the FDA Vaccine Adverse Event Reporting System suggested that a proportional reporting ratio (PRR) threshold of > 3.0 may yield better signal detection operating characteristics than the more commonly used PRR > 2.0 threshold. The network analytic approach to SRS simulation based on the principle of preferential attachment provides an attractive framework for exploring the performance of safety signal detection algorithms, and is potentially more principled and versatile than existing simulation approaches. The utility of network-based SRS simulations should be explored further by evaluating other types of simulated signals with a broader range of data mining approaches, and by comparing network-based simulations with other simulation strategies where applicable.
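The PRR statistic whose thresholds are compared above is computed from a standard 2x2 disproportionality table; a minimal sketch follows, with purely hypothetical report counts for illustration.

```python
def prr(a, b, c, d):
    """Proportional reporting ratio from a 2x2 disproportionality table.

    a: reports mentioning the product AND the event of interest
    b: reports mentioning the product, other events
    c: reports mentioning other products AND the event
    d: reports mentioning other products, other events
    """
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts, not taken from any real SRS database.
score = prr(a=30, b=970, c=400, d=98600)
flag_2 = score > 2.0   # the commonly used threshold
flag_3 = score > 3.0   # the stricter threshold examined in the abstract
```

With these counts the event appears in 3% of the product's reports versus about 0.4% of all other reports, so the score exceeds both thresholds.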
Computer-based simulation training in emergency medicine designed in the light of malpractice cases.
Karakuş, Akan; Duran, Latif; Yavuz, Yücel; Altintop, Levent; Calişkan, Fatih
2014-07-27
Using computer-based simulation systems in medical education is becoming more and more common. Although the benefits of practicing with these systems in medical education have been demonstrated, the advantages of using computer-based simulation in emergency medicine education are less well validated. The aim of the present study was to assess the success rates of final-year medical students in delivering emergency medical treatment and to evaluate the effectiveness of computer-based simulation training in improving final-year medical students' knowledge. Twenty-four students trained with computer-based simulation, completing at least 4 hours of simulation-based education between Feb 1, 2010 and May 1, 2010. A control group of traditionally trained students (n = 24) was also chosen. After training, students completed an examination covering 5 randomized medical simulation cases. Across the 5 cases, students trained with computer-based simulation carried out an average of 3.9 correct medical approaches, versus an average of 2.8 for the traditionally trained group (t = 3.90, p < 0.005). In cases requiring a complicated medical approach, the success of students trained with simulation was statistically higher than that of students who did not receive simulation training (p ≤ 0.05). Computer-based simulation training appears to be significantly effective for learning medical treatment algorithms, and such programs may improve students' success, especially in taking an adequate medical approach to complex emergency cases.
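The kind of two-group comparison reported above can be sketched with Welch's two-sample t statistic; the per-student scores below are hypothetical, chosen only to loosely resemble the reported group means, and are not the study's data.

```python
from math import sqrt
from statistics import mean, stdev

def welch_t(x, y):
    """Welch's two-sample t statistic (no equal-variance assumption)."""
    vx, vy = stdev(x) ** 2, stdev(y) ** 2
    return (mean(x) - mean(y)) / sqrt(vx / len(x) + vy / len(y))

# Hypothetical counts of correct approaches (out of 5) per student.
simulation_group = [4, 4, 5, 3, 4, 4, 5, 3, 4, 4, 3, 5,
                    4, 4, 3, 4, 5, 4, 3, 4, 4, 5, 4, 4]
control_group    = [3, 2, 3, 3, 2, 3, 4, 2, 3, 3, 2, 3,
                    3, 4, 2, 3, 3, 2, 3, 3, 2, 4, 3, 3]
t = welch_t(simulation_group, control_group)   # clearly positive here
```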
Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.
2012-01-01
An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities that are necessary for many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
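The core idea, writing a calculation once and letting operator overloading propagate extra quantities such as derivatives, can be illustrated with a minimal forward-mode "dual number" type. This is a language-neutral sketch of the concept in Python, not the C++ template machinery or the Trilinos implementation.

```python
from dataclasses import dataclass

@dataclass
class Dual:
    """A value and its derivative, propagated by operator overloading."""
    val: float
    der: float

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def residual(x):
    # A generic calculation written once, for any numeric type ...
    return x * x + 3.0 * x + 2.0

# ... yields both the value and the sensitivity when fed Dual numbers.
out = residual(Dual(2.0, 1.0))   # seed dx/dx = 1
# out.val is 12.0; out.der is 7.0, the derivative 2x + 3 at x = 2
```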
Tjiam, Irene M; Schout, Barbara M A; Hendrikx, Ad J M; Scherpbier, Albert J J M; Witjes, J Alfred; van Merriënboer, Jeroen J G
2012-01-01
Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance consisting partly of unconscious automatic processes that experts are not always able to explicate, simulator developers should collaborate with educational experts and physicians in developing efficient and effective training programmes. This article presents an approach to designing simulator-based skill training comprising cognitive task analysis integrated with instructional design according to the four-component/instructional design model. This theory-driven approach is illustrated by a description of how it was used in the development of simulator-based training for the nephrostomy procedure.
Multiple point statistical simulation using uncertain (soft) conditional data
NASA Astrophysics Data System (ADS)
Hansen, Thomas Mejer; Vu, Le Thanh; Mosegaard, Klaus; Cordua, Knud Skou
2018-05-01
Geostatistical simulation methods have been used to quantify the spatial variability of reservoir models since the 1980s. In the last two decades, state-of-the-art simulation methods have changed from being based on covariance-based two-point statistics to multiple-point statistics (MPS), which allow simulation of more realistic Earth structures. In addition, increasing amounts of geo-information (geophysical, geological, etc.) from multiple sources are being collected. This poses the problem of integrating these different sources of information, such that decisions related to reservoir models can be taken on as informed a basis as possible. In principle, though difficult in practice, this can be achieved using computationally expensive Monte Carlo methods. Here we investigate the use of sequential-simulation-based MPS methods conditional to uncertain (soft) data as a computationally efficient alternative. First, it is demonstrated that current implementations of sequential simulation based on MPS (e.g. SNESIM, ENESIM and Direct Sampling) do not properly account for uncertain conditional information, due to a combination of using only co-located information and a random simulation path. Then, we suggest two approaches that better account for the available uncertain information. The first makes use of a preferential simulation path, where more informed model parameters are visited preferentially to less informed ones. The second approach involves using non-co-located uncertain information. For different types of available data, these approaches are demonstrated to produce simulation results similar to those obtained by the general Monte Carlo based approach. These methods allow MPS simulation to condition properly to uncertain (soft) data, and hence provide a computationally attractive approach for integrating information about a reservoir model.
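The preferential-path idea can be sketched simply: rank grid cells by how informative their soft data are (for example, by the entropy of the local probability distribution) and simulate the best-informed cells first. The tiny two-facies grid and its probabilities below are hypothetical, purely for illustration.

```python
from math import log

def entropy(p):
    """Shannon entropy of a discrete soft-data distribution.
    Lower entropy means the cell is more informed."""
    return -sum(q * log(q) for q in p if q > 0.0)

# Hypothetical soft probabilities P(facies = 0), P(facies = 1) at four cells.
soft = {
    (0, 0): [0.95, 0.05],   # nearly certain
    (0, 1): [0.50, 0.50],   # uninformative
    (1, 0): [0.80, 0.20],
    (1, 1): [0.60, 0.40],
}

# Preferential path: visit the most informed (lowest-entropy) cells first,
# instead of the purely random path of classical sequential simulation.
path = sorted(soft, key=lambda cell: entropy(soft[cell]))
```

Sorting by entropy puts the near-certain cell first and the 50/50 cell last, so well-constrained cells anchor the simulation before ambiguous ones are drawn.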
Physics-based approach to haptic display
NASA Technical Reports Server (NTRS)
Brown, J. Michael; Colgate, J. Edward
1994-01-01
This paper addresses the implementation of complex multiple-degree-of-freedom virtual environments for haptic display. We suggest that a physics-based approach to rigid body simulation is appropriate for hand tool simulation, but that currently available simulation techniques are not sufficient to guarantee successful implementation. We discuss the desirable features of a virtual environment simulation, specifically highlighting the importance of stability guarantees.
Shorov, Andrey; Kotenko, Igor
2014-01-01
The paper outlines a bioinspired approach named "network nervous system" and methods of simulation of infrastructure attacks and protection mechanisms based on this approach. The protection mechanisms based on this approach consist of distributed procedures of information collection and processing, which coordinate the activities of the main devices of a computer network, identify attacks, and determine necessary countermeasures. Attacks and protection mechanisms are specified as structural models using a set-theoretic approach. An environment for simulation of protection mechanisms based on the biological metaphor is considered; the experiments demonstrating the effectiveness of the protection mechanisms are described.
Scalability of grid- and subbasin-based land surface modeling approaches for hydrologic simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tesfa, Teklu K.; Ruby Leung, L.; Huang, Maoyi
2014-03-27
This paper investigates the relative merits of grid- and subbasin-based land surface modeling approaches for hydrologic simulations, with a focus on their scalability (i.e., abilities to perform consistently across a range of spatial resolutions) in simulating runoff generation. Simulations produced by the grid- and subbasin-based configurations of the Community Land Model (CLM) are compared at four spatial resolutions (0.125°, 0.25°, 0.5° and 1°) over the topographically diverse region of the U.S. Pacific Northwest. Using the 0.125° resolution simulation as the "reference", statistical skill metrics are calculated and compared across simulations at 0.25°, 0.5° and 1° spatial resolutions of each modeling approach at basin and topographic region levels. Results suggest a significant scalability advantage for the subbasin-based approach compared to the grid-based approach for runoff generation. Basin-level annual average relative errors of surface runoff at 0.25°, 0.5°, and 1° compared to 0.125° are 3%, 4%, and 6% for the subbasin-based configuration and 4%, 7%, and 11% for the grid-based configuration, respectively. The scalability advantages of the subbasin-based approach are more pronounced during winter/spring and over mountainous regions. The source of runoff scalability is found to be related to the scalability of major meteorological and land surface parameters of runoff generation. More specifically, the subbasin-based approach is more consistent across spatial scales than the grid-based approach in snowfall/rainfall partitioning, which is related to air temperature and surface elevation. Scalability of a topographic parameter used in the runoff parameterization also contributes to improved scalability of the rain-driven saturated surface runoff component, particularly during winter. Hence this study demonstrates the importance of spatial structure for multi-scale modeling of hydrological processes, with implications for surface heat fluxes in coupled land-atmosphere modeling.
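The scalability metric used in such comparisons can be sketched as a relative error of coarse-resolution basin averages against the 0.125° reference. The runoff numbers below are illustrative placeholders, chosen only to reproduce the reported 3% and 4% subbasin-based errors, not model output.

```python
def relative_error(coarse, reference):
    """Relative error of a coarse-resolution basin average
    against the finest-resolution reference simulation."""
    return abs(coarse - reference) / reference

# Hypothetical basin-mean annual surface runoff (mm/yr).
reference = 250.0                       # 0.125 deg reference value
subbasin = {0.25: 257.5, 0.5: 260.0}    # illustrative coarse-resolution values
errors = {res: relative_error(v, reference) for res, v in subbasin.items()}
# errors maps resolution (deg) to relative error: 3% at 0.25, 4% at 0.5
```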
Bittig, Arne T; Uhrmacher, Adelinde M
2017-01-01
Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial stochastic simulation algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt their spatial resolution.
Simulation-Based Constructivist Approach for Education Leaders
ERIC Educational Resources Information Center
Shapira-Lishchinsky, Orly
2015-01-01
The purpose of this study was to reflect the leadership strategies that may arise using a constructivist approach based on organizational learning. This approach involved the use of simulations that focused on ethical tensions in school principals' daily experiences, and the development of codes of ethical conduct to reduce these tensions. The…
A fast method for optical simulation of flood maps of light-sharing detector modules.
Shi, Han; Du, Dong; Xu, JianFeng; Moses, William W; Peng, Qiyu
2015-12-01
Optical simulation at the detector module level is highly desired for Positron Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast number of photons need to be tracked. We present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up photon tracking in detector modules constructed with a polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was 200 to 600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and specular reflector materials.
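The geometric core of a simplified specular reflectance model is the mirror law, r = d - 2(d·n)n, applied each time a tracked photon hits a polished face. A minimal sketch of that single operation (not the authors' code):

```python
def reflect(d, n):
    """Specular reflection of direction d off a surface with unit normal n:
    r = d - 2 (d . n) n. This is the per-bounce operation a fast
    specular photon tracker applies at each polished crystal face."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

# A photon travelling down-and-right hits a horizontal face (normal +z):
r = reflect((0.6, 0.0, -0.8), (0.0, 0.0, 1.0))
# the z component flips sign while the tangential component is preserved
```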
NASA Astrophysics Data System (ADS)
Qi, Chenkun; Zhao, Xianchao; Gao, Feng; Ren, Anye; Hu, Yan
2016-11-01
The hardware-in-the-loop (HIL) contact simulation for flying objects in space is challenging due to the divergence caused by the time delay. In this study, a divergence compensation approach is proposed for the stiffness-varying discrete contact. The dynamic response delay of the motion simulator and the force measurement delay are considered. For the force measurement delay, a phase lead based force compensation approach is used. For the dynamic response delay of the motion simulator, a response error based force compensation approach is used, where the compensation force is obtained from the real-time identified contact stiffness and real-time measured position response error. The dynamic response model of the motion simulator is not required. The simulations and experiments show that the simulation divergence can be compensated effectively and satisfactorily by using the proposed approach.
Panchal, Mitesh B; Upadhyay, Sanjay H
2014-09-01
In this study, the feasibility of single-walled boron nitride nanotube (SWBNNT)-based biosensors is assessed using a continuum-modelling-based simulation approach, for mass-based detection of various bacteria and viruses. Various bacteria and viruses are considered as masses attached at the free end of the cantilevered SWBNNT acting as a biosensor. A resonant frequency shift-based analysis has been performed, with the adsorbed bacterium or virus treated as additional mass on the SWBNNT-based sensor system. A continuum-mechanics-based analytical approach accounting for effective wall thickness was used to validate the finite element method (FEM)-based simulation results, which rest on continuum volume-based modelling of the SWBNNT. The FEM-based simulation results are found to be in excellent agreement with the analytical results, supporting the analysis of SWBNNTs for a wide range of applications such as nanoresonators, biosensors, gas sensors and transducers. The results suggest that using smaller SWBNNTs enhances the sensitivity of the sensor system, and that detection of a bacterium or virus with a mass of 4.28 × 10⁻²⁴ kg can be performed effectively.
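The resonant-frequency-shift principle behind such mass sensors can be sketched with a lumped single-degree-of-freedom model, f = (1/2π)√(k/m). The stiffness and effective mass below are hypothetical placeholders rather than SWBNNT values; only the 4.28 × 10⁻²⁴ kg virus mass comes from the abstract.

```python
from math import pi, sqrt

def resonant_freq(k, m):
    """Fundamental frequency of a lumped resonator: f = (1/2pi) sqrt(k/m)."""
    return sqrt(k / m) / (2.0 * pi)

def added_mass(k, f0, f1):
    """Attached mass recovered from the downward shift f0 -> f1
    of the lumped model (inverse of the relation above)."""
    return k / (2.0 * pi) ** 2 * (1.0 / f1 ** 2 - 1.0 / f0 ** 2)

k = 1e-3        # N/m, hypothetical effective stiffness
m_eff = 1e-21   # kg, hypothetical effective resonator mass
f0 = resonant_freq(k, m_eff)                # unloaded frequency
f1 = resonant_freq(k, m_eff + 4.28e-24)     # loaded with the virus mass
dm = added_mass(k, f0, f1)                  # recovers ~4.28e-24 kg
```

The smaller the effective mass m_eff, the larger the frequency shift a given attached mass produces, which is the abstract's point about smaller SWBNNTs being more sensitive.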
Effect of different runway size on pilot performance during simulated night landing approaches.
DOT National Transportation Integrated Search
1981-02-01
In Experiment I, three pilots flew simulated approaches and landings in a fixed-base simulator with a computer-generated-image visual display. Practice approaches were flown with an 8,000-ft-long runway that was either 75, 150, or 300 ft wide; test a...
Matthew P. Thompson; Julie W. Gilbertson-Day; Joe H. Scott
2015-01-01
We develop a novel risk assessment approach that integrates complementary, yet distinct, spatial modeling approaches currently used in wildfire risk assessment. Motivation for this work stems largely from limitations of existing stochastic wildfire simulation systems, which can generate pixel-based outputs of fire behavior as well as polygon-based outputs of simulated...
NASA Technical Reports Server (NTRS)
Mckeown, W. L.
1984-01-01
A simulation experiment to explore the use of an augmented pictorial display to approach and land a helicopter in zero-visibility conditions was conducted in a fixed-base simulator. A literature search was also conducted to identify related work. A display was developed and pilot-in-the-loop evaluations were conducted. The pictorial display was a simulated, high-resolution radar image, augmented with various parameters to improve distance and motion cues. Approaches and landings were accomplished, but with higher workloads and less accuracy than necessary for a practical system. Recommendations are provided for display improvements and a follow-on simulation study in a moving-base simulator.
NASA Technical Reports Server (NTRS)
Grantham, William D.; Williams, Robert H.
1987-01-01
For the case of an approach-and-landing piloting task emphasizing response to the landing flare, pilot opinion and performance parameters derived from jet transport aircraft six-degree-of-freedom ground-based and in-flight simulators were compared in order to derive data for the flight-controls/flying-qualities engineers. The data thus obtained indicate that ground simulation results tend to be conservative, and that the effect of control sensitivity is more pronounced for ground simulation. The pilot also has a greater tendency to generate pilot-induced oscillation in ground-based simulation than in flight.
NASA Technical Reports Server (NTRS)
Grantham, W. D.; Nguyen, L. T.; Patton, J. M., Jr.; Deal, P. L.; Champine, R. A.; Carter, C. R.
1972-01-01
A fixed-base simulator study was conducted to determine the flight characteristics of a representative STOL transport having a high wing and equipped with an external-flow jet flap in combination with four high-bypass-ratio fan-jet engines during the approach and landing. Real-time digital simulation techniques were used. The computer was programmed with equations of motion for six degrees of freedom, and the aerodynamic inputs were based on measured wind-tunnel data. A visual display of a STOL airport was provided for simulation of the flare and touchdown characteristics. The primary piloting task was an instrument approach to a breakout at a 200-ft ceiling with a visual landing.
Effect of inlet modelling on surface drainage in coupled urban flood simulation
NASA Astrophysics Data System (ADS)
Jang, Jiun-Huei; Chang, Tien-Hao; Chen, Wei-Bo
2018-07-01
For a highly developed urban area with complete drainage systems, flood simulation is necessary for describing the flow dynamics from rainfall, to surface runoff, and to sewer flow. In this study, a coupled flood model based on diffusion wave equations was proposed to simulate one-dimensional sewer flow and two-dimensional overland flow simultaneously. The overland flow model provides details on the rainfall-runoff process to estimate the excess runoff that enters the sewer system through street inlets for sewer flow routing. Three types of inlet modelling are considered in this study, including the manhole-based approach that ignores the street inlets by draining surface water directly into manholes, the inlet-manhole approach that drains surface water into manholes that are each connected to multiple inlets, and the inlet-node approach that drains surface water into sewer nodes that are connected to individual inlets. The simulation results were compared with a high-intensity rainstorm event that occurred in 2015 in Taipei City. In the verification of the maximum flood extent, the two approaches that considered street inlets performed considerably better than that without street inlets. When considering the aforementioned models in terms of temporal flood variation, using manholes as receivers leads to an overall inefficient draining of the surface water either by the manhole-based approach or by the inlet-manhole approach. Using the inlet-node approach is more reasonable than using the inlet-manhole approach because the inlet-node approach greatly reduces the fluctuation of the sewer water level. The inlet-node approach is more efficient in draining surface water by reducing flood volume by 13% compared with the inlet-manhole approach and by 41% compared with the manhole-based approach. The results show that inlet modeling has a strong influence on drainage efficiency in coupled flood simulation.
NASA Technical Reports Server (NTRS)
Shackelford, John H.; Saugen, John D.; Wurst, Michael J.; Adler, James
1991-01-01
A generic planar three-degree-of-freedom simulation was developed that supports hardware-in-the-loop simulations and guidance and control analysis, and can directly generate flight software. This simulation was developed in a short time using rapid prototyping techniques. The approach taken to develop this simulation tool, the benefits seen using this approach to development, and ongoing efforts to improve and extend this capability are described. The simulation is composed of three major elements: (1) the Docker dynamics model, (2) the Dockee dynamics model, and (3) the Docker control system. The docker and dockee models are based on simple planar orbital dynamics equations using a spherical-Earth gravity model. The docker control system is based on a phase-plane approach to error correction.
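A sketch of the kind of planar orbital dynamics such docker/dockee models integrate: point-mass spherical-Earth gravity, a = -μr/|r|³, stepped with classical RK4. The orbit radius and step size below are illustrative, not values from the paper.

```python
MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def accel(x, y):
    """Planar spherical-Earth point-mass gravity: a = -mu * r / |r|^3."""
    r3 = (x * x + y * y) ** 1.5
    return -MU * x / r3, -MU * y / r3

def step(state, dt):
    """One classical RK4 step of the planar two-body equations.
    state = (x, y, vx, vy)."""
    def deriv(s):
        x, y, vx, vy = s
        ax, ay = accel(x, y)
        return (vx, vy, ax, ay)
    k1 = deriv(state)
    k2 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = deriv(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Circular low orbit: radius r0, circular speed sqrt(mu / r0).
r0 = 6.778e6
state = (r0, 0.0, 0.0, (MU / r0) ** 0.5)
for _ in range(600):          # propagate 10 minutes at dt = 1 s
    state = step(state, 1.0)
```

For a circular orbit the radius should stay essentially constant, which gives an easy sanity check on the integrator.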
Exact Hybrid Particle/Population Simulation of Rule-Based Models of Biochemical Systems
Hogg, Justin S.; Harris, Leonard A.; Stover, Lori J.; Nair, Niketh S.; Faeder, James R.
2014-01-01
Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This “network-free” approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of “partial network expansion” into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. 
Performance tests show that significant memory savings can be achieved using the new approach, and a monetary cost analysis provides a practical measure of its utility.
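The "network-based" route mentioned above, expanding rules into a fully enumerated reaction network and simulating it stochastically, can be sketched with Gillespie's direct method. The dimerisation system and rate constant below are hypothetical, and this is a generic sketch, not BioNetGen/NFsim code.

```python
import random

def gillespie(stoich, propensity, x, t_end, rng=random.Random(1)):
    """Direct-method stochastic simulation of an enumerated reaction network.
    stoich: list of state-change vectors, one per reaction;
    propensity: function mapping state -> list of reaction rates."""
    t, traj = 0.0, [(0.0, tuple(x))]
    while t < t_end:
        rates = propensity(x)
        total = sum(rates)
        if total == 0.0:          # no reaction can fire
            break
        t += rng.expovariate(total)        # exponential waiting time
        r = rng.uniform(0.0, total)        # pick a reaction by its rate
        i = 0
        while r > rates[i]:
            r -= rates[i]
            i += 1
        x = [xi + si for xi, si in zip(x, stoich[i])]
        traj.append((t, tuple(x)))
    return traj

# Irreversible dimerisation 2A -> B with a hypothetical rate constant.
traj = gillespie(stoich=[(-2, 1)],
                 propensity=lambda x: [0.01 * x[0] * (x[0] - 1) / 2.0],
                 x=[100, 0], t_end=1e4)
```

Every trajectory conserves the total monomer count A + 2B, which makes a convenient correctness check.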
Quantum Fragment Based ab Initio Molecular Dynamics for Proteins.
Liu, Jinfeng; Zhu, Tong; Wang, Xianwei; He, Xiao; Zhang, John Z H
2015-12-08
Developing ab initio molecular dynamics (AIMD) methods for practical application to protein dynamics is of significant interest. Due to the large size of biomolecules, applying standard quantum chemical methods to compute energies for dynamic simulation is computationally prohibitive. In this work, a fragment-based ab initio molecular dynamics approach is presented for practical application in protein dynamics studies. In this approach, the energy and forces of the protein are calculated by a recently developed electrostatically embedded generalized molecular fractionation with conjugate caps (EE-GMFCC) method. For simulation in explicit solvent, mechanical embedding is introduced to treat protein interaction with explicit water molecules. This AIMD approach has been applied to MD simulations of the small benchmark protein Trp-cage (with 20 residues and 304 atoms) in both the gas phase and in solution. Comparison with the simulation result using the AMBER force field shows that AIMD gives a more stable protein structure, indicating that the quantum chemical energy is more reliable. Importantly, the present fragment-based AIMD simulation captures quantum effects, including electrostatic polarization and charge transfer, that are missing in standard classical MD simulations. The current approach is linear-scaling, trivially parallel, and applicable to performing AIMD simulations of proteins of large size.
Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.
Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus
2017-01-01
Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.
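The waveform-relaxation idea mentioned above can be sketched for two coupled rate units: each sweep integrates every unit against the other's waveform from the previous sweep, so full waveforms are exchanged only once per sweep instead of at every time step. The model, weights, and forward-Euler scheme below are illustrative, not the NEST implementation.

```python
from math import tanh

def wfr(w12, w21, r0, dt=0.01, steps=200, sweeps=20):
    """Jacobi waveform relaxation for two coupled rate units
    dr_i/dt = -r_i + tanh(w_ij * r_j). Each sweep re-integrates both
    units over the whole window using the other unit's waveform from
    the previous sweep (illustrative sketch)."""
    r1 = [r0[0]] * (steps + 1)   # initial guess: constant waveforms
    r2 = [r0[1]] * (steps + 1)
    for _ in range(sweeps):
        n1, n2 = [r0[0]], [r0[1]]
        for k in range(steps):
            # Forward Euler, coupled to the previous sweep's waveforms.
            n1.append(n1[k] + dt * (-n1[k] + tanh(w12 * r2[k])))
            n2.append(n2[k] + dt * (-n2[k] + tanh(w21 * r1[k])))
        r1, r2 = n1, n2
    return r1, r2

r1, r2 = wfr(w12=0.5, w21=0.5, r0=(1.0, 0.0))
# Both waveforms relax toward the weak mutual fixed point below 1.0.
```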
GridLAB-D: An Agent-Based Simulation Framework for Smart Grids
Chassin, David P.; Fuller, Jason C.; Djilali, Ned
2014-01-01
Simulation of smart grid technologies requires a fundamentally new approach to integrated modeling of power systems, energy markets, building technologies, and the plethora of other resources and assets that are becoming part of modern electricity production, delivery, and consumption systems. As a result, the US Department of Energy’s Office of Electricity commissioned the development of a new type of power system simulation tool called GridLAB-D that uses an agent-based approach to simulating smart grids. This paper presents the numerical methods and approach to time-series simulation used by GridLAB-D and reviews applications in power system studies, market design, building control system design, and integration of wind power in a smart grid.
NASA Astrophysics Data System (ADS)
Durand, Olivier; Soulard, Laurent; Jaouen, Stephane; Heuze, Olivier; Colombet, Laurent; Cieren, Emmanuel
2017-06-01
We compare, at similar scales, the processes of microjetting and ejecta production from shocked roughened metal surfaces by using atomistic and continuous approaches. The atomistic approach is based on very large scale molecular dynamics (MD) simulations. The continuous approach is based on Eulerian hydrodynamics simulations with adaptive mesh refinement; the simulations take into account the effects of viscosity and surface tension, and they use an equation of state calculated from the MD simulations. The microjetting is generated by shock-loading above its fusion point a three-dimensional tin crystal with an initial sinusoidal free surface perturbation, the crystal being set in contact with a vacuum. Several samples with homothetic wavelengths and amplitudes of defect are simulated in order to investigate the influence of the viscosity and surface tension of the metal. The simulations show that the hydrodynamic code reproduces, with very good agreement, the distributions of ejected mass and velocity along the jet calculated from the MD simulations. Both codes also exhibit a similar phenomenology of fragmentation of the ejected metallic liquid sheets.
NASA Technical Reports Server (NTRS)
Grantham, W. D.; Deal, P. L.
1974-01-01
A fixed-base simulator study was conducted to determine the minimum acceptable level of longitudinal stability for a representative turbofan STOL (short take-off and landing) transport airplane during the landing approach. Real-time digital simulation techniques were used. The computer was programmed with equations of motion for six degrees of freedom, and the aerodynamic inputs were based on measured wind-tunnel data. The primary piloting task was an instrument approach to a breakout at a 60-m (200-ft) ceiling.
Chung, Christopher A; Alfred, Michael
2009-06-01
Societal pressures, accreditation organizations, and licensing agencies are emphasizing the importance of ethics in the engineering curriculum. Traditionally, this subject has been taught using dogma, heuristics, and case study approaches. Most recently, a number of organizations have sought to increase the utility of these approaches by utilizing the Internet. Resources from these organizations include on-line courses and tests, videos, and DVDs. While these individual approaches provide a foundation on which to base engineering ethics, they may be limited in developing a student's ability to identify, analyze, and respond to engineering ethics situations outside of the classroom environment. More effective approaches utilize a combination of these types of approaches. This paper describes the design and development of an Internet-based interactive Simulator for Engineering Ethics Education. The simulator places students in first person perspective scenarios involving different types of ethical situations. Students must gather data, assess the situation, and make decisions. This requires students to develop their own ability to identify and respond to ethical engineering situations. A limited comparison between the Internet-based interactive simulator and conventional web-based instruction indicates a statistically significant improvement of 32% in instructional effectiveness. The simulator is currently being used at the University of Houston to help fulfill ABET requirements.
A fast method for optical simulation of flood maps of light-sharing detector modules
Shi, Han; Du, Dong; Xu, JianFeng; ...
2015-09-03
Optical simulation of the detector module level is highly desired for Positron Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast amount of photons need to be tracked. Here, we present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We also simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was more than 200–600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials.
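The core geometric rule that a simplified specular-reflectance tracker applies at every polished crystal face is the mirror-reflection formula r = d - 2(d·n)n. A minimal sketch (vectors are illustrative; the paper's structured light-tracking algorithm is far more involved):

```python
import numpy as np

def reflect(direction, normal):
    """Specular reflection of a unit direction vector off a surface
    with unit normal: r = d - 2 (d . n) n."""
    d = np.asarray(direction, dtype=float)
    n = np.asarray(normal, dtype=float)
    return d - 2.0 * np.dot(d, n) * n

# a photon travelling down-and-right bounces off a horizontal face
r = reflect([1.0, -1.0, 0.0], [0.0, 1.0, 0.0])
```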
HuPSON: the human physiology simulation ontology.
Gündel, Michaela; Younesi, Erfan; Malhotra, Ashutosh; Wang, Jiali; Li, Hui; Zhang, Bijun; de Bono, Bernard; Mevissen, Heinz-Theodor; Hofmann-Apitius, Martin
2013-11-22
Large biomedical simulation initiatives, such as the Virtual Physiological Human (VPH), are substantially dependent on controlled vocabularies to facilitate the exchange of information, of data and of models. Hindering these initiatives is a lack of a comprehensive ontology that covers the essential concepts of the simulation domain. We propose a first version of a newly constructed ontology, HuPSON, as a basis for shared semantics and interoperability of simulations, of models, of algorithms and of other resources in this domain. The ontology is based on the Basic Formal Ontology, and adheres to the MIREOT principles; the constructed ontology has been evaluated via structural features, competency questions and use case scenarios. The ontology is freely available at: http://www.scai.fraunhofer.de/en/business-research-areas/bioinformatics/downloads.html (owl files) and http://bishop.scai.fraunhofer.de/scaiview/ (browser). HuPSON provides a framework for a) annotating simulation experiments, b) retrieving relevant information that is required for modelling, c) enabling interoperability of algorithmic approaches used in biomedical simulation, d) comparing simulation results and e) linking knowledge-based approaches to simulation-based approaches. It is meant to foster a more rapid uptake of semantic technologies in the modelling and simulation domain, with particular focus on the VPH domain.
A Component-Based Extension Framework for Large-Scale Parallel Simulations in NEURON
King, James G.; Hines, Michael; Hill, Sean; Goodman, Philip H.; Markram, Henry; Schürmann, Felix
2008-01-01
As neuronal simulations approach larger scales with increasing levels of detail, the neurosimulator software represents only a part of a chain of tools ranging from setup, simulation, interaction with virtual environments to analysis and visualizations. Previously published approaches to abstracting simulator engines have not received widespread acceptance, which in part may be due to the fact that they tried to address the challenge of solving the model specification problem. Here, we present an approach that uses a neurosimulator, in this case NEURON, to describe and instantiate the network model in the simulator's native model language but then replaces the main integration loop with its own. Existing parallel network models are easily adapted to run in the presented framework. The presented approach is thus an extension to NEURON but uses a component-based architecture to allow for replaceable spike exchange components and pluggable components for monitoring, analysis, or control that can run in this framework alongside with the simulation. PMID:19430597
Kittipittayakorn, Cholada; Ying, Kuo-Ching
2016-01-01
Many hospitals are currently paying more attention to patient satisfaction since it is an important service quality index. Many Asian countries' healthcare systems have a mixed-type registration, accepting both walk-in patients and scheduled patients. This complex registration system causes a long patient waiting time in outpatient clinics. Different approaches have been proposed to reduce the waiting time. This study uses the integration of discrete event simulation (DES) and agent-based simulation (ABS) to improve patient waiting time and is the first attempt to apply this approach to solve this key problem faced by orthopedic departments. From the data collected, patient behaviors are modeled and incorporated into a massive agent-based simulation. The proposed approach is an aid for analyzing and modifying orthopedic department processes, allows us to consider far more details, and provides more reliable results. After applying the proposed approach, the total waiting time of the orthopedic department fell from 1246.39 minutes to 847.21 minutes. Thus, using the correct simulation model significantly reduces patient waiting time in an orthopedic department.
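The waiting-time accounting at the heart of any clinic queueing model reduces, in its simplest form, to a single-server FIFO recurrence: each patient starts service at the later of their arrival time and the server's free time. A toy sketch (arrival and service times are invented, and the paper's DES/ABS hybrid is of course far richer):

```python
def average_wait(arrivals, service_time):
    """Single-server FIFO clinic: 'arrivals' is a sorted list of arrival
    times (walk-in and scheduled mixed); returns the mean waiting time.
    start_i = max(arrival_i, time server becomes free)."""
    server_free = 0.0
    total_wait = 0.0
    for t in arrivals:
        start = max(t, server_free)
        total_wait += start - t
        server_free = start + service_time
    return total_wait / len(arrivals)

# four patients, 3-minute consultations (toy numbers)
w = average_wait([0, 1, 2, 10], service_time=3)
```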
Vibronic coupling simulations for linear and nonlinear optical processes: Simulation results
NASA Astrophysics Data System (ADS)
Silverstein, Daniel W.; Jensen, Lasse
2012-02-01
A vibronic coupling model based on a time-dependent wavepacket approach is applied to simulate linear optical processes, such as one-photon absorbance and resonance Raman scattering, and nonlinear optical processes, such as two-photon absorbance and resonance hyper-Raman scattering, on a series of small molecules. Simulations employing both the long-range corrected approach in density functional theory and coupled cluster theory are compared, and also examined against available experimental data. Although many of the small molecules are prone to anharmonicity in their potential energy surfaces, the harmonic approach performs adequately. A detailed discussion of non-Condon effects is illustrated by the molecules presented in this work. Linear and nonlinear Raman scattering simulations allow for the quantification of interference between the Franck-Condon and Herzberg-Teller terms for different molecules.
NASA Astrophysics Data System (ADS)
Durand, O.; Jaouen, S.; Soulard, L.; Heuzé, O.; Colombet, L.
2017-10-01
We compare, at similar scales, the processes of microjetting and ejecta production from shocked roughened metal surfaces by using atomistic and continuous approaches. The atomistic approach is based on very large scale molecular dynamics (MD) simulations with systems containing up to 700 × 106 atoms. The continuous approach is based on Eulerian hydrodynamics simulations with adaptive mesh refinement; the simulations take into account the effects of viscosity and surface tension, and the equation of state is calculated from the MD simulations. The microjetting is generated by shock-loading above its fusion point a three-dimensional tin crystal with an initial sinusoidal free surface perturbation, the crystal being set in contact with a vacuum. Several samples with homothetic wavelengths and amplitudes of defect are simulated in order to investigate the influence of viscosity and surface tension of the metal. The simulations show that the hydrodynamic code reproduces with very good agreement the profiles, calculated from the MD simulations, of the ejected mass and velocity along the jet. Both codes also exhibit a similar fragmentation phenomenology of the ejected metallic liquid sheets, although the fragmentation seed is different; we show, in particular, that it depends on the mesh size in the continuous approach.
ERIC Educational Resources Information Center
Zhang, Jianwei; Chen, Qi; Sun, Yanquing; Reid, David J.
2004-01-01
Learning support studies involving simulation-based scientific discovery learning have tended to adopt an ad hoc strategies-oriented approach in which the support strategies are typically pre-specified according to learners' difficulties in particular activities. This article proposes a more integrated approach, a triple scheme for learning…
Simulation-Based Approach for Site-Specific Optimization of Hydrokinetic Turbine Arrays
NASA Astrophysics Data System (ADS)
Sotiropoulos, F.; Chawdhary, S.; Yang, X.; Khosronejad, A.; Angelidis, D.
2014-12-01
A simulation-based approach has been developed to enable site-specific optimization of tidal and current turbine arrays in real-life waterways. The computational code is based on the St. Anthony Falls Laboratory Virtual StreamLab (VSL3D), which is able to carry out high-fidelity simulations of turbulent flow and sediment transport processes in rivers and streams taking into account the arbitrary geometrical complexity characterizing natural waterways. The computational framework can be used either in turbine-resolving mode, to take into account all geometrical details of the turbine, or with the turbines parameterized as actuator disks or actuator lines. Locally refined grids are employed to dramatically increase the resolution of the simulation and enable efficient simulations of multi-turbine arrays. Turbine/sediment interactions are simulated using the coupled hydro-morphodynamic module of VSL3D. The predictive capabilities of the resulting computational framework will be demonstrated by applying it to simulate turbulent flow past a tri-frame configuration of hydrokinetic turbines in a rigid-bed turbulent open channel flow as well as turbines mounted on mobile bed open channels to investigate turbine/sediment interactions. The utility of the simulation-based approach for guiding the optimal development of turbine arrays in real-life waterways will also be discussed and demonstrated. This work was supported by NSF grant IIP-1318201. Simulations were carried out at the Minnesota Supercomputing Institute.
Computational studies of physical properties of Nb-Si based alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouyang, Lizhi
2015-04-16
The overall goal is to provide physical properties data supplementing experiments for thermodynamic modeling and other simulations, such as phase-field simulations for microstructure and continuum simulations for mechanical properties. These predictive computational modeling and simulation efforts may yield insights that can be used to guide materials design, processing, and manufacture. Ultimately, they may lead to a usable Nb-Si based alloy, which could play an important role in the current push towards greener energy. The main objectives of the proposed projects are: (1) developing a first-principles supercell approach for calculating thermodynamic and mechanical properties of ordered crystals and disordered lattices, including solid solutions; (2) application of the supercell approach to Nb-Si based alloys to compute physical properties data that can be used for thermodynamic modeling and other simulations to guide the optimal design of Nb-Si based alloys.
Enhancing Students' Employability through Business Simulation
ERIC Educational Resources Information Center
Avramenko, Alex
2012-01-01
Purpose: The purpose of this paper is to introduce an approach to business simulation with less dependence on business simulation software to provide innovative work experience within a programme of study, to boost students' confidence and employability. Design/methodology/approach: The paper is based on analysis of existing business simulation…
Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment
NASA Astrophysics Data System (ADS)
Zeigler, Bernard P.; Lee, J. S.
1998-08-01
In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
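The quantization idea above, generating a state update only when the signal crosses a quantum boundary, can be sketched in a few lines. The sample values and quantum size below are invented for illustration:

```python
def quantized_updates(samples, quantum):
    """Report a state update only when the signal has moved at least one
    quantum away from the last transmitted value; all other samples are
    suppressed, which is how quantization reduces message traffic."""
    updates = []
    last = None
    for t, x in enumerate(samples):
        if last is None or abs(x - last) >= quantum:
            last = x
            updates.append((t, x))
    return updates

# six samples collapse to three transmitted updates
msgs = quantized_updates([0.0, 0.04, 0.11, 0.12, 0.25, 0.26], quantum=0.1)
```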
NASA Astrophysics Data System (ADS)
Chakraborty, S.; Dasgupta, A.; Das, R.; Kar, M.; Kundu, A.; Sarkar, C. K.
2017-12-01
In this paper, we explore the possibility of mapping devices designed in a TCAD environment to their modeled versions developed in the Cadence Virtuoso environment using a look-up table (LUT) approach. Circuit simulation of newly designed devices in a TCAD environment is a very slow and tedious process involving complex scripting. Hence, the LUT-based modeling approach has been proposed as a faster and easier alternative in the Cadence environment. The LUTs are prepared by extracting data from the device characteristics obtained from device simulation in TCAD. A comparative study is shown between the TCAD simulation and the LUT-based alternative to showcase the accuracy of the modeled devices. Finally, the look-up table approach is used to evaluate the performance of circuits implemented using a 14 nm nMOSFET.
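An LUT device model of this kind boils down to interpolating a table of extracted I-V points at the bias the circuit simulator requests. A minimal bilinear-interpolation sketch; the grids and current values are made-up placeholders, not 14 nm device data:

```python
import numpy as np

def lut_current(vgs, vds, vgs_grid, vds_grid, ids_table):
    """Bilinear interpolation into a TCAD-extracted I-V look-up table:
    locate the bracketing grid cell, then blend its four corner values."""
    i = int(np.clip(np.searchsorted(vgs_grid, vgs) - 1, 0, len(vgs_grid) - 2))
    j = int(np.clip(np.searchsorted(vds_grid, vds) - 1, 0, len(vds_grid) - 2))
    tx = (vgs - vgs_grid[i]) / (vgs_grid[i + 1] - vgs_grid[i])
    ty = (vds - vds_grid[j]) / (vds_grid[j + 1] - vds_grid[j])
    return ((1 - tx) * (1 - ty) * ids_table[i, j]
            + tx * (1 - ty) * ids_table[i + 1, j]
            + (1 - tx) * ty * ids_table[i, j + 1]
            + tx * ty * ids_table[i + 1, j + 1])

# 2x2 toy table (currents in mA); query the cell centre
vg = np.array([0.0, 1.0])
vd = np.array([0.0, 1.0])
ids = np.array([[0.0, 0.0], [0.0, 1.0]])
i_mid = lut_current(0.5, 0.5, vg, vd, ids)
```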
Pilot estimates of glidepath and aim point during simulated landing approaches
NASA Technical Reports Server (NTRS)
Acree, C. W., Jr.
1981-01-01
Pilot perceptions of glidepath angle and aim point were measured during simulated landings. A fixed-base cockpit simulator was used with video recordings of simulated landing approaches shown on a video projector. Pilots estimated the magnitudes of approach errors during observation without attempting to make corrections. Pilots estimated glidepath angular errors well, but had difficulty estimating aim-point errors. The data make plausible the hypothesis that pilots are little concerned with aim point during most of an approach, concentrating instead on keeping close to the nominal glidepath and trusting this technique to guide them to the proper touchdown point.
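The glidepath geometry underlying such estimates is simple trigonometry: the angle implied by the current altitude and ground distance to the aim point, compared against a nominal path (commonly 3 degrees). The numbers below are illustrative, not from the study:

```python
import math

def glidepath_angle_deg(altitude_m, ground_distance_m):
    """Glidepath angle implied by altitude above the aim point and
    horizontal distance to it; purely geometric."""
    return math.degrees(math.atan2(altitude_m, ground_distance_m))

# at 60 m altitude and ~1145 m from the aim point, the path is ~3 deg
err = glidepath_angle_deg(60.0, 1145.0) - 3.0
```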
Model-Based GN and C Simulation and Flight Software Development for Orion Missions beyond LEO
NASA Technical Reports Server (NTRS)
Odegard, Ryan; Milenkovic, Zoran; Henry, Joel; Buttacoli, Michael
2014-01-01
For Orion missions beyond low Earth orbit (LEO), the Guidance, Navigation, and Control (GN&C) system is being developed using a model-based approach for simulation and flight software. Lessons learned from the development of GN&C algorithms and flight software for the Orion Exploration Flight Test One (EFT-1) vehicle have been applied to the development of further capabilities for Orion GN&C beyond EFT-1. Continuing the use of a Model-Based Development (MBD) approach with the Matlab®/Simulink® tool suite, the process for GN&C development and analysis has been largely improved. Furthermore, a model-based simulation environment in Simulink, rather than an external C-based simulation, greatly eases the process for development of flight algorithms. The benefits seen by employing lessons learned from EFT-1 are described, as well as the approach for implementing additional MBD techniques. Also detailed are the key enablers for improvements to the MBD process, including enhanced configuration management techniques for model-based software systems, automated code and artifact generation, and automated testing and integration.
HuPSON: the human physiology simulation ontology
2013-01-01
Background Large biomedical simulation initiatives, such as the Virtual Physiological Human (VPH), are substantially dependent on controlled vocabularies to facilitate the exchange of information, of data and of models. Hindering these initiatives is a lack of a comprehensive ontology that covers the essential concepts of the simulation domain. Results We propose a first version of a newly constructed ontology, HuPSON, as a basis for shared semantics and interoperability of simulations, of models, of algorithms and of other resources in this domain. The ontology is based on the Basic Formal Ontology, and adheres to the MIREOT principles; the constructed ontology has been evaluated via structural features, competency questions and use case scenarios. The ontology is freely available at: http://www.scai.fraunhofer.de/en/business-research-areas/bioinformatics/downloads.html (owl files) and http://bishop.scai.fraunhofer.de/scaiview/ (browser). Conclusions HuPSON provides a framework for a) annotating simulation experiments, b) retrieving relevant information that are required for modelling, c) enabling interoperability of algorithmic approaches used in biomedical simulation, d) comparing simulation results and e) linking knowledge-based approaches to simulation-based approaches. It is meant to foster a more rapid uptake of semantic technologies in the modelling and simulation domain, with particular focus on the VPH domain. PMID:24267822
Accelerated rescaling of single Monte Carlo simulation runs with the Graphics Processing Unit (GPU).
Yang, Owen; Choi, Bernard
2013-01-01
To interpret fiber-based and camera-based measurements of remitted light from biological tissues, researchers typically use analytical models, such as the diffusion approximation to light transport theory, or stochastic models, such as Monte Carlo modeling. To achieve rapid (ideally real-time) measurement of tissue optical properties, especially in clinical situations, there is a critical need to accelerate Monte Carlo simulation runs. In this manuscript, we report on our approach of using the Graphics Processing Unit (GPU) to accelerate rescaling of single Monte Carlo runs to rapidly calculate diffuse reflectance values for different sets of tissue optical properties. We selected MATLAB to enable non-specialists in C and CUDA-based programming to use the generated open-source code. We developed a software package with four abstraction layers. To calculate a set of diffuse reflectance values from a simulated tissue with homogeneous optical properties, our rescaling GPU-based approach achieves a reduction in computation time of several orders of magnitude as compared to other GPU-based approaches. Specifically, our GPU-based approach generated a diffuse reflectance value in 0.08 ms. The transfer time from CPU to GPU memory is currently a limiting factor with GPU-based calculations. However, for calculation of multiple diffuse reflectance values, our GPU-based approach still can lead to processing that is ~3400 times faster than other GPU-based approaches.
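The rescaling trick itself, often called "white Monte Carlo", attenuates each detected photon of a single baseline run by Beer-Lambert absorption along its recorded path length, so one simulation serves many absorption coefficients. A CPU-side sketch with invented path lengths (the paper's contribution is doing this on the GPU):

```python
import numpy as np

def rescale_reflectance(path_lengths_mm, weights, mu_a_per_mm):
    """Rescale one baseline Monte Carlo run to a new absorption
    coefficient: each detected photon's weight is attenuated by
    exp(-mu_a * L) over its recorded path length L, and the mean
    detected weight approximates diffuse reflectance."""
    w = np.asarray(weights) * np.exp(-mu_a_per_mm * np.asarray(path_lengths_mm))
    return w.sum() / len(weights)

# three detected photons from a toy run, unit launch weight
paths = [1.0, 2.0, 4.0]
R0 = rescale_reflectance(paths, [1.0, 1.0, 1.0], mu_a_per_mm=0.0)
R1 = rescale_reflectance(paths, [1.0, 1.0, 1.0], mu_a_per_mm=0.1)
```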
OneSAF as an In-Stride Mission Command Asset
2014-06-01
Keywords: Mission Command (MC), Modeling and Simulation (M&S), Distributed Interactive Simulation (DIS). ABSTRACT: To provide greater interoperability and integration within Mission Command (MC) Systems, the One Semi-Automated Forces (OneSAF) entity level simulation is evolving from a tightly coupled client server implementation approach. While DARPA began with a funded project to complete the capability as a “big bang” approach, the approach here is based on reuse and...
Characterizing MPI matching via trace-based simulation
Ferreira, Kurt Brian; Levy, Scott Larson Nicoll; Pedretti, Kevin; ...
2017-01-01
With the increased scale expected on future leadership-class systems, detailed information about the resource usage and performance of MPI message matching provides important insights into how to maintain application performance on next-generation systems. However, obtaining MPI message matching performance data is often not possible without significant effort. A common approach is to instrument an MPI implementation to collect relevant statistics. While this approach can provide important data, collecting matching data at runtime perturbs the application's execution, including its matching performance, and is highly dependent on the MPI library's matchlist implementation. In this paper, we introduce a trace-based simulation approach to obtain detailed MPI message matching performance data for MPI applications without perturbing their execution. Using a number of key parallel workloads, we demonstrate that this simulator approach can rapidly and accurately characterize matching behavior. Specifically, we use our simulator to collect several important statistics about the operation of the MPI posted and unexpected queues. For example, we present data about search lengths and the duration that messages spend in the queues waiting to be matched. Here, data gathered using this simulation-based approach have significant potential to aid hardware designers in determining resource allocation for MPI matching functions and provide application and middleware developers with insight into the scalability issues associated with MPI message matching.
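The posted/unexpected queue mechanics whose statistics such a simulator gathers can be modeled in a few lines: an incoming message linearly searches the posted-receive queue for a (source, tag) match (with -1 standing in for the MPI wildcards), and joins the unexpected queue on a miss. A toy sketch, not the authors' simulator:

```python
from collections import deque

def match(posted, unexpected, msg):
    """Search the posted-receive queue for a (source, tag) match,
    counting traversed entries (the 'search length' statistic).
    Unmatched messages are appended to the unexpected queue."""
    searched = 0
    for i, (src, tag) in enumerate(posted):
        searched += 1
        if src in (msg[0], -1) and tag in (msg[1], -1):
            del posted[i]          # receive is consumed
            return searched, True
    unexpected.append(msg)
    return searched, False

# three posted receives; the last uses a wildcard source
posted = [(0, 7), (1, 3), (-1, 5)]
unexpected = deque()
n, hit = match(posted, unexpected, (2, 5))
```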
NASA Technical Reports Server (NTRS)
Ellis, D. R.; Raisinghani, S. C.
1979-01-01
A six-degree-of-freedom variable-response research aircraft was used to determine the minimum lateral-directional control power required for desirable and acceptable levels of handling qualities for the STOL landing approach task in a variety of simulated atmospheric disturbance conditions for a range of lateral-directional response characteristics. Topics covered include the in-flight simulator, crosswind simulation, turbulence simulation, test configurations, and evaluation procedures. Conclusions based on a limited sampling of simulated STOL transport configurations flown to touchdown out of 6 deg, 75 kt MLS approaches, usually with a sidestep maneuver are discussed.
Cárdenas-García, Maura; González-Pérez, Pedro Pablo
2013-04-11
Apoptotic cell death plays a crucial role in development and homeostasis. This process is driven by mitochondrial permeabilization and activation of caspases. In this paper we adopt a tuple spaces-based modelling and simulation approach, and show how it can be applied to the simulation of this intracellular signalling pathway. Specifically, we are working to explore and to understand the complex interaction patterns of the apoptotic caspases and the role of the mitochondria. As a first approximation, using the tuple spaces-based in silico approach, we model and simulate both the extrinsic and intrinsic apoptotic signalling pathways and the interactions between them. During apoptosis, mitochondrial proteins released from the mitochondria to the cytosol are decisively involved in the process. If the decision is to die, there is normally no return from this point; cancer cells, however, offer resistance to the mitochondrial induction.
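The tuple-space coordination model referred to above can be sketched minimally: molecules become tuples deposited with an `out` operation and consumed by template matching, where `None` acts as a wildcard. The molecule names below are illustrative labels, and this toy is not the authors' simulator:

```python
class TupleSpace:
    """Minimal tuple space: 'out' deposits a tuple; 'take' removes and
    returns the first tuple matching a template (None = wildcard)."""
    def __init__(self):
        self.tuples = []

    def out(self, tup):
        self.tuples.append(tup)

    def take(self, template):
        for i, tup in enumerate(self.tuples):
            if len(tup) == len(template) and all(
                    p is None or p == v for p, v in zip(template, tup)):
                return self.tuples.pop(i)
        return None

space = TupleSpace()
space.out(("cytochrome_c", "cytosol"))   # released mitochondrial protein
space.out(("caspase9", "inactive"))
released = space.take((None, "cytosol")) # any species located in cytosol
```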
NASA Technical Reports Server (NTRS)
Neuhaus, Jason R.
2018-01-01
This document describes the heads-up display (HUD) used in a piloted lifting-body entry, approach and landing simulation developed for the simulator facilities of the Simulation Development and Analysis Branch (SDAB) at NASA Langley Research Center. The HUD symbology originated with the piloted simulation evaluations of the HL-20 lifting body concept conducted in 1989 at NASA Langley. The original symbology was roughly based on Shuttle HUD symbology, as interpreted by Langley researchers. This document focuses on the addition of the precision approach path indicator (PAPI) lights to the HUD overlay.
Lai, Ying-Hui; Chen, Fei; Wang, Syu-Siang; Lu, Xugang; Tsao, Yu; Lee, Chin-Hui
2017-07-01
In a cochlear implant (CI) speech processor, noise reduction (NR) is a critical component for enabling CI users to attain improved speech perception under noisy conditions. Identifying an effective NR approach has long been a key topic in CI research. Recently, a deep denoising autoencoder (DDAE) based NR approach was proposed and shown to be effective in restoring clean speech from noisy observations. It was also shown that DDAE could provide better performance than several existing NR methods in standardized objective evaluations. Following this success with normal speech, this paper further investigated the performance of DDAE-based NR to improve the intelligibility of envelope-based vocoded speech, which simulates speech signal processing in existing CI devices. We compared the speech intelligibility performance of DDAE-based NR and conventional single-microphone NR approaches using the noise vocoder simulation. The results of both objective evaluations and listening tests showed that, under conditions of nonstationary noise distortion, DDAE-based NR yielded higher intelligibility scores than conventional NR approaches. This study confirmed that DDAE-based NR could potentially be integrated into a CI processor to provide more benefits to CI users under noisy conditions.
A Novel Approach to Visualizing Dark Matter Simulations.
Kaehler, R; Hahn, O; Abel, T
2012-12-01
In the last decades cosmological N-body dark matter simulations have enabled ab initio studies of the formation of structure in the Universe. Gravity amplified small density fluctuations generated shortly after the Big Bang, leading to the formation of galaxies in the cosmic web. These calculations have led to a growing demand for methods to analyze time-dependent particle-based simulations. Rendering methods for such N-body simulation data usually employ some kind of splatting approach via point-based rendering primitives and approximate the spatial distributions of physical quantities using kernel interpolation techniques common in SPH (Smoothed Particle Hydrodynamics) codes. This paper proposes three GPU-assisted rendering approaches, based on a new, more accurate method to compute the physical densities of dark matter simulation data. It uses full phase-space information to generate a tetrahedral tessellation of the computational domain, with mesh vertices defined by the simulation's dark matter particle positions. Over time the mesh is deformed by gravitational forces, causing the tetrahedral cells to warp and overlap. The new methods are well suited to visualize the cosmic web. In particular they preserve caustics, regions of high density that emerge when several streams of dark matter particles share the same location in space, indicating the formation of structures like sheets, filaments and halos. We demonstrate the superior image quality of the new approaches in a comparison with three standard rendering techniques for N-body simulation data.
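The core density estimate described above assigns each tessellation cell the mass of its corner particles spread over the cell's current (possibly warped) volume. A minimal sketch of that idea, with an assumed unit particle mass and illustrative function names (not the paper's GPU implementation):

```python
def tet_volume(a, b, c, d):
    """Signed volume of the tetrahedron (a, b, c, d): one sixth of the
    determinant of the edge vectors emanating from vertex a."""
    (x1, y1, z1) = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    (x2, y2, z2) = (c[0] - a[0], c[1] - a[1], c[2] - a[2])
    (x3, y3, z3) = (d[0] - a[0], d[1] - a[1], d[2] - a[2])
    det = (x1 * (y2 * z3 - z2 * y3)
           - y1 * (x2 * z3 - z2 * x3)
           + z1 * (x2 * y3 - y2 * x3))
    return det / 6.0

def cell_density(a, b, c, d, mass_per_particle=1.0):
    """Density in one cell: the mass of its four corner particles
    divided by the cell's deformed volume. Caustics appear where the
    volume collapses toward zero and the density diverges."""
    return 4.0 * mass_per_particle / abs(tet_volume(a, b, c, d))
```

As gravitational flow compresses a cell, `tet_volume` shrinks and the estimated density rises without any smoothing kernel, which is what preserves the sharp caustic features.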
Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology
ERIC Educational Resources Information Center
Basawapatna, Ashok
2016-01-01
Simulation and modeling activities, a key point of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high level pattern-based phenomenological approach to bringing rapid simulation…
Applying a Web and Simulation-Based System for Adaptive Competence Assessment of Spinal Anaesthesia
NASA Astrophysics Data System (ADS)
Hockemeyer, Cord; Nussbaumer, Alexander; Lövquist, Erik; Aboulafia, Annette; Breen, Dorothy; Shorten, George; Albert, Dietrich
The authors present an approach for implementing a system for the assessment of medical competences using a haptic simulation device. Based on Competence based Knowledge Space Theory (CbKST), information on the learners’ competences is gathered from different sources (test questions, data from the simulator, and supervising experts’ assessments).
Application of Adaptive Autopilot Designs for an Unmanned Aerial Vehicle
NASA Technical Reports Server (NTRS)
Shin, Yoonghyun; Calise, Anthony J.; Motter, Mark A.
2005-01-01
This paper summarizes the application of two adaptive approaches to autopilot design, and presents an evaluation and comparison of the two approaches in simulation for an unmanned aerial vehicle. One approach employs two-stage dynamic inversion and the other employs feedback dynamic inversion based on a command augmentation system. Both are augmented with neural-network-based adaptive elements. The approaches permit adaptation to both parametric uncertainty and unmodeled dynamics, and incorporate a method that permits adaptation during periods of control saturation. Simulation results for an FQM-117B radio-controlled miniature aerial vehicle are presented to illustrate the performance of the neural-network-based adaptation.
A Comparison of Three Approaches to Model Human Behavior
NASA Astrophysics Data System (ADS)
Palmius, Joel; Persson-Slumpi, Thomas
2010-11-01
One way of studying social processes is through the use of simulations. The use of simulations for this purpose has been established as its own field, social simulations, and has been used for studying a variety of phenomena. A simulation of a social setting can serve as an aid for thinking about that social setting, and for experimenting with different parameters and studying the outcomes caused by them. When using the simulation as an aid for thinking and experimenting, the chosen simulation approach will implicitly steer the simulationist towards thinking in a certain fashion in order to fit the model. To study the implications of model choice on the understanding of a setting where human anticipation comes into play, a simulation scenario of a coffee room was constructed using three different simulation approaches: Cellular Automata, Systems Dynamics and Agent-based modeling. The practical implementations of the models were done in three different simulation packages: Stella for Systems Dynamics, CaFun for Cellular Automata and SesAM for Agent-based modeling. The models were evaluated both using Randers' criteria for model evaluation, and through introspection where the authors reflected upon how their understanding of the scenario was steered through the model choice. Further, the software used for implementing the simulation models was evaluated, and practical considerations for the choice of software package are listed. It is concluded that the models have very different strengths. The Agent-based modeling approach offers the most intuitive support for thinking about and modeling a social setting where the behavior of the individual is in focus. The Systems Dynamics model would be preferable in situations where populations and large groups would be studied as wholes, but where individual behavior is of less concern. The Cellular Automata models would be preferable where processes need to be studied from the basis of a small set of very simple rules.
It is further concluded that in most social simulation settings the Agent-based modeling approach would be the probable choice, since the other approaches do not offer much in the way of support for modeling the anticipatory behavior of humans acting in an organization.
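To make the contrast concrete, the Cellular Automata style of "a small set of very simple rules" can be sketched in a few lines; this one-dimensional majority-rule automaton is an illustration only, not one of the coffee-room models from the study:

```python
def step(cells):
    """One synchronous update of a 1-D, two-state cellular automaton
    with wrap-around neighbourhoods: each cell takes the majority
    state of itself and its two nearest neighbours."""
    n = len(cells)
    return [1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
            for i in range(n)]
```

Even this tiny rule produces collective behaviour (isolated deviant states are absorbed by their neighbourhood), which is the level of description CA models bring to a social process.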
LEGEND, a LEO-to-GEO Environment Debris Model
NASA Technical Reports Server (NTRS)
Liou, Jer Chyi; Hall, Doyle T.
2013-01-01
LEGEND (LEO-to-GEO Environment Debris model) is a three-dimensional orbital debris evolutionary model that is capable of simulating the historical and future debris populations in the near-Earth environment. The historical component in LEGEND adopts a deterministic approach to mimic the known historical populations. Launched rocket bodies, spacecraft, and mission-related debris (rings, bolts, etc.) are added to the simulated environment. Known historical breakup events are reproduced, and fragments down to 1 mm in size are created. The LEGEND future projection component adopts a Monte Carlo approach and uses an innovative pair-wise collision probability evaluation algorithm to simulate the future breakups and the growth of the debris populations. This algorithm is based on a new "random sampling in time" approach that preserves characteristics of the traditional approach and captures the rapidly changing nature of the orbital debris environment. LEGEND is a Fortran 90-based numerical simulation program. It operates in a UNIX/Linux environment.
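A pair-wise collision evaluation inside a Monte Carlo projection step can be sketched generically: given an estimated collision rate for each object pair, a breakup event is sampled for the current time step. The rate model and names below are illustrative assumptions, not LEGEND's actual algorithm:

```python
import math
import random

def sample_collisions(pair_rates, dt, rng=None):
    """Toy Monte Carlo step in the spirit of pair-wise collision
    probability evaluation: for each object pair with an estimated
    collision rate r (expected collisions per unit time), a breakup
    occurs during the step of length dt with probability
    1 - exp(-r * dt). Returns the pairs that collided this step."""
    rng = rng or random.Random()
    events = []
    for pair, rate in pair_rates.items():
        if rng.random() < 1.0 - math.exp(-rate * dt):
            events.append(pair)
    return events
```

Repeating such a step over the projection horizon, with rates re-evaluated as the population evolves, is what lets a Monte Carlo model capture a rapidly changing debris environment.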
A qualitative and quantitative assessment for a bone marrow harvest simulator.
Machado, Liliane S; Moraes, Ronei M
2009-01-01
Several approaches to performing assessment in training simulators based on virtual reality have been proposed. There are two kinds of assessment methods: offline and online. The main requirements for online training assessment methodologies applied to virtual reality systems are low computational complexity and high accuracy. Several approaches for general cases that satisfy such requirements can be found in the literature. A drawback of those approaches is that they provide unsatisfactory solutions for specific cases, as in some medical procedures where both quantitative and qualitative information are available to perform the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes classifier, which can manipulate qualitative and quantitative variables simultaneously. A specific medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and evidenced the applicability of the method.
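A naive Bayes score over mixed evidence, in the spirit of the assessment method, can be sketched as follows; the class structure, the Gaussian likelihood for the quantitative variable, and all parameter values are invented for illustration and are not the paper's Modified Naive Bayes:

```python
import math

def log_gauss(x, mu, sigma):
    """Log-density of a Gaussian, used for the quantitative variable."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def classify(sample, classes):
    """Pick the class with the highest naive-Bayes log-posterior.

    `sample` is (quantitative value, qualitative category). `classes`
    maps a label to a dict with 'prior', Gaussian parameters 'mu' and
    'sigma', and 'cat' giving per-category probabilities (all names
    are illustrative)."""
    x, category = sample
    best, best_lp = None, float("-inf")
    for label, p in classes.items():
        lp = (math.log(p["prior"]) + log_gauss(x, p["mu"], p["sigma"])
              + math.log(p["cat"][category]))
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

Because the score is a sum of per-variable log-likelihoods, each new observation updates the assessment in constant time, which is what makes the method suitable for online use.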
NASA Astrophysics Data System (ADS)
Baumgart, M.; Druml, N.; Consani, M.
2018-05-01
This paper presents a simulation approach for Time-of-Flight cameras to estimate sensor performance and accuracy, as well as to help in understanding experimentally discovered effects. The main scope is the detailed simulation of the optical signals. We use a raytracing-based approach with the optical path length as the master parameter for depth calculations. The procedure is described in detail with references to our implementation in Zemax OpticStudio and Python. Our simulation approach supports multiple and extended light sources and allows accounting for all effects within the geometrical optics model. In particular, multi-object reflection/scattering ray-paths, translucent objects, and aberration effects (e.g. distortion caused by the ToF lens) are supported. The optical path length approach also enables the implementation of different ToF sensor types and transient imaging evaluations. The main features are demonstrated on a simple 3D test scene.
NASA Astrophysics Data System (ADS)
Chiadamrong, N.; Piyathanavong, V.
2017-12-01
Models that aim to optimize the design of supply chain networks have gained more interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such an optimization problem. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach iterates between the two models until the difference between subsequent solutions satisfies pre-determined termination criteria. The effectiveness of the proposed approach is illustrated by an example, in which it yields results closer to optimal, with a much faster solving time, than the conventional simulation-based optimization model. The efficacy of this hybrid approach is promising, and it can be applied as a powerful tool in designing a real supply chain network. It also provides the possibility of modeling and solving more realistic problems, which incorporate dynamism and uncertainty.
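The iterative hybrid procedure can be sketched as a generic fixed-point loop in which the analytical model proposes a design and the simulation re-estimates the model's input parameter; the function arguments and the scalar termination test are illustrative simplifications, not the paper's formulation:

```python
def hybrid_optimize(analytic_solve, simulate, x0, tol=1e-6, max_iter=100):
    """Skeleton of an iterative analytic/simulation hybrid scheme.

    analytic_solve(x) -> design: the (fast) optimization model, fed
        the current parameter estimate x.
    simulate(design) -> x_new: the (detailed) simulation, which
        re-estimates the parameter under that design.
    The loop stops once successive estimates agree within `tol`."""
    x = x0
    design = analytic_solve(x)
    for _ in range(max_iter):
        x_new = simulate(design)
        if abs(x_new - x) <= tol:
            return design, x_new
        x = x_new
        design = analytic_solve(x)
    return design, x
```

The appeal of the scheme is that the expensive simulator is called once per iteration rather than once per candidate design, which is where the speedup over pure simulation-based optimization comes from.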
A simulation-based approach for estimating premining water quality: Red Mountain Creek, Colorado
Runkel, Robert L.; Kimball, Briant A; Walton-Day, Katherine; Verplanck, Philip L.
2007-01-01
Regulatory agencies are often charged with the task of setting site-specific numeric water quality standards for impaired streams. This task is particularly difficult for streams draining highly mineralized watersheds with past mining activity. Baseline water quality data obtained prior to mining are often non-existent and application of generic water quality standards developed for unmineralized watersheds is suspect given the geology of most watersheds affected by mining. Various approaches have been used to estimate premining conditions, but none of the existing approaches rigorously consider the physical and geochemical processes that ultimately determine instream water quality. An approach based on simulation modeling is therefore proposed herein. The approach utilizes synoptic data that provide spatially-detailed profiles of concentration, streamflow, and constituent load along the study reach. This field data set is used to calibrate a reactive stream transport model that considers the suite of physical and geochemical processes that affect constituent concentrations during instream transport. A key input to the model is the quality and quantity of waters entering the study reach. This input is based on chemical analyses available from synoptic sampling and observed increases in streamflow along the study reach. Given the calibrated model, additional simulations are conducted to estimate premining conditions. In these simulations, the chemistry of mining-affected sources is replaced with the chemistry of waters that are thought to be unaffected by mining (proximal, premining analogues). The resultant simulations provide estimates of premining water quality that reflect both the reduced loads that were present prior to mining and the processes that affect these loads as they are transported downstream. This simulation-based approach is demonstrated using data from Red Mountain Creek, Colorado, a small stream draining a heavily-mined watershed. 
Model application to the premining problem for Red Mountain Creek is based on limited field reconnaissance and chemical analyses; additional field work and analyses may be needed to develop definitive, quantitative estimates of premining water quality.
Pressure gradients fail to predict diffusio-osmosis
NASA Astrophysics Data System (ADS)
Liu, Yawei; Ganti, Raman; Frenkel, Daan
2018-05-01
We present numerical simulations of diffusio-osmotic flow, i.e. the fluid flow generated by a concentration gradient along a solid-fluid interface. In our study, we compare a number of distinct approaches that have been proposed for computing such flows and compare them with a reference calculation based on direct, non-equilibrium molecular dynamics simulations. As alternatives, we consider schemes that compute diffusio-osmotic flow from the gradient of the chemical potentials of the constituent species and from the gradient of the component of the pressure tensor parallel to the interface. We find that the approach based on treating chemical potential gradients as external forces acting on various species agrees with the direct simulations, thereby supporting the approach of Marbach et al (2017 J. Chem. Phys. 146 194701). In contrast, an approach based on computing the gradients of the microscopic pressure tensor does not reproduce the direct non-equilibrium results.
Stochastic simulation of multiscale complex systems with PISKaS: A rule-based approach.
Perez-Acle, Tomas; Fuenzalida, Ignacio; Martin, Alberto J M; Santibañez, Rodrigo; Avaria, Rodrigo; Bernardin, Alejandro; Bustos, Alvaro M; Garrido, Daniel; Dushoff, Jonathan; Liu, James H
2018-03-29
Computational simulation is a widely employed methodology to study the dynamic behavior of complex systems. Although common approaches are based either on ordinary differential equations or stochastic differential equations, these techniques make several assumptions which, when it comes to biological processes, could often lead to unrealistic models. Among others, model approaches based on differential equations entangle kinetics and causality, failing when complexity increases, separating knowledge from models, and assuming that the average behavior of the population encompasses any individual deviation. To overcome these limitations, simulations based on the Stochastic Simulation Algorithm (SSA) appear as a suitable approach to model complex biological systems. In this work, we review three different models executed in PISKaS: a rule-based framework to produce multiscale stochastic simulations of complex systems. These models span multiple time and spatial scales ranging from gene regulation up to Game Theory. In the first example, we describe a model of the core regulatory network of gene expression in Escherichia coli highlighting the continuous model improvement capacities of PISKaS. The second example describes a hypothetical outbreak of the Ebola virus occurring in a compartmentalized environment resembling cities and highways. Finally, in the last example, we illustrate a stochastic model for the prisoner's dilemma; a common approach from social sciences describing complex interactions involving trust within human populations. As a whole, these models demonstrate the capabilities of PISKaS, providing fertile scenarios in which to explore the dynamics of complex systems.
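For readers unfamiliar with the SSA on which PISKaS builds, a minimal direct-method (Gillespie) sketch for a single species is shown below; this is the textbook algorithm, not PISKaS's rule-based, multiscale implementation:

```python
import random

def gillespie(rates_fn, state, t_end, seed=0):
    """Minimal Gillespie direct method.

    rates_fn(state) returns a list of (propensity, state_change)
    pairs. Each step draws an exponential waiting time from the total
    propensity, then picks one reaction channel proportionally to its
    propensity. Returns (time, state) at absorption or at t_end."""
    rng = random.Random(seed)
    t = 0.0
    while True:
        channels = rates_fn(state)
        total = sum(a for a, _ in channels)
        if total == 0:  # no reaction can fire: absorbed state
            return t, state
        t += rng.expovariate(total)
        if t >= t_end:
            return t_end, state
        r = rng.uniform(0, total)
        for a, change in channels:
            r -= a
            if r <= 0:
                state = state + change
                break
```

With a pure degradation channel (propensity proportional to the population), a trajectory decays to zero in finite simulated time, and repeated runs with different seeds give the individual-level variability that deterministic ODE models average away.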
Li, Hongzhi; Yang, Wei
2007-03-21
An approach is developed in the replica exchange framework to enhance conformational sampling for quantum mechanical (QM) potential-based molecular dynamics simulations. Importantly, with our enhanced sampling treatment, decent convergence of the electronic structure self-consistent-field calculation is robustly guaranteed, which is made possible in our replica exchange design by avoiding direct structure exchanges between the QM-related replicas and the activated (scaled by low scaling parameters or treated with high "effective temperatures") molecular mechanical (MM) replicas. Although the present approach represents one of the early efforts in enhanced sampling developments specifically for quantum mechanical potentials, the QM-based simulations treated with the present technique can possess sampling efficiency similar to that of MM-based simulations treated with the Hamiltonian replica exchange method (HREM). In the present paper, by combining this sampling method with one of our recent developments (the dual-topology alchemical HREM approach), we also introduce a method for sampling-enhanced QM-based free energy calculations.
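The replica exchange machinery referred to above rests on a Metropolis swap criterion between neighbouring replicas; the standard temperature-exchange form is sketched below (the paper's QM/MM-specific design differs in which replicas are allowed to attempt swaps):

```python
import math
import random

def attempt_swap(beta_i, beta_j, e_i, e_j, rng=random):
    """Standard replica-exchange Metropolis criterion: accept the swap
    of configurations between replicas i and j (inverse temperatures
    beta_i, beta_j; potential energies e_i, e_j) with probability
    min(1, exp[(beta_i - beta_j) * (e_i - e_j)])."""
    delta = (beta_i - beta_j) * (e_i - e_j)
    return delta >= 0 or rng.random() < math.exp(delta)
```

Detailed balance is preserved because the acceptance probability depends only on the energy difference weighted by the inverse-temperature difference, so the ladder of replicas lets low-temperature states escape local minima via their high-temperature neighbours.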
1988-04-13
Simulation: An Artificial Intelligence Approach to System Modeling and Automating the Simulation Life Cycle. Mark S. Fox, Nizwer Husain, Malcolm McRoberts and Y. V. Reddy. CMU-RI-TR-88-5, Intelligent Systems Laboratory, The Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania. …years of research in the application of Artificial Intelligence to Simulation. Our focus has been in two areas: the use of AI knowledge representation
Multiagent intelligent systems
NASA Astrophysics Data System (ADS)
Krause, Lee S.; Dean, Christopher; Lehman, Lynn A.
2003-09-01
This paper will discuss a simulation approach based upon a family of agent-based models. As the demands placed upon simulation technology by applications such as Effects Based Operations (EBO), evaluations of indicators and warnings surrounding homeland defense, and commercial needs such as financial risk management grow, current single-thread simulations will continue to show serious deficiencies. The types of "what if" analysis required to support these applications demand rapidly re-configurable approaches capable of aggregating large models incorporating multiple viewpoints. The use of agent technology promises to provide a broad spectrum of models incorporating differing viewpoints through a synthesis of a collection of models. Each model would provide estimates to the overall scenario based upon its particular measure or aspect. An agent framework, denoted as the "family", would provide a common ontology in support of differing aspects of the scenario. This approach permits modeling to move from viewing the problem as a single-thread simulation to taking into account multiple viewpoints from different models. Even as models are updated or replaced, the agent approach permits rapid inclusion in new or modified simulations. In this approach, the synthesis of a variety of low- and high-resolution information requires a family of models. Each agent "publishes" its support for a given measure, and each model provides its own estimates on the scenario based upon its particular measure or aspect. If more than one agent provides the same measure (e.g. cognitive), then the results from these agents are combined to form an aggregate measure response. The objective would be to inform and help calibrate a qualitative model, rather than merely to present highly aggregated statistical information. As each result is processed, the next action can then be determined.
This is done by a top-level decision system that communicates with the family at the ontology level without any specific understanding of the processes (or model) behind each agent. The increasingly complex demands upon simulation to incorporate the breadth and depth of influencing factors make a family of agent-based models a promising solution. This paper will discuss that solution, with the syntax and semantics necessary to support the approach.
Web-based Interactive Simulator for Rotating Machinery.
ERIC Educational Resources Information Center
Sirohi, Vijayalaxmi
1999-01-01
Baroma (Balance of Rotating Machinery), the Web-based educational engineering interactive software for teaching/learning combines didactical and software ergonomical approaches. The software in tutorial form simulates a problem using Visual Interactive Simulation in graphic display, and animation is brought about through graphical user interface…
Simulator evaluation of manually flown curved instrument approaches. M.S. Thesis
NASA Technical Reports Server (NTRS)
Sager, D.
1973-01-01
Pilot performance in flying horizontally curved instrument approaches was analyzed by having nine test subjects fly curved approaches in a fixed-base simulator. Approaches were flown without an autopilot and without a flight director. Evaluations were based on deviation measurements made at a number of points along the curved approach path and on subject questionnaires. Results indicate that pilots can fly curved approaches, though less accurately than straight-in approaches; that a moderate wind does not affect curve flying performance; and that there is no performance difference between 60 deg. and 90 deg. turns. A tradeoff of curve path parameters and a paper analysis of wind compensation were also made.
Post Pareto optimization - A case
NASA Astrophysics Data System (ADS)
Popov, Stoyan; Baeva, Silvia; Marinova, Daniela
2017-12-01
Simulation performance may be evaluated according to multiple quality measures that are in competition and their simultaneous consideration poses a conflict. In the current study we propose a practical framework for investigating such simulation performance criteria, exploring the inherent conflicts amongst them and identifying the best available tradeoffs, based upon multi-objective Pareto optimization. This approach necessitates the rigorous derivation of performance criteria to serve as objective functions and undergo vector optimization. We demonstrate the effectiveness of our proposed approach by applying it with multiple stochastic quality measures. We formulate performance criteria of this use-case, pose an optimization problem, and solve it by means of a simulation-based Pareto approach. Upon attainment of the underlying Pareto Frontier, we analyze it and prescribe preference-dependent configurations for the optimal simulation training.
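Attainment of the Pareto Frontier amounts to filtering out dominated points. A minimal sketch for minimization in every coordinate (illustrative, not the study's solver):

```python
def pareto_front(points):
    """Return the non-dominated points, assuming every objective is
    minimized: a point is kept if no other point is at least as good
    in every coordinate and strictly better in at least one."""
    front = []
    for p in points:
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p)) and q != p
            for q in points)
        if not dominated:
            front.append(p)
    return front
```

Every point on the resulting frontier represents a best available tradeoff: improving one quality measure from there necessarily worsens another, which is exactly the set of candidate configurations the post-Pareto analysis then ranks by preference.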
1D-3D hybrid modeling - from multi-compartment models to full resolution models in space and time.
Grein, Stephan; Stepniewski, Martin; Reiter, Sebastian; Knodel, Markus M; Queisser, Gillian
2014-01-01
Investigation of cellular and network dynamics in the brain by means of modeling and simulation has evolved into a highly interdisciplinary field that uses sophisticated modeling and simulation approaches to understand distinct areas of brain function. Depending on the underlying complexity, these models vary in their level of detail, in order to cope with the attached computational cost. Hence, for large network simulations, single neurons are typically reduced to time-dependent signal processors, dismissing the spatial aspect of each cell. For single cells or networks with relatively small numbers of neurons, general purpose simulators allow for space- and time-dependent simulations of electrical signal processing, based on cable equation theory. An emerging field in Computational Neuroscience encompasses a new level of detail by incorporating the full three-dimensional morphology of cells and organelles into three-dimensional, space- and time-dependent simulations. While every approach has its advantages and limitations, such as computational cost, integrated and methods-spanning simulation approaches could, depending on the network size, establish new ways to investigate the brain. In this paper we present a hybrid simulation approach that makes use of reduced 1D models (using, e.g., the NEURON simulator) which couple to fully resolved models for simulating cellular and sub-cellular dynamics, including the detailed three-dimensional morphology of neurons and organelles. In order to couple 1D- and 3D-simulations, we present a geometry-, membrane potential- and intracellular concentration mapping framework, with which graph-based morphologies, e.g., in the swc- or hoc-format, are mapped to full surface and volume representations of the neuron, and computational data from 1D-simulations can be used as boundary conditions for full 3D simulations and vice versa.
Thus, established models and data, based on general purpose 1D-simulators, can be directly coupled to the emerging field of fully resolved, highly detailed 3D-modeling approaches. We present the developed general framework for 1D/3D hybrid modeling and apply it to investigate electrically active neurons and their intracellular spatio-temporal calcium dynamics.
ERIC Educational Resources Information Center
Gale, Jessica; Wind, Stefanie; Koval, Jayma; Dagosta, Joseph; Ryan, Mike; Usselman, Marion
2016-01-01
This paper illustrates the use of simulation-based performance assessment (PA) methodology in a recent study of eighth-grade students' understanding of physical science concepts. A set of four simulation-based PA tasks were iteratively developed to assess student understanding of an array of physical science concepts, including net force,…
Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L.
2014-01-01
The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems, which uses high-level abstractions to capture functional requirements in an executable manner, and which automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging very promising application area for embedded systems. However, there is a lack of tools in this area, which would allow an application developer to model a WSN application by using high level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083
Simulating bimodal tall fescue growth with a degree-day-based process-oriented plant model
USDA-ARS?s Scientific Manuscript database
Plant growth simulation models have a temperature response function driving development, with a base temperature and an optimum temperature defined. Such growth simulation models often function well when plant development rate shows a continuous change throughout the growing season. This approach ...
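A minimal sketch of the kind of temperature response described above (development accumulates above a base temperature and saturates at an optimum); the function and parameter names are illustrative, not the ARS model's:

```python
def degree_days(daily_mean_temps, t_base, t_opt):
    """Accumulate growing degree-days over a season: each day
    contributes its mean temperature above t_base, with the effective
    temperature capped at t_opt (a common simple response function)."""
    total = 0.0
    for t in daily_mean_temps:
        total += max(0.0, min(t, t_opt) - t_base)
    return total
```

Driving development by a single accumulating quantity like this is what makes such models work well when the development rate changes continuously through the season, and is precisely what a bimodal growth pattern strains.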
ERIC Educational Resources Information Center
Lee, James R.
1989-01-01
Discussion of the use of simulations to teach international relations (IR) highlights the Chinese House Game, a computer-based decision-making game based on Inter Nation Simulation (INS). Topics discussed include the increasing role of artificial intelligence in IR simulations, multi-disciplinary approaches, and the direction of IR as a…
Use case driven approach to develop simulation model for PCS of APR1400 simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong Wook, Kim; Hong Soo, Kim; Hyeon Tae, Kang
2006-07-01
The full-scope simulator is being developed to evaluate specific design features and to support the iterative design and validation in the Man-Machine Interface System (MMIS) design of Advanced Power Reactor (APR) 1400. The simulator consists of a process model, control logic model, and MMI for the APR1400 as well as the Power Control System (PCS). In this paper, a use case driven approach is proposed to develop a simulation model for the PCS. In this approach, a system is considered from the point of view of its users. The user's view of the system is based on interactions with the system and the resultant responses. In the use case driven approach, we initially consider the system as a black box and look at its interactions with the users. From these interactions, use cases of the system are identified. Then the system is modeled using these use cases as functions. Lower levels expand the functionalities of each of these use cases. Hence, starting from the topmost level view of the system, we proceeded down to the lowest level (the internal view of the system). The model of the system thus developed is use case driven. This paper will introduce the functionality of the PCS simulation model, including a requirement analysis based on use cases and the validation results of the PCS model development. The PCS simulation model using use cases will first be used during full-scope simulator development for a nuclear power plant and will be supplied to the Shin-Kori 3 and 4 plants. Use case based simulation model development can be useful for the design and implementation of simulation models. (authors)
Ohira, Yoshiyuki; Uehara, Takanori; Noda, Kazutaka; Suzuki, Shingo; Shikino, Kiyoshi; Kajiwara, Hideki; Kondo, Takeshi; Hirota, Yusuke; Ikusaka, Masatomi
2017-01-01
Objectives We examined whether problem-based learning tutorials using patient-simulated videos showing daily life are more practical for clinical learning, compared with traditional paper-based problem-based learning, for the consideration rate of psychosocial issues and the recall rate for experienced learning. Methods Twenty-two groups with 120 fifth-year students were each assigned paper-based problem-based learning and video-based problem-based learning using patient-simulated videos. We compared target achievement rates in questionnaires using the Wilcoxon signed-rank test and discussion contents diversity using the Mann-Whitney U test. A follow-up survey used a chi-square test to measure students’ recall of cases in three categories: video, paper, and non-experienced. Results Video-based problem-based learning displayed significantly higher achievement rates for imagining authentic patients (p=0.001), incorporating a comprehensive approach including psychosocial aspects (p<0.001), and satisfaction with sessions (p=0.001). No significant differences existed in the discussion contents diversity regarding the International Classification of Primary Care Second Edition codes and chapter types or in the rate of psychological codes. In a follow-up survey comparing video and paper groups to non-experienced groups, the rates were higher for video (χ2=24.319, p<0.001) and paper (χ2=11.134, p=0.001). Although the video rate tended to be higher than the paper rate, no significant difference was found between the two. Conclusions Patient-simulated videos showing daily life facilitate imagining true patients and support a comprehensive approach that fosters better memory. The clinical patient-simulated video method is more practical and clinical problem-based tutorials can be implemented if we create patient-simulated videos for each symptom as teaching materials. PMID:28245193
Ikegami, Akiko; Ohira, Yoshiyuki; Uehara, Takanori; Noda, Kazutaka; Suzuki, Shingo; Shikino, Kiyoshi; Kajiwara, Hideki; Kondo, Takeshi; Hirota, Yusuke; Ikusaka, Masatomi
2017-02-27
We examined whether problem-based learning tutorials using patient-simulated videos showing daily life are more practical for clinical learning, compared with traditional paper-based problem-based learning, for the consideration rate of psychosocial issues and the recall rate for experienced learning. Twenty-two groups with 120 fifth-year students were each assigned paper-based problem-based learning and video-based problem-based learning using patient-simulated videos. We compared target achievement rates in questionnaires using the Wilcoxon signed-rank test and discussion contents diversity using the Mann-Whitney U test. A follow-up survey used a chi-square test to measure students' recall of cases in three categories: video, paper, and non-experienced. Video-based problem-based learning displayed significantly higher achievement rates for imagining authentic patients (p=0.001), incorporating a comprehensive approach including psychosocial aspects (p<0.001), and satisfaction with sessions (p=0.001). No significant differences existed in the discussion contents diversity regarding the International Classification of Primary Care Second Edition codes and chapter types or in the rate of psychological codes. In a follow-up survey comparing video and paper groups to non-experienced groups, the rates were higher for video (χ2=24.319, p<0.001) and paper (χ2=11.134, p=0.001). Although the video rate tended to be higher than the paper rate, no significant difference was found between the two. Patient-simulated videos showing daily life facilitate imagining authentic patients and support a comprehensive approach that fosters better recall. The patient-simulated video method is more practical, and clinical problem-based tutorials can be implemented if patient-simulated videos are created for each symptom as teaching materials.
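The recall comparison above is a standard chi-square test on a 2x2 contingency table. A minimal sketch, assuming hypothetical recall counts rather than the study's data; for one degree of freedom the p-value follows from the normal tail, since the square of a standard normal variate is chi-square distributed:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction) for a 2x2 table."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def p_value_1df(x):
    """Survival function of the chi-square distribution with 1 df."""
    return math.erfc(math.sqrt(x / 2.0))

# rows: recalled / not recalled; columns: video-experienced / non-experienced
# (counts are hypothetical, chosen only to illustrate the test)
stat = chi2_2x2(52, 8, 30, 30)
p = p_value_1df(stat)
print(f"chi2={stat:.3f}, p={p:.3g}")
```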
A Component-Based FPGA Design Framework for Neuronal Ion Channel Dynamics Simulations
Mak, Terrence S. T.; Rachmuth, Guy; Lam, Kai-Pui; Poon, Chi-Sang
2008-01-01
Neuron-machine interfaces such as dynamic clamp and brain-implantable neuroprosthetic devices require real-time simulations of neuronal ion channel dynamics. Field Programmable Gate Array (FPGA) has emerged as a high-speed digital platform ideal for such application-specific computations. We propose an efficient and flexible component-based FPGA design framework for neuronal ion channel dynamics simulations, which overcomes certain limitations of the recently proposed memory-based approach. A parallel processing strategy is used to minimize computational delay, and a hardware-efficient factoring approach for calculating exponential and division functions in neuronal ion channel models is used to conserve resource consumption. Performances of the various FPGA design approaches are compared theoretically and experimentally in corresponding implementations of the AMPA and NMDA synaptic ion channel models. Our results suggest that the component-based design framework provides a more memory economic solution as well as more efficient logic utilization for large word lengths, whereas the memory-based approach may be suitable for time-critical applications where a higher throughput rate is desired. PMID:17190033
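The hardware-efficient factoring idea can be illustrated in software. A minimal sketch, assuming a 12-bit fixed-point argument split into 4-bit table fields (the authors' actual word lengths and number formats are not given here): exp(-x) is evaluated as a product of small lookups, one per bit field, using exp(a+b+c) = exp(a)exp(b)exp(c), which trades one large ROM for a few small tables and multipliers.

```python
import math

FIELD_BITS = 4
FIELDS = 3
SCALE = 2 ** 10                      # 10 fractional bits, so x covers [0, 4)

# One small table per bit field: exp(-(field value) * (field weight)).
TABLES = []
for f in range(FIELDS):
    weight = 2 ** (FIELD_BITS * f) / SCALE
    TABLES.append([math.exp(-v * weight) for v in range(2 ** FIELD_BITS)])

def exp_neg(x):
    """Approximate exp(-x) by multiplying one table entry per bit field."""
    xf = int(round(x * SCALE))       # quantize the argument
    acc = 1.0
    for f in range(FIELDS):
        field = (xf >> (FIELD_BITS * f)) & (2 ** FIELD_BITS - 1)
        acc *= TABLES[f][field]
    return acc
```

Because the bit fields sum exactly to the quantized argument, the only error is the fixed-point quantization of x.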
1D-3D hybrid modeling—from multi-compartment models to full resolution models in space and time
Grein, Stephan; Stepniewski, Martin; Reiter, Sebastian; Knodel, Markus M.; Queisser, Gillian
2014-01-01
Investigation of cellular and network dynamics in the brain by means of modeling and simulation has evolved into a highly interdisciplinary field that uses sophisticated modeling and simulation approaches to understand distinct areas of brain function. Depending on the underlying complexity, these models vary in their level of detail in order to cope with the attached computational cost. Hence for large network simulations, single neurons are typically reduced to time-dependent signal processors, dismissing the spatial aspect of each cell. For single cells or networks with relatively small numbers of neurons, general purpose simulators allow for space- and time-dependent simulations of electrical signal processing, based on cable equation theory. An emerging field in Computational Neuroscience encompasses a new level of detail by incorporating the full three-dimensional morphology of cells and organelles into three-dimensional, space- and time-dependent simulations. While every approach has its advantages and limitations, such as computational cost, integrated, methods-spanning simulation approaches could, depending on the network size, establish new ways to investigate the brain. In this paper we present a hybrid simulation approach that makes use of reduced 1D models (using, e.g., the NEURON simulator) coupled to fully resolved models for simulating cellular and sub-cellular dynamics, including the detailed three-dimensional morphology of neurons and organelles. In order to couple 1D- and 3D-simulations, we present a geometry-, membrane potential- and intracellular concentration mapping framework, with which graph-based morphologies, e.g., in the swc- or hoc-format, are mapped to full surface and volume representations of the neuron, and computational data from 1D-simulations can be used as boundary conditions for full 3D simulations and vice versa.
Thus, established models and data, based on general purpose 1D-simulators, can be directly coupled to the emerging field of fully resolved, highly detailed 3D-modeling approaches. We present the developed general framework for 1D/3D hybrid modeling and apply it to investigate electrically active neurons and their intracellular spatio-temporal calcium dynamics. PMID:25120463
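The 1D-to-3D mapping can be sketched in its simplest form, assuming hypothetical compartment coordinates and a nearest-compartment rule (the published framework is considerably more elaborate): each surface vertex of the 3D mesh inherits the membrane potential of the closest 1D cable compartment, providing a boundary condition for the 3D simulation.

```python
import math

# 1D compartments: (x, y, z) centre and simulated membrane potential in mV.
# Coordinates and potentials are hypothetical illustration data.
compartments = [((0.0, 0.0, 0.0), -65.0),
                ((10.0, 0.0, 0.0), -60.0),
                ((20.0, 0.0, 0.0), -40.0)]

def map_potential(vertex):
    """Nearest-compartment lookup for a 3D surface vertex."""
    centre, v = min(compartments, key=lambda cv: math.dist(vertex, cv[0]))
    return v

surface_vertices = [(1.0, 2.0, 0.0), (18.5, -1.0, 0.5)]
boundary = [map_potential(p) for p in surface_vertices]
print(boundary)
```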
Residents' perceptions of simulation as a clinical learning approach.
Walsh, Catharine M; Garg, Ankit; Ng, Stella L; Goyal, Fenny; Grover, Samir C
2017-02-01
Simulation is increasingly being integrated into medical education; however, there is little research into trainees' perceptions of this learning modality. We elicited trainees' perceptions of simulation-based learning, to inform how simulation is developed and applied to support training. We conducted an instrumental qualitative case study entailing 36 semi-structured one-hour interviews with 12 residents enrolled in an introductory simulation-based course. Trainees were interviewed at three time points: pre-course, post-course, and 4-6 weeks later. Interview transcripts were analyzed using a qualitative descriptive analytic approach. Residents' perceptions of simulation included: 1) simulation serves pragmatic purposes; 2) simulation provides a safe space; 3) simulation presents perils and pitfalls; and 4) optimal design for simulation: integration and tension. Key findings included residents' markedly narrow perception of simulation's capacity to support non-technical skills development or its use beyond introductory learning. Trainees' learning expectations of simulation were restricted. Educators should critically attend to the way they present simulation to learners as, based on theories of problem-framing, trainees' a priori perceptions may delimit the focus of their learning experiences. If they view simulation as merely a replica of real cases for the purpose of practicing basic skills, they may fail to benefit from the full scope of learning opportunities afforded by simulation.
An approach for modelling snowcover ablation and snowmelt runoff in cold region environments
NASA Astrophysics Data System (ADS)
Dornes, Pablo Fernando
Reliable hydrological model simulations are the result of numerous complex interactions among hydrological inputs, landscape properties, and initial conditions. Determination of the effects of these factors is one of the main challenges in hydrological modelling. This situation becomes even more difficult in cold regions due to the ungauged nature of subarctic and arctic environments. This research work is an attempt to apply a new approach for modelling snowcover ablation and snowmelt runoff in complex subarctic environments with limited data while retaining integrity in the process representations. The modelling strategy is based on the incorporation of both detailed process understanding and inputs along with information gained from observations of basin-wide streamflow phenomena; essentially a combination of deductive and inductive approaches. The study was conducted in the Wolf Creek Research Basin, Yukon Territory, using three models: a small-scale physically based hydrological model, a land surface scheme, and a land surface hydrological model. The spatial representation was based on previous research studies and observations, and was accomplished by incorporating landscape units, defined according to topography and vegetation, as the spatial model elements. Comparisons between distributed and aggregated modelling approaches showed that simulations incorporating distributed initial snowcover and corrected solar radiation were able to properly simulate snowcover ablation and snowmelt runoff, whereas the aggregated modelling approaches were unable to represent the differential snowmelt rates and complex snowmelt runoff dynamics. Similarly, the inclusion of spatially distributed information in a land surface scheme clearly improved simulations of snowcover ablation.
Application of the same modelling approach at a larger scale using the same landscape-based parameterisation showed satisfactory results in simulating snowcover ablation and snowmelt runoff with minimal calibration. Verification of this approach in an arctic basin illustrated that landscape-based parameters are a feasible regionalisation framework for distributed and physically based models. In summary, the proposed modelling philosophy, based on the combination of inductive and deductive reasoning, is a suitable strategy for reliable predictions of snowcover ablation and snowmelt runoff in cold regions and complex environments.
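As a point of contrast with the physically based models used in the thesis, the simplest common snowmelt parameterisation is a temperature-index (degree-day) scheme. A sketch with hypothetical parameters, showing the basic snowcover ablation bookkeeping:

```python
DDF = 3.0   # degree-day factor, mm of melt per degC per day (hypothetical)
T0 = 0.0    # melt threshold temperature, degC

def simulate_ablation(swe, temps):
    """Deplete snow water equivalent (mm) over a daily temperature series."""
    melt_series = []
    for t in temps:
        melt = min(swe, DDF * max(t - T0, 0.0))   # melt capped by remaining snow
        swe -= melt
        melt_series.append(melt)
    return swe, melt_series

final_swe, melt = simulate_ablation(50.0, [-2.0, 1.0, 4.0, 6.0, 8.0])
print(final_swe, melt)
```

Mass balance closes by construction: total melt plus remaining SWE equals the initial SWE.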
Dilatancy Criteria for Salt Cavern Design: A Comparison Between Stress- and Strain-Based Approaches
NASA Astrophysics Data System (ADS)
Labaune, P.; Rouabhi, A.; Tijani, M.; Blanco-Martín, L.; You, T.
2018-02-01
This paper presents a new approach for salt cavern design, based on the use of the onset of dilatancy as a design threshold. In the proposed approach, a rheological model that includes dilatancy at the constitutive level is developed, and a strain-based dilatancy criterion is defined. As compared to classical design methods, which consist of simulating cavern behavior through creep laws (fitted on long-term tests) and then using a criterion (derived from short-term tests or experience) to determine the stability of the excavation, the proposed approach is consistent with both short- and long-term conditions. The new strain-based dilatancy criterion is compared to a stress-based dilatancy criterion through numerical simulations of salt caverns under cyclic loading conditions. The dilatancy zones predicted by the strain-based criterion are larger than the ones predicted by the stress-based criterion, which is conservative yet constructive for design purposes.
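A stress-based dilatancy criterion of the kind compared here is often written as a bound on the deviatoric stress measure sqrt(J2) relative to the mean-stress invariant I1. A sketch with an illustrative coefficient, not the paper's fitted criteria:

```python
def invariants(s1, s2, s3):
    """First stress invariant I1 and second deviatoric invariant J2
    from principal stresses (compression positive, MPa)."""
    i1 = s1 + s2 + s3
    mean = i1 / 3.0
    j2 = ((s1 - mean) ** 2 + (s2 - mean) ** 2 + (s3 - mean) ** 2) / 2.0
    return i1, j2

def dilatant(s1, s2, s3, a=0.27):
    """Flag dilatancy when sqrt(J2) exceeds a linear function of I1
    (coefficient a is illustrative)."""
    i1, j2 = invariants(s1, s2, s3)
    return j2 ** 0.5 > a * i1

print(dilatant(10.0, 10.0, 10.0), dilatant(40.0, 10.0, 10.0))
```

A hydrostatic state has J2 = 0 and is never flagged; increasing stress deviation at fixed mean stress eventually crosses the boundary.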
Dafflon, B.; Barrash, W.
2012-05-30
[Fragmentary record] Compares simulated-annealing-based and Bayesian sequential simulation approaches; for validation, measurements co-located with the withheld porosity log are also withheld from the estimation process, for two wells having locally variable stratigraphy, with comparisons against the contacts between Units 1 to 4 at the BHRS.
Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.
1994-01-01
This research addresses issues in simulation-based system level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system level analysis are discussed, and a methodology that addresses these issues is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, and routing algorithms, as well as other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined, as is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage, when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms and hybrid simulation to reduce simulation time is introduced.
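The core idea of observing rather than pre-defining failure modes can be sketched with a toy redundant system and a stuck-at fault model. The classes below are hypothetical illustrations, not the DEPEND API:

```python
import random

class Component:
    def __init__(self, name):
        self.name = name
        self.faulty = False

    def compute(self, x):
        # Stuck-at-zero fault model for an injected component.
        return 0 if self.faulty else x + 1

class TmrSystem:
    """Triple modular redundancy: majority vote over three components."""
    def __init__(self):
        self.units = [Component(f"u{i}") for i in range(3)]

    def inject(self, rng):
        rng.choice(self.units).faulty = True   # inject one random fault

    def run(self, x):
        outs = [u.compute(x) for u in self.units]
        return max(set(outs), key=outs.count)  # majority vote

rng = random.Random(42)
sys_ = TmrSystem()
sys_.inject(rng)           # single injected fault
print(sys_.run(10))        # the vote masks the fault
```

Rather than asserting in advance that the voter masks single faults, the simulation lets that behavior (or its failure under multiple injections) emerge from the model.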
NASA Astrophysics Data System (ADS)
Qi, Wei; Liu, Junguo; Yang, Hong; Sweetapple, Chris
2018-03-01
Global precipitation products are very important datasets in flow simulations, especially in poorly gauged regions. Uncertainties resulting from precipitation products, hydrological models and their combinations vary with time and data magnitude, and undermine their application to flow simulations. However, previous studies have not quantified these uncertainties individually and explicitly. This study developed an ensemble-based dynamic Bayesian averaging approach (e-Bay) for deterministic discharge simulations using multiple global precipitation products and hydrological models. In this approach, the joint probability of precipitation products and hydrological models being correct is quantified based on uncertainties in maximum and mean estimation, posterior probability is quantified as functions of the magnitude and timing of discharges, and the law of total probability is implemented to calculate expected discharges. Six global fine-resolution precipitation products and two hydrological models of different complexities are included in an illustrative application. e-Bay can effectively quantify uncertainties and therefore generate better deterministic discharges than traditional approaches (weighted average methods with equal and varying weights and maximum likelihood approach). The mean Nash-Sutcliffe Efficiency values of e-Bay are up to 0.97 and 0.85 in training and validation periods respectively, which are at least 0.06 and 0.13 higher than traditional approaches. In addition, with increased training data, assessment criteria values of e-Bay show smaller fluctuations than traditional approaches and its performance becomes outstanding. The proposed e-Bay approach bridges the gap between global precipitation products and their pragmatic applications to discharge simulations, and is beneficial to water resources management in ungauged or poorly gauged regions across the world.
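The general shape of likelihood-weighted model averaging, together with the Nash-Sutcliffe Efficiency used for scoring, can be sketched as follows. The data, the Gaussian likelihood, and the fixed weights are illustrative simplifications; e-Bay itself additionally conditions the weights on discharge magnitude and timing:

```python
import math

obs  = [10.0, 14.0, 30.0, 22.0, 12.0]            # hypothetical observed discharge
sims = {"modelA": [11.0, 13.0, 27.0, 21.0, 12.5],
        "modelB": [ 8.0, 16.0, 38.0, 25.0, 10.0]}

def likelihood(sim, sigma=3.0):
    """Gaussian likelihood of the observations given a member simulation."""
    return math.exp(-sum((s - o) ** 2 for s, o in zip(sim, obs)) / (2 * sigma ** 2))

w = {k: likelihood(s) for k, s in sims.items()}
z = sum(w.values())
w = {k: v / z for k, v in w.items()}             # normalize to probabilities

blend = [sum(w[k] * sims[k][i] for k in sims) for i in range(len(obs))]

def nse(sim):
    """Nash-Sutcliffe Efficiency: 1 is perfect, 0 matches the obs mean."""
    mean_obs = sum(obs) / len(obs)
    return 1 - sum((s - o) ** 2 for s, o in zip(sim, obs)) / \
               sum((o - mean_obs) ** 2 for o in obs)

print({k: round(v, 3) for k, v in w.items()}, round(nse(blend), 3))
```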
A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation
NASA Astrophysics Data System (ADS)
Byun, K.; Hamlet, A. F.
2017-12-01
There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches for designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (using the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16 degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50-yr). Using these T and P scenarios we simulate daily streamflow using the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, estimating annual extremes using the estimated unique GEV parameters for each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show that there are considerable differences in the extreme values for a given percentile between the conventional and the non-stationary MC approaches. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis. The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
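The Monte Carlo construction can be sketched with hypothetical GEV parameters and a linear trend in the location parameter (the study's fitted parameters and VIC-simulated flows are not reproduced here): each year of a design lifespan draws its annual maximum from that year's GEV, and the lifespan maximum is summarized over many realizations.

```python
import math, random

def gev_draw(rng, mu, sigma, xi):
    """Inverse-CDF sample from GEV(mu, sigma, xi), xi != 0:
    x = mu + sigma * ((-ln U)^(-xi) - 1) / xi for U ~ Uniform(0, 1)."""
    u = rng.random()
    return mu + sigma * ((-math.log(u)) ** (-xi) - 1.0) / xi

def lifespan_max(rng, years, trend):
    """Largest annual extreme over a design lifespan; the GEV location
    drifts upward by `trend` per year (hypothetical non-stationarity)."""
    return max(gev_draw(rng, 100.0 + trend * t, 20.0, 0.1)
               for t in range(years))

rng = random.Random(7)
stationary    = sorted(lifespan_max(rng, 50, 0.0) for _ in range(5000))
nonstationary = sorted(lifespan_max(rng, 50, 0.5) for _ in range(5000))

q99 = lambda xs: xs[int(0.99 * len(xs))]
print(round(q99(stationary), 1), round(q99(nonstationary), 1))
```

The 99th-percentile design value under the trending location parameter exceeds the stationary one, mirroring the study's point that design standards become lifespan-dependent.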
A Storm's Approach; Hurricane Shelter Training in a Digital Age
NASA Technical Reports Server (NTRS)
Boyarsky, Andrew; Burden, David; Gronstedt, Anders; Jinman, Andrew
2012-01-01
New York City's Office of Emergency Management (OEM) originally ran hundreds of classroom based courses, where they brought together civil servants to learn how to run a Hurricane Shelter (HS). This approach was found to be costly, time consuming and lacked any sense of an impending disaster and need for emergency response. In partnership with the City University of New York School of Professional Studies, Gronstedt Group and Daden Limited, the OEM wanted to create a simulation that overcame these issues, providing users with a more immersive and realistic approach at a lower cost. The HS simulation was built in the virtual world Second Life (SL). Virtual worlds are a genre of online communities that often take the form of computer-based simulated environments, through which users can interact with one another and use or create objects. Using this technology allowed managers to apply their knowledge in both classroom and remote learning environments. The shelter simulation is operational 24/7, guiding users through a 4 1/2 hour narrative from start to finish. This paper will describe the rationale for the project, the technical approach taken - particularly the use of a web based authoring tool to create and manage the immersive simulation - and the results from operational use.
Rule-based spatial modeling with diffusing, geometrically constrained molecules.
Gruenert, Gerd; Ibrahim, Bashar; Lenser, Thorsten; Lohel, Maiko; Hinze, Thomas; Dittrich, Peter
2010-06-07
We suggest a new type of modeling approach for the coarse grained, particle-based spatial simulation of combinatorially complex chemical reaction systems. In our approach molecules possess a location in the reactor as well as an orientation and geometry, while the reactions are carried out according to a list of implicitly specified reaction rules. Because the reaction rules can contain patterns for molecules, a combinatorially complex or even infinitely sized reaction network can be defined. For our implementation (based on LAMMPS), we have chosen an already existing formalism (BioNetGen) for the implicit specification of the reaction network. This compatibility allows existing models to be imported easily; only additional geometry data files have to be provided. Our simulations show that the obtained dynamics can be fundamentally different from those simulations that use classical reaction-diffusion approaches like Partial Differential Equations or Gillespie-type spatial stochastic simulation. We show, for example, that the combination of combinatorial complexity and geometric effects leads to the emergence of complex self-assemblies and transportation phenomena happening faster than diffusion (using a model of molecular walkers on microtubules). When the mentioned classical simulation approaches are applied, these aspects of modeled systems cannot be observed without very special treatment. Furthermore, we show that the geometric information can even change the organizational structure of the reaction system. That is, a set of chemical species that can in principle form a stationary state in a Differential Equation formalism is potentially unstable when geometry is considered, and vice versa.
We conclude that our approach provides a new general framework filling a gap between approaches with no or rigid spatial representation, like Partial Differential Equations, and specialized coarse-grained spatial simulation systems, like those for DNA or virus capsid self-assembly.
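The walker-outruns-diffusion observation can be illustrated with a one-dimensional toy model (unrelated to the LAMMPS/BioNetGen implementation): a diffusing particle takes unbiased unit steps, while a track-bound walker steps forward with high probability, so its displacement grows linearly in time rather than as the square root.

```python
import random

def displacement(rng, steps, walker):
    """Net 1-D displacement after `steps` unit moves."""
    x = 0
    for _ in range(steps):
        if walker:
            x += 1 if rng.random() < 0.9 else -1   # biased, track-bound step
        else:
            x += rng.choice((-1, 1))               # pure diffusion
    return x

rng = random.Random(0)
diff = [abs(displacement(rng, 400, False)) for _ in range(200)]
walk = [abs(displacement(rng, 400, True)) for _ in range(200)]
print(sum(diff) / 200, sum(walk) / 200)
```

After 400 steps the walker has travelled on the order of the full path length, while the diffusive mean displacement is only on the order of sqrt(steps).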
Hydrological modelling in forested systems
This chapter provides a brief overview of forest hydrology modelling approaches for answering important global research and management questions. Many hundreds of hydrological models have been applied globally across multiple decades to represent and predict forest hydrological processes. The focus of this chapter is on process-based models and approaches, specifically 'forest hydrology models'; that is, physically based simulation tools that quantify compartments of the forest hydrological cycle. Physically based models can be considered those that describe the conservation of mass, momentum and/or energy.
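The conservation-of-mass bookkeeping that defines a physically based model can be sketched with a single-bucket water balance, P = ET + Q + dS, using hypothetical parameters:

```python
def bucket_step(storage, precip, et_demand, capacity=150.0):
    """One time step of a bucket water balance (all fluxes in mm)."""
    et = min(et_demand, storage + precip)     # ET limited by available water
    water = storage + precip - et
    runoff = max(0.0, water - capacity)       # spill above storage capacity
    new_storage = water - runoff
    return new_storage, et, runoff

s, balance = 100.0, []
for p, pet in [(20.0, 3.0), (0.0, 4.0), (60.0, 2.0)]:
    s_new, et, q = bucket_step(s, p, pet)
    balance.append(abs(p - (et + q + (s_new - s))))   # closure check: P = ET + Q + dS
    s = s_new
print(balance, s)
```

The closure residual is zero at every step, which is exactly the mass-conservation property the chapter uses to characterize physically based models.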
NASA Astrophysics Data System (ADS)
Antonetti, Manuel; Buss, Rahel; Scherrer, Simon; Margreth, Michael; Zappa, Massimiliano
2016-07-01
The identification of landscapes with similar hydrological behaviour is useful for runoff and flood predictions in small ungauged catchments. An established method for landscape classification is based on the concept of dominant runoff process (DRP). The various DRP-mapping approaches differ with respect to the time and data required for mapping. Manual approaches based on expert knowledge are reliable but time-consuming, whereas automatic GIS-based approaches are easier to implement but rely on simplifications which restrict their application range. To what extent these simplifications are applicable in other catchments is unclear. More information is also needed on how the different complexities of automatic DRP-mapping approaches affect hydrological simulations. In this paper, three automatic approaches were used to map two catchments on the Swiss Plateau. The resulting maps were compared to reference maps obtained with manual mapping. Measures of agreement and association, a class comparison, and a deviation map were derived. The automatically derived DRP maps were used in synthetic runoff simulations with an adapted version of the PREVAH hydrological model, and simulation results compared with those from simulations using the reference maps. The DRP maps derived with the automatic approach with highest complexity and data requirement were the most similar to the reference maps, while those derived with simplified approaches without original soil information differed significantly in terms of both extent and distribution of the DRPs. The runoff simulations derived from the simpler DRP maps were more uncertain due to inaccuracies in the input data and their coarse resolution, but problems were also linked with the use of topography as a proxy for the storage capacity of soils. The perception of the intensity of the DRP classes also seems to vary among the different authors, and a standardised definition of DRPs is still lacking. 
Furthermore, we argue that expert knowledge should be used not only for model building and constraining, but also in the landscape classification phase.
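One common measure of agreement between a reference map and an automatically derived categorical map is Cohen's kappa, computed on per-cell class labels. A sketch on hypothetical DRP maps (the class labels and cell values are illustrative):

```python
from collections import Counter

def cohen_kappa(map_a, map_b):
    """Cohen's kappa: agreement between two categorical maps beyond chance."""
    n = len(map_a)
    po = sum(a == b for a, b in zip(map_a, map_b)) / n          # observed agreement
    ca, cb = Counter(map_a), Counter(map_b)
    pe = sum(ca[k] * cb.get(k, 0) for k in ca) / n ** 2          # chance agreement
    return (po - pe) / (1 - pe)

reference = ["HOF", "SOF", "SSF", "SSF", "DP", "HOF", "SOF", "SSF"]
automatic = ["HOF", "SSF", "SSF", "SSF", "DP", "SOF", "SOF", "SSF"]
print(round(cohen_kappa(reference, automatic), 3))
```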
NASA Technical Reports Server (NTRS)
Miller, G. K., Jr.; Deal, P. L.
1975-01-01
The simulation employed all six rigid-body degrees of freedom and incorporated aerodynamic characteristics based on wind-tunnel data. The flight instrumentation included a localizer and a flight director which was used to capture and to maintain a two-segment glide slope. A closed-circuit television display of a STOLport provided visual cues during simulations of the approach and landing. The decoupled longitudinal controls used constant prefilter and feedback gains to provide steady-state decoupling of flight-path angle, pitch angle, and forward velocity. The pilots were enthusiastic about the decoupled longitudinal controls and believed that the simulator motion was an aid in evaluating the decoupled controls, although a minimum turbulence level with root-mean-square gust intensity of 0.3 m/sec (1 ft/sec) was required to mask undesirable characteristics of the moving-base simulator.
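Steady-state decoupling with constant gains has a compact linear-algebra core: with feedback u = -Kx + Fr, choosing the prefilter F as the inverse of the closed-loop DC gain makes each output track its own reference with no steady-state cross-coupling. A sketch on an illustrative two-state model, not the STOL aircraft dynamics:

```python
def mul(m, n):
    """2x2 matrix product."""
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv(m):
    """2x2 matrix inverse."""
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[ m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det,  m[0][0] / det]]

def neg(m):
    return [[-v for v in row] for row in m]

# Hypothetical coupled plant x' = Ax + Bu, y = Cx, with feedback gain K.
A = [[-1.0, 0.5], [0.3, -2.0]]
B = [[1.0, 0.2], [0.1, 1.0]]
C = [[1.0, 0.0], [0.0, 1.0]]
K = [[2.0, 0.0], [0.0, 2.0]]

BK = mul(B, K)
Acl = [[A[i][j] - BK[i][j] for j in range(2)] for i in range(2)]

# Closed-loop DC gain from the filtered reference to y: y_ss = -C (A-BK)^-1 B (F r)
dc_gain = neg(mul(C, mul(inv(Acl), B)))
F = inv(dc_gain)     # prefilter: overall steady-state gain becomes the identity

check = mul(dc_gain, F)
print(check)
```

The product of the DC gain and the prefilter is the identity, so a step in one reference channel produces no steady-state change in the other output.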
Variation and adaptation: learning from success in patient safety-oriented simulation training.
Dieckmann, Peter; Patterson, Mary; Lahlou, Saadi; Mesman, Jessica; Nyström, Patrik; Krage, Ralf
2017-01-01
Simulation is traditionally used to reduce errors and their negative consequences. But according to modern safety theories, this focus overlooks the learning potential of positive performance, which is much more common than errors. Therefore, a supplementary approach to simulation is needed to unfold its full potential. In our commentary, we describe the learning from success (LFS) approach to simulation and debriefing. Drawing on several theoretical frameworks, we suggest supplementing the widespread deficit-oriented, corrective approach to simulation with an approach that focusses on systematically understanding how good performance is produced in frequent (mundane) simulation scenarios. We advocate investigating and optimizing human activity based on the connected layers of any setting: the embodied competences of the healthcare professionals, the social and organizational rules that guide their actions, and the material aspects of the setting. We discuss implications of these theoretical perspectives for the design and conduct of simulation scenarios, post-simulation debriefings, and faculty development programs.
Huang, Grace C; McSparron, Jakob I; Balk, Ethan M; Richards, Jeremy B; Smith, C Christopher; Whelan, Julia S; Newman, Lori R; Smetana, Gerald W
2016-04-01
Optimal approaches to teaching bedside procedures are unknown. To identify effective instructional approaches in procedural training. We searched PubMed, EMBASE, Web of Science and Cochrane Library through December 2014. We included research articles that addressed procedural training among physicians or physician trainees for 12 bedside procedures. Two independent reviewers screened 9312 citations and identified 344 articles for full-text review. Two independent reviewers extracted data from full-text articles. We included measurements as classified by translational science outcomes T1 (testing settings), T2 (patient care practices) and T3 (patient/public health outcomes). Due to incomplete reporting, we post hoc classified study outcomes as 'negative' or 'positive' based on statistical significance. We performed meta-analyses of outcomes on the subset of studies sharing similar outcomes. We found 161 eligible studies (44 randomised controlled trials (RCTs), 34 non-RCTs and 83 uncontrolled trials). Simulation was the most frequently published educational mode (78%). Our post hoc classification showed that studies involving simulation, competency-based approaches and RCTs had higher frequencies of T2/T3 outcomes. Meta-analyses showed that simulation (risk ratio (RR) 1.54 vs 0.55 for studies with vs without simulation, p=0.013) and competency-based approaches (RR 3.17 vs 0.89, p<0.001) were effective forms of training. This systematic review of bedside procedural skills demonstrates that the current literature is heterogeneous and of varying quality and rigour. Evidence is strongest for the use of simulation and competency-based paradigms in teaching procedures, and these approaches should be the mainstay of programmes that train physicians to perform procedures. Further research should clarify differences among instructional methods (eg, forms of hands-on training) rather than among educational modes (eg, lecture vs simulation). 
Tait, Lauren; Lee, Kenneth; Rasiah, Rohan; Cooper, Joyce M; Ling, Tristan; Geelan, Benjamin; Bindoff, Ivan
2018-05-03
Background. There are numerous approaches to simulating a patient encounter in pharmacy education. However, little direct comparison between these approaches has been undertaken. Our objective was to investigate student experiences, satisfaction, and feedback preferences across three scenario simulation modalities (paper-, actor-, and computer-based). Methods. We conducted a mixed-methods study with randomized cross-over of simulation modalities on final-year Australian graduate-entry Master of Pharmacy students. Participants completed case-based scenarios within each of three simulation modalities, with feedback provided at the completion of each scenario in a format corresponding to that modality. A post-simulation questionnaire collected qualitative and quantitative responses pertaining to participant satisfaction, experiences, and feedback preferences. Results. Participants reported similar levels of satisfaction across all three modalities. However, each modality produced unique positive and negative experiences, such as student disengagement with paper-based scenarios. Conclusion. The themes of guidance and opportunity for peer discussion underlie the best forms of feedback for students. The provision of feedback following simulation should be carefully considered and delivered, as all three simulation modalities produced both positive and negative experiences with regard to their feedback format.
Continuity-based model interfacing for plant-wide simulation: a general approach.
Volcke, Eveline I P; van Loosdrecht, Mark C M; Vanrolleghem, Peter A
2006-08-01
In plant-wide simulation studies of wastewater treatment facilities, existing models of different origins often need to be coupled. However, as these submodels are likely to contain different state variables, their coupling is not straightforward. The continuity-based interfacing method (CBIM) provides a general framework to construct model interfaces for models of wastewater systems, taking into account conservation principles. In this contribution, the CBIM approach is applied to study the effect of sludge digestion reject water treatment with a SHARON-Anammox process on a plant-wide scale. Separate models were available for the SHARON process and for the Anammox process. The Benchmark simulation model no. 2 (BSM2) is used to simulate the behaviour of the complete WWTP including sludge digestion. The CBIM approach is followed to develop three different model interfaces. At the same time, the generally applicable CBIM approach was further refined, and particular issues arising when coupling models in which pH is considered a state variable are pointed out.
Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Muravyov, Alexander A.
2000-01-01
A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.
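The fixed-point structure of equivalent linearization can be sketched in a few lines. The following is an illustrative toy, not the paper's finite-element formulation: a unit-mass Duffing oscillator under white-noise forcing, with all parameter values hypothetical.

```python
import math

def equivalent_linearization(c, k, eps, S0, tol=1e-12, max_iter=200):
    # Gaussian closure: replace the nonlinear restoring force k*(x + eps*x^3)
    # by an equivalent linear stiffness k_eq = k*(1 + 3*eps*E[x^2]), then use
    # the linear white-noise result E[x^2] = pi*S0/(c*k_eq) for
    #   x'' + c*x' + k_eq*x = w(t),  S0 = two-sided PSD of w(t),
    # and iterate to a fixed point.
    var_x = math.pi * S0 / (c * k)        # start from the linear solution
    k_eq = k
    for _ in range(max_iter):
        k_eq = k * (1.0 + 3.0 * eps * var_x)
        var_new = math.pi * S0 / (c * k_eq)
        if abs(var_new - var_x) < tol:
            return var_new, k_eq
        var_x = var_new
    return var_x, k_eq

var_x, k_eq = equivalent_linearization(c=0.1, k=1.0, eps=0.5, S0=0.01)
print(var_x, k_eq)   # hardening spring: response variance below the linear value
```

For a hardening spring (eps > 0) the equivalent stiffness exceeds k, so the predicted variance drops below the linear estimate, which is the qualitative effect the numerical simulation is used to validate.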
Simulators IV; Proceedings of the SCS Conference, Orlando, FL, Apr. 6-9, 1987
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fairchild, B.T.
1987-01-01
The conference presents papers on the applicability of AI techniques to simulation models, the simulation of a reentry vehicle on Simstar, Simstar missile simulation, measurement issues associated with simulator sickness, and tracing the etiology of simulator sickness. Consideration is given to a simulator of a steam generator tube bundle response to a blowdown transient, the census of simulators for fossil fueled boiler and gas turbine plant operation training, and a new approach for flight simulator visual systems. Other topics include past and present simulated aircraft maintenance trainers, an AI-simulation based approach for aircraft maintenance training, simulator qualification using EPRI methodology, and the role of instinct in organizational dysfunction.
Sperling, Jeremy D.; Clark, Sunday; Kang, Yoon
2013-01-01
Introduction Simulation-based medical education (SBME) is increasingly being utilized for teaching clinical skills in undergraduate medical education. Studies have evaluated the impact of adding SBME to third- and fourth-year curriculum; however, very little research has assessed its efficacy for teaching clinical skills in pre-clerkship coursework. To measure the impact of a simulation exercise during a pre-clinical curriculum, a simulation session was added to a pre-clerkship course at our medical school where the clinical approach to altered mental status (AMS) is traditionally taught using a lecture and an interactive case-based session in a small group format. The objective was to measure simulation's impact on students’ knowledge acquisition, comfort, and perceived competence with regard to the AMS patient. Methods AMS simulation exercises were added to the lecture and small group case sessions in June 2010 and 2011. Simulation sessions consisted of two clinical cases using a high-fidelity full-body simulator followed by a faculty debriefing after each case. Student participation in a simulation session was voluntary. Students who did and did not participate in a simulation session completed a post-test to assess knowledge and a survey to understand comfort and perceived competence in their approach to AMS. Results A total of 154 students completed the post-test and survey and 65 (42%) attended a simulation session. Post-test scores were higher in students who attended a simulation session compared to those who did not (p<0.001). Students who participated in a simulation session were more comfortable in their overall approach to treating AMS patients (p=0.05). They were also more likely to state that they could articulate a differential diagnosis (p=0.03), know what initial diagnostic tests are needed (p=0.01), and understand what interventions are useful in the first few minutes (p=0.003).
Students who participated in a simulation session were more likely to find the overall AMS curriculum useful (p<0.001). Conclusion Students who participated in a simulation exercise performed better on a knowledge-based test and reported increased comfort and perceived competence in their clinical approach to AMS. SBME shows significant promise for teaching clinical skills to medical students during pre-clinical curriculum. PMID:23561054
Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks
Walpole, J.; Chappell, J.C.; Cluceru, J.G.; Mac Gabhann, F.; Bautch, V.L.; Peirce, S. M.
2015-01-01
Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods. PMID:26158406
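The contrast between a rule-informed agent model and a purely stochastic null model can be made concrete with a toy sketch. This is an illustrative simplification, not the authors' ABM: the linear VEGF gradient, the noise amplitude, and the winner-take-all selection rule are all assumptions.

```python
import random

random.seed(0)

N = 20
vegf = [i / (N - 1) for i in range(N)]   # assumed linear VEGF gradient along a vessel

def rule_based_sprout(vegf, noise=0.05):
    # Agent rule: each endothelial cell samples its local VEGF level plus a
    # small stochastic fluctuation; the highest-activation cell becomes the tip.
    scores = [v + random.uniform(-noise, noise) for v in vegf]
    return max(range(len(scores)), key=lambda i: scores[i])

def monte_carlo_sprout(n):
    # Purely stochastic null model: every cell is equally likely to sprout.
    return random.randrange(n)

trials = 2000
true_site = N - 1   # the highest-VEGF cell
rule_hits = sum(rule_based_sprout(vegf) == true_site for _ in range(trials))
mc_hits = sum(monte_carlo_sprout(N) == true_site for _ in range(trials))
print(rule_hits / trials, mc_hits / trials)
```

Even with stochastic noise in the rule, the rule-informed model localizes the sprout far more reliably than the uniform Monte Carlo baseline, mirroring the paper's comparison.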
NASA Technical Reports Server (NTRS)
Grantham, William D.
1989-01-01
The primary objective was to provide information to the flight controls/flying qualities engineer that will assist in determining the incremental flying qualities and/or pilot-performance differences that may be expected between results obtained via ground-based simulation (in particular, the six-degree-of-freedom Langley Visual/Motion Simulator (VMS)) and flight tests. Pilot opinion and performance parameters derived from a ground-based simulator and an in-flight simulator are compared for a jet-transport airplane having 32 different longitudinal dynamic response characteristics. The primary pilot tasks were the approach and landing tasks, with emphasis on the landing-flare task. The results indicate that, in general, flying qualities results obtained from the ground-based simulator may be considered conservative, especially when the pilot task requires tight pilot control, as during the landing flare. The one exception, according to the present study, was that the pilots were more tolerant of large time delays in the airplane response on the ground-based simulator. The results also indicated that the ground-based simulator (particularly the Langley VMS) is not adequate for assessing pilot/vehicle performance capabilities (i.e., sink rate performance for the landing-flare task when the pilot has little depth/height perception from the outside scene presentation).
A Physics-Based Engineering Approach to Predict the Cross Section for Advanced SRAMs
NASA Astrophysics Data System (ADS)
Li, Lei; Zhou, Wanting; Liu, Huihua
2012-12-01
This paper presents a physics-based engineering approach to estimate the heavy-ion-induced upset cross section for 6T SRAM cells from layout and technology parameters. The new approach calculates the effects of radiation with a junction photocurrent, which is derived based on device physics, and handles the problem using simple SPICE simulations. First, the approach uses a standard SPICE program on a typical PC to predict the SPICE-simulated curve of the collected charge vs. its affected distance from the drain-body junction with the derived junction photocurrent. Then, the SPICE-simulated curve is used to calculate the heavy-ion-induced upset cross section with a simple model, which considers that the SEU cross section of a SRAM cell is more related to a “radius of influence” around a heavy ion strike than to the physical size of a diffusion node in the layout for advanced SRAMs in nano-scale process technologies. The calculated upset cross section based on this method is in good agreement with test results for 6T SRAM cells processed using 90 nm process technology.
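The "radius of influence" idea can be illustrated with a deliberately simplified charge-collection model. This is not the paper's SPICE-based procedure: the exponential fall-off of collected charge with distance and every numeric value below are hypothetical.

```python
import math

# Toy radius-of-influence model: assume the charge a junction collects from a
# strike at distance d falls off as Q(d) = Q0 * exp(-d / lam). Any strike
# within the radius r where Q(r) = Qcrit upsets the cell, so the per-cell
# cross section is sigma = pi * r^2 with r = lam * ln(Q0 / Qcrit).
# (Q0, Qcrit, and lam are hypothetical, not 90 nm data.)

def upset_cross_section(q0_fC, qcrit_fC, lam_um):
    if q0_fC <= qcrit_fC:
        return 0.0                       # deposited charge never reaches Qcrit
    r = lam_um * math.log(q0_fC / qcrit_fC)
    return math.pi * r * r               # um^2 per cell

sigma = upset_cross_section(q0_fC=50.0, qcrit_fC=1.5, lam_um=0.4)
print(sigma)
```

In this toy, the cross section is set by how far the collected charge stays above Qcrit rather than by the drain diffusion area, which is the qualitative point of the model in the abstract.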
Capoccia, Massimo; Marconi, Silvia; Singh, Sanjeet Avtaar; Pisanelli, Domenico M; De Lazzari, Claudio
2018-05-02
Modelling and simulation may become clinically applicable tools for detailed evaluation of the cardiovascular system and clinical decision-making to guide therapeutic intervention. Models based on the pressure-volume relationship and zero-dimensional representation of the cardiovascular system may be a suitable choice given their simplicity and versatility. This approach has great potential for application in heart failure, where left ventricular assist devices have played a significant role as a bridge to transplant and, more recently, as a long-term solution for ineligible candidates. We sought to investigate the value of simulation in the context of three heart failure patients with a view to predict or guide further management. CARDIOSIM© was the software used for this purpose. The study was based on retrospective analysis of haemodynamic data previously discussed at a multidisciplinary meeting. The outcome of the simulations addressed the value of a more quantitative approach in the clinical decision process. Although previous experience, co-morbidities and the risk of potentially fatal complications play a role in clinical decision-making, patient-specific modelling may become a daily approach for selection and optimisation of device-based treatment for heart failure patients. Willingness to adopt this integrated approach may be the key to further progress.
NASA Astrophysics Data System (ADS)
Cho, G. S.
2017-09-01
For performance optimization of refrigerated warehouses, design parameters are selected based on physical parameters, such as the number of equipment units and aisles and forklift speeds, for ease of modification. This paper provides a comprehensive framework approach for the system design of refrigerated warehouses. We propose a modeling approach which aims at simulation optimization so as to meet required design specifications using the Design of Experiments (DOE), and analyze a simulation model using an integrated aspect-oriented modeling approach (i-AOMA). As a result, this suggested method can evaluate the performance of a variety of refrigerated warehouse operations.
ERIC Educational Resources Information Center
Edward, Norrie S.
1997-01-01
Evaluates the importance of realism in the screen presentation of the plant in computer-based laboratory simulations for part-time engineering students. Concludes that simulations are less effective than actual laboratories but that realism minimizes the disadvantages. The schematic approach was preferred for ease of use. (AIM)
An open, object-based modeling approach for simulating subsurface heterogeneity
NASA Astrophysics Data System (ADS)
Bennett, J.; Ross, M.; Haslauer, C. P.; Cirpka, O. A.
2017-12-01
Characterization of subsurface heterogeneity with respect to hydraulic and geochemical properties is critical in hydrogeology as their spatial distribution controls groundwater flow and solute transport. Many approaches of characterizing subsurface heterogeneity do not account for well-established geological concepts about the deposition of the aquifer materials; those that do (i.e. process-based methods) often require forcing parameters that are difficult to derive from site observations. We have developed a new method for simulating subsurface heterogeneity that honors concepts of sequence stratigraphy, resolves fine-scale heterogeneity and anisotropy of distributed parameters, and resembles observed sedimentary deposits. The method implements a multi-scale hierarchical facies modeling framework based on architectural element analysis, with larger features composed of smaller sub-units. The Hydrogeological Virtual Reality simulator (HYVR) simulates distributed parameter models using an object-based approach. Input parameters are derived from observations of stratigraphic morphology in sequence type-sections. Simulation outputs can be used for generic simulations of groundwater flow and solute transport, and for the generation of three-dimensional training images needed in applications of multiple-point geostatistics. The HYVR algorithm is flexible and easy to customize. The algorithm was written in the open-source programming language Python, and is intended to form a code base for hydrogeological researchers, as well as a platform that can be further developed to suit investigators' individual needs. This presentation will encompass the conceptual background and computational methods of the HYVR algorithm, the derivation of input parameters from site characterization, and the results of groundwater flow and solute transport simulations in different depositional settings.
Simulation-based bronchoscopy training: systematic review and meta-analysis.
Kennedy, Cassie C; Maldonado, Fabien; Cook, David A
2013-07-01
Simulation-based bronchoscopy training is increasingly used, but effectiveness remains uncertain. We sought to perform a comprehensive synthesis of published work on simulation-based bronchoscopy training. We searched MEDLINE, EMBASE, CINAHL, PsycINFO, ERIC, Web of Science, and Scopus for eligible articles through May 11, 2011. We included all original studies involving health professionals that evaluated, in comparison with no intervention or an alternative instructional approach, simulation-based training for flexible or rigid bronchoscopy. Study selection and data abstraction were performed independently and in duplicate. We pooled results using random effects meta-analysis. From an initial pool of 10,903 articles, we identified 17 studies evaluating simulation-based bronchoscopy training. In comparison with no intervention, simulation training was associated with large benefits on skills and behaviors (pooled effect size, 1.21 [95% CI, 0.82-1.60]; n=8 studies) and moderate benefits on time (0.62 [95% CI, 0.12-1.13]; n=7). In comparison with clinical instruction, behaviors with real patients showed nonsignificant effects favoring simulation for time (0.61 [95% CI, -1.47 to 2.69]) and process (0.33 [95% CI, -1.46 to 2.11]) outcomes (n=2 studies each), although variation in training time might account for these differences. Four studies compared alternate simulation-based training approaches. Inductive analysis to inform instructional design suggested that longer or more structured training is more effective, authentic clinical context adds value, and animal models and plastic part-task models may be superior to more costly virtual-reality simulators. Simulation-based bronchoscopy training is effective in comparison with no intervention. Comparative effectiveness studies are few.
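Pooled effects like 1.21 [95% CI, 0.82 to 1.60] typically come from a random-effects meta-analysis. Below is a sketch of the DerSimonian-Laird estimator on invented study data; the effect sizes and variances are made up for illustration, not taken from the review.

```python
import math

def dersimonian_laird(effects, variances):
    # Random-effects pooling: estimate the between-study variance tau^2 from
    # Cochran's Q, then combine studies with weights 1/(v_i + tau^2).
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)               # truncated at zero
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical standardized mean differences from five skill-outcome studies:
effects = [1.8, 0.6, 1.6, 1.1, 0.4]
variances = [0.10, 0.08, 0.20, 0.12, 0.15]
pooled, ci = dersimonian_laird(effects, variances)
print(pooled, ci)
```

When the lower confidence bound stays above zero, the pooled effect favors the intervention despite between-study heterogeneity, which is the form of the review's "simulation vs no intervention" conclusion.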
NASA Astrophysics Data System (ADS)
Qi, Chenkun; Gao, Feng; Zhao, Xianchao; Wang, Qian; Ren, Anye
2018-06-01
On the ground, the hardware-in-the-loop (HIL) simulation is a good approach to test the contact dynamics of the spacecraft docking process in space. Unfortunately, due to the time delay in the system, the HIL contact simulation becomes divergent. However, the traditional first-order phase lead compensation approach still results in a small divergence for a pure time delay, while the serial Smith predictor and phase lead compensation approach recently proposed by the authors leads to over-compensation and an obvious convergence. In this study, a hybrid Smith predictor and phase lead compensation approach is proposed, which achieves higher simulation fidelity with only slight convergence. The phase angle of the compensator is analyzed and the stability condition of the HIL simulation system is given. The effectiveness of the proposed compensation approach is tested by simulations on an undamped elastic contact process.
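The first-order phase-lead idea the authors build on amounts to a one-term Taylor extrapolation of the delayed signal. The sketch below is not the proposed hybrid Smith-predictor scheme; the sample time, delay, and sinusoidal test signal are assumptions chosen only to show the effect.

```python
import math

# First-order phase-lead compensation of a pure time delay: approximate the
# undelayed signal by y[k] = x[k] + T * (x[k] - x[k-1]) / dt, i.e. extrapolate
# the delayed measurement forward by the known delay T.

dt, T = 0.001, 0.008                 # sample time and loop delay (hypothetical)
n = 2000
omega = 2 * math.pi * 2.0            # 2 Hz stand-in for a contact-force signal
true = [math.sin(omega * k * dt) for k in range(n)]
delay = int(round(T / dt))
delayed = [true[max(0, k - delay)] for k in range(n)]

lead = [delayed[0]]
for k in range(1, n):
    lead.append(delayed[k] + T * (delayed[k] - delayed[k - 1]) / dt)

start = 2 * delay                    # skip the start-up transient
err_raw = max(abs(true[k] - delayed[k]) for k in range(start, n))
err_comp = max(abs(true[k] - lead[k]) for k in range(start, n))
print(err_raw, err_comp)
```

The lead term cancels the first-order effect of the delay but leaves a second-order residual that grows with signal frequency, which is why a pure phase-lead compensator still leaves a small divergence in fast contact dynamics.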
The Business Flight Simulator.
ERIC Educational Resources Information Center
Dwyer, P.; Simpson, D.
1989-01-01
The authors describe a simulation program based on a workshop approach designed for postsecondary business students. Features and benefits of the workshop technique are discussed. The authors cover practical aspects of designing and implementing simulation workshops. (CH)
NASA Astrophysics Data System (ADS)
Thanh, Vo Hong; Marchetti, Luca; Reali, Federico; Priami, Corrado
2018-02-01
The stochastic simulation algorithm (SSA) has been widely used for simulating biochemical reaction networks. SSA is able to capture the inherently intrinsic noise of the biological system, which is due to the discreteness of species populations and to the randomness of their reciprocal interactions. However, SSA does not consider other sources of heterogeneity in biochemical reaction systems, which are referred to as extrinsic noise. Here, we extend two simulation approaches, namely, the integration-based method and the rejection-based method, to take extrinsic noise into account by allowing the reaction propensities to vary in a time- and state-dependent manner. For both methods, new efficient implementations are introduced and their efficiency and applicability to biological models are investigated. Our numerical results suggest that the rejection-based method performs better than the integration-based method when extrinsic noise is considered.
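The rejection-based idea follows the thinning principle: bound the time-varying propensity from above, draw candidate firing times from the bound, and accept each candidate with probability (true propensity)/(bound). A minimal single-reaction sketch, with an assumed sinusoidal extrinsic modulation of the rate constant (not one of the paper's models):

```python
import math, random

random.seed(1)

# Rejection-based SSA for one reaction, X -> 0, whose rate constant carries
# extrinsic noise: k(t) = k0 * (1 + 0.5*sin(t)). The propensity a(t) = k(t)*x
# is bounded by a_max = 1.5*k0*x, which stays valid until the next firing.

def rejection_ssa(x0, k0, t_end):
    t, x = 0.0, x0
    while x > 0 and t < t_end:
        a_max = 1.5 * k0 * x                       # upper bound on a(t)
        t += random.expovariate(a_max)             # candidate firing time
        a = k0 * (1.0 + 0.5 * math.sin(t)) * x     # true propensity there
        if random.random() < a / a_max:            # thinning: accept or reject
            x -= 1
    return x

final = [rejection_ssa(x0=100, k0=0.5, t_end=4.0) for _ in range(200)]
print(sum(final) / len(final))
```

The mean survivor count should track 100*exp(-∫k(t)dt), so the time-varying rate is honored without ever integrating the propensity inside the sampler, which is the practical appeal of the rejection-based method.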
NASA Astrophysics Data System (ADS)
Asaithambi, Sasikumar; Rajappa, Muthaiah
2018-05-01
In this paper, an automatic design method based on a swarm intelligence approach for CMOS analog integrated circuit (IC) design is presented. The hybrid meta-heuristic optimization technique, namely, the salp swarm algorithm (SSA), is applied to the optimal sizing of a CMOS differential amplifier and a comparator circuit. SSA is a nature-inspired optimization algorithm which mimics the navigating and hunting behavior of salps. The hybrid SSA is applied to optimize the circuit design parameters and to minimize the MOS transistor sizes. The proposed swarm intelligence approach was successfully implemented for automatic design and optimization of CMOS analog ICs using Generic Process Design Kit (GPDK) 180 nm technology. The circuit design parameters and design specifications are validated through a Simulation Program with Integrated Circuit Emphasis (SPICE) simulator. To investigate the efficiency of the proposed approach, comparisons have been carried out with other simulation-based circuit design methods. The performance of hybrid-SSA-based CMOS analog IC designs is better than that of previously reported studies.
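The salp swarm update rules (leaders explore around the best-known "food source" with a decaying coefficient; followers average with their predecessor) can be sketched on a stand-in objective. The sphere function below replaces the actual circuit sizing cost, and the population size, iteration count, and bounds are all assumptions.

```python
import math, random

random.seed(0)

def sphere(x):
    # Stand-in for a circuit cost; variables would be transistor W/L values.
    return sum(v * v for v in x)

def salp_swarm(obj, dim, lb, ub, n_salps=30, n_iter=200):
    pop = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(n_salps)]
    food = min(pop, key=obj)[:]                         # best solution so far
    for it in range(1, n_iter + 1):
        c1 = 2.0 * math.exp(-((4.0 * it / n_iter) ** 2))  # exploration decays
        for i in range(n_salps):
            if i < n_salps // 2:                        # leader update
                for j in range(dim):
                    step = c1 * ((ub - lb) * random.random() + lb)
                    pop[i][j] = food[j] + step if random.random() < 0.5 else food[j] - step
            else:                                       # follower: chain average
                pop[i] = [(a + b) / 2.0 for a, b in zip(pop[i], pop[i - 1])]
            pop[i] = [min(ub, max(lb, v)) for v in pop[i]]
            if obj(pop[i]) < obj(food):
                food = pop[i][:]
    return food, obj(food)

best, cost = salp_swarm(sphere, dim=4, lb=-5.0, ub=5.0)
print(cost)
```

Because c1 shrinks over iterations, the swarm shifts from global exploration to local refinement around the food source, mirroring how the sizing search narrows onto a candidate design.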
NASA Technical Reports Server (NTRS)
Pahr, D. H.; Arnold, S. M.
2001-01-01
The paper begins with a short overview of recent work in the field of discontinuously reinforced composites, focusing on the different parameters which influence the material behavior of such composites, as well as the various analysis approaches undertaken. Based on this overview it became evident that, in order to investigate the enumerated effects in an efficient and comprehensive manner, an alternative to the computationally intensive finite-element-based micromechanics approach is required. Therefore, an investigation is conducted to demonstrate the utility of the generalized method of cells (GMC), a semi-analytical micromechanics-based approach, for simulating the elastic and elastoplastic material behavior of aligned short-fiber composites. The results are compared with (1) simulations using other micromechanics-based mean-field models and finite element (FE) unit-cell models found in the literature for elastic material behavior, as well as (2) finite element unit-cell and a new semi-analytical elastoplastic shear-lag model in the inelastic range. GMC is shown to have a definite window of applicability when simulating discontinuously reinforced composite material behavior.
A Systems Approach to Scalable Transportation Network Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perumalla, Kalyan S
2006-01-01
Emerging needs in transportation network modeling and simulation are raising new challenges with respect to scalability of network size and vehicular traffic intensity, speed of simulation for simulation-based optimization, and fidelity of vehicular behavior for accurate capture of event phenomena. Parallel execution is warranted to sustain the required detail, size and speed. However, few parallel simulators exist for such applications, partly due to the challenges underlying their development. Moreover, many simulators are based on time-stepped models, which can be computationally inefficient for the purposes of modeling evacuation traffic. Here an approach is presented to designing a simulator with memory and speed efficiency as the goals from the outset, and, specifically, scalability via parallel execution. The design makes use of discrete event modeling techniques as well as parallel simulation methods. Our simulator, called SCATTER, is being developed, incorporating such design considerations. Preliminary performance results are presented on benchmark road networks, showing scalability to one million vehicles simulated on one processor.
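The efficiency argument for discrete-event over time-stepped traffic models can be made concrete with a minimal sketch. This is not the SCATTER design itself; the two-link road and all travel times are invented.

```python
import heapq

# Minimal discrete-event traffic sketch: instead of advancing every vehicle
# at every time step, only "vehicle reaches end of link" events are scheduled,
# so work scales with the number of events rather than with simulated time.

def simulate(vehicles, link_times):
    # vehicles: departure time per vehicle; link_times: fixed travel time per link.
    events = [(depart, vid, 0) for vid, depart in enumerate(vehicles)]  # (time, vehicle, link index)
    heapq.heapify(events)
    arrivals = {}
    processed = 0
    while events:
        t, vid, link = heapq.heappop(events)
        processed += 1
        if link == len(link_times):
            arrivals[vid] = t                                   # vehicle exits the network
        else:
            heapq.heappush(events, (t + link_times[link], vid, link + 1))
    return arrivals, processed

arrivals, n_events = simulate(vehicles=[0.0, 1.0, 2.5], link_times=[30.0, 45.0])
print(arrivals, n_events)
```

Each vehicle generates exactly one event per link plus one exit event here, whereas a time-stepped model at 1 s resolution would touch each vehicle roughly 75 times over the same journey.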
Alfred, Michael; Chung, Christopher A
2012-12-01
This paper describes a second generation Simulator for Engineering Ethics Education. Details describing the first generation activities of this overall effort are published in Chung and Alfred (Sci Eng Ethics 15:189-199, 2009). The second generation research effort represents a major development in the interactive simulator educational approach. As with the first generation effort, the simulator places students in first-person perspective scenarios involving different types of ethical situations. Students must still gather data, assess the situation, and make decisions. The approach still requires students to develop their own ability to identify and respond to ethical engineering situations. However, whereas the generation one effort involved the use of a dogmatic model based on the National Society of Professional Engineers' Code of Ethics, the new generation two model is based on a mathematical model of the actual experiences of engineers involved in ethical situations. This approach also allows the use of feedback in the form of decision effectiveness and professional career impact. Statistical comparisons indicate a 59 percent increase in overall knowledge and a 19 percent improvement in teaching effectiveness over an Internet Engineering Ethics resource based approach.
Fusion of GEDI, ICESAT2 & NISAR data for above ground biomass mapping in Sonoma County, California
NASA Astrophysics Data System (ADS)
Duncanson, L.; Simard, M.; Thomas, N. M.; Neuenschwander, A. L.; Hancock, S.; Armston, J.; Dubayah, R.; Hofton, M. A.; Huang, W.; Tang, H.; Marselis, S.; Fatoyinbo, T.
2017-12-01
Several upcoming NASA missions will collect data sensitive to forest structure (GEDI, ICESat-2 and NISAR). The LiDAR and SAR data collected by these missions will be used in coming years to map forest aboveground biomass at various resolutions. This research focuses on developing and testing multi-sensor data fusion approaches in advance of these missions. Here, we present the first case study of a CMS-16 grant with results from Sonoma County, California. We simulate lidar and SAR datasets from GEDI, ICESat-2 and NISAR using airborne discrete-return lidar and UAVSAR data, respectively. GEDI and ICESat-2 signals are simulated from high-point-density discrete-return lidar that was acquired over the entire county in 2014 through a previous CMS project (Dubayah & Hurtt, CMS-13). NISAR is simulated from L-band UAVSAR data collected in 2014. These simulations are empirically related to 300 field plots of aboveground biomass as well as a 30 m biomass map produced from the 2014 airborne lidar data. We model biomass independently for each simulated mission dataset and then test two fusion methods for county-wide mapping: 1) a pixel-based approach and 2) an object-oriented approach. In the pixel-based approach, GEDI and ICESat-2 biomass models are calibrated over field plots and applied in orbital simulations for a 2-year period of the GEDI and ICESat-2 missions. These simulated samples are then used to calibrate UAVSAR data to produce a 0.25 ha map. In the object-oriented approach, the GEDI and ICESat-2 data are identical to the pixel-based approach, but calibrate image objects of similar L-band backscatter rather than uniform pixels. The results of this research demonstrate the estimated ability of each of these three missions to independently map biomass in a temperate, high-biomass system, as well as the potential improvement expected from combining mission datasets.
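The pixel-based calibration step can be sketched with entirely synthetic numbers standing in for the SAR backscatter and the sparse lidar-derived biomass samples; the linear biomass-backscatter relation, noise level, and 5% sampling fraction below are all assumptions.

```python
import random

random.seed(42)

# Pixel-based fusion sketch: sparse lidar "footprint" samples calibrate a
# linear biomass model against SAR backscatter, which is then applied
# wall-to-wall to every pixel for a continuous map.

n_pixels = 1000
true_agb = [random.uniform(20.0, 300.0) for _ in range(n_pixels)]        # Mg/ha (synthetic)
backscatter = [-15.0 + 0.02 * b + random.gauss(0.0, 0.5) for b in true_agb]  # dB (synthetic)

sample = random.sample(range(n_pixels), 50)   # lidar samples only 5% of pixels
xs = [backscatter[i] for i in sample]
ys = [true_agb[i] for i in sample]

# Ordinary least squares fit agb = a * backscatter + b on the sampled pixels.
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - a * mx

predicted = [a * x + b for x in backscatter]
rmse = (sum((p - t) ** 2 for p, t in zip(predicted, true_agb)) / n_pixels) ** 0.5
print(a, rmse)
```

The map error is dominated by the SAR noise amplified through the fitted slope, which is why footprint sampling density matters less than the strength of the backscatter-biomass relationship in this kind of fusion.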
Modeling languages for biochemical network simulation: reaction vs equation based approaches.
Wiechert, Wolfgang; Noack, Stephan; Elsheikh, Atya
2010-01-01
Biochemical network modeling and simulation is an essential task in any systems biology project. The systems biology markup language (SBML) was established as a standardized model exchange language for mechanistic models. A specific strength of SBML is that numerous tools for formulating, processing, simulation and analysis of models are freely available. Interestingly, in the field of multidisciplinary simulation, the problem of model exchange between different simulation tools occurred much earlier. Several general modeling languages like Modelica have been developed in the 1990s. Modelica enables an equation based modular specification of arbitrary hierarchical differential algebraic equation models. Moreover, libraries for special application domains can be rapidly developed. This contribution compares the reaction based approach of SBML with the equation based approach of Modelica and explains the specific strengths of both tools. Several biological examples illustrating essential SBML and Modelica concepts are given. The chosen criteria for tool comparison are flexibility for constraint specification, different modeling flavors, hierarchical, modular and multidisciplinary modeling. Additionally, support for spatially distributed systems, event handling and network analysis features is discussed. As a major result it is shown that the choice of the modeling tool has a strong impact on the expressivity of the specified models but also strongly depends on the requirements of the application context.
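To make concrete the distinction the abstract above draws between reaction-based (SBML-style) and equation-based (Modelica-style) modeling, a reaction list under mass-action kinetics can be mechanically translated into an ODE right-hand side. The two-reaction network and rate constants below are a hypothetical illustration, not taken from the cited work:

```python
# Minimal sketch: translating a reaction-based specification (SBML-style)
# into an equation-based ODE right-hand side (Modelica-style).
# Hypothetical network: A -> B -> C with mass-action kinetics.

def make_ode_rhs(reactions, species):
    """Build dx/dt from a list of (rate_constant, reactants, products)."""
    def rhs(x):
        conc = dict(zip(species, x))
        dxdt = {s: 0.0 for s in species}
        for k, reactants, products in reactions:
            # Mass-action rate: k times the product of reactant concentrations.
            rate = k
            for r in reactants:
                rate *= conc[r]
            for r in reactants:
                dxdt[r] -= rate
            for p in products:
                dxdt[p] += rate
        return [dxdt[s] for s in species]
    return rhs

species = ["A", "B", "C"]
reactions = [
    (0.5, ["A"], ["B"]),   # A -> B with k = 0.5
    (0.2, ["B"], ["C"]),   # B -> C with k = 0.2
]
rhs = make_ode_rhs(reactions, species)
print(rhs([1.0, 0.0, 0.0]))  # -> [-0.5, 0.5, 0.0]
```

The resulting `rhs` could be handed to any ODE integrator; the reaction list, by contrast, is what an SBML document encodes directly.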
Accuracy of buffered-force QM/MM simulations of silica
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peguiron, Anke; Moras, Gianpietro; Colombi Ciacchi, Lucio
2015-02-14
We report comparisons between energy-based quantum mechanics/molecular mechanics (QM/MM) and buffered force-based QM/MM simulations in silica. Local quantities—such as density of states, charges, forces, and geometries—calculated with both QM/MM approaches are compared to the results of full QM simulations. We find the length scale over which forces computed using a finite QM region converge to reference values obtained in full quantum-mechanical calculations is ∼10 Å rather than the ∼5 Å previously reported for covalent materials such as silicon. Electrostatic embedding of the QM region in the surrounding classical point charges gives only a minor contribution to the force convergence. While the energy-based approach provides accurate results in geometry optimizations of point defects, we find that the removal of large force errors at the QM/MM boundary provided by the buffered force-based scheme is necessary for accurate constrained geometry optimizations where Si–O bonds are elongated and for finite-temperature molecular dynamics simulations of crack propagation. Moreover, the buffered approach allows for more flexibility, since special-purpose QM/MM coupling terms that link QM and MM atoms are not required and the region that is treated at the QM level can be adaptively redefined during the course of a dynamical simulation.
ERIC Educational Resources Information Center
Zitzelsberger, Hilde; Coffey, Sue; Graham, Leslie; Papaconstantinou, Efrosini; Anyinam, Charles
2017-01-01
Simulation-based learning (SBL) is rapidly becoming one of the most significant teaching-learning-evaluation strategies available in undergraduate nursing education. While there is indication within the literature and anecdotally about the benefits of simulation, abundant and strong evidence that supports the effectiveness of simulation for…
Ahmed, Aqeel; Rippmann, Friedrich; Barnickel, Gerhard; Gohlke, Holger
2011-07-25
A three-step approach for multiscale modeling of protein conformational changes is presented that incorporates information about preferred directions of protein motions into a geometric simulation algorithm. The first two steps are based on a rigid cluster normal-mode analysis (RCNMA). Low-frequency normal modes are used in the third step (NMSim) to extend the recently introduced idea of constrained geometric simulations of diffusive motions in proteins by biasing backbone motions of the protein, whereas side-chain motions are biased toward favorable rotamer states. The generated structures are iteratively corrected regarding steric clashes and stereochemical constraint violations. The approach allows performing three simulation types: unbiased exploration of conformational space; pathway generation by a targeted simulation; and radius of gyration-guided simulation. When applied to a data set of proteins with experimentally observed conformational changes, conformational variabilities are reproduced very well for 4 out of 5 proteins that show domain motions, with correlation coefficients r > 0.70 and as high as r = 0.92 in the case of adenylate kinase. In 7 out of 8 cases, NMSim simulations starting from unbound structures are able to sample conformations that are similar (root-mean-square deviation = 1.0-3.1 Å) to ligand bound conformations. An NMSim generated pathway of conformational change of adenylate kinase correctly describes the sequence of domain closing. The NMSim approach is a computationally efficient alternative to molecular dynamics simulations for conformational sampling of proteins. The generated conformations and pathways of conformational transitions can serve as input to docking approaches or as starting points for more sophisticated sampling techniques.
4D dose simulation in volumetric arc therapy: Accuracy and affecting parameters.
Sothmann, Thilo; Gauer, Tobias; Werner, René
2017-01-01
Radiotherapy of lung and liver lesions has changed from normofractioned 3D-CRT to stereotactic treatment in a single or few fractions, often employing volumetric arc therapy (VMAT)-based techniques. Potential unintended interference of respiratory target motion and dynamically changing beam parameters during VMAT dose delivery motivates establishing 4D quality assurance (4D QA) procedures to assess appropriateness of generated VMAT treatment plans when taking into account patient-specific motion characteristics. Current approaches are motion phantom-based 4D QA and image-based 4D VMAT dose simulation. Whereas phantom-based 4D QA is usually restricted to a small number of measurements, the computational approaches allow simulating many motion scenarios. However, 4D VMAT dose simulation depends on various input parameters, influencing estimated doses along with mitigating simulation reliability. Thus, aiming at routine use of simulation-based 4D VMAT QA, the impact of such parameters as well as the overall accuracy of the 4D VMAT dose simulation has to be studied in detail-which is the topic of the present work. In detail, we introduce the principles of 4D VMAT dose simulation, identify influencing parameters and assess their impact on 4D dose simulation accuracy by comparison of simulated motion-affected dose distributions to corresponding dosimetric motion phantom measurements. Exploiting an ITV-based treatment planning approach, VMAT treatment plans were generated for a motion phantom and different motion scenarios (sinusoidal motion of different period/direction; regular/irregular motion). 4D VMAT dose simulation results and dose measurements were compared by local 3% / 3 mm γ-evaluation, with the measured dose distributions serving as ground truth. 
Overall γ-passing rates of simulations and dynamic measurements ranged from 97% to 100% (mean across all motion scenarios: 98% ± 1%); corresponding values for comparison of different day repeat measurements were between 98% and 100%. Parameters of major influence on 4D VMAT dose simulation accuracy were the degree of temporal discretization of the dose delivery process (the higher, the better) and correct alignment of the assumed breathing phases at the beginning of the dose measurements and simulations. Given the high γ-passing rates between simulated motion-affected doses and dynamic measurements, we consider the simulations to provide a reliable basis for assessment of VMAT motion effects that, in the sense of 4D QA of VMAT treatment plans, makes it possible to verify target coverage in hypofractioned VMAT-based radiotherapy of moving targets. Remaining differences between measurements and simulations motivate, however, further detailed studies.
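The local 3% / 3 mm γ-evaluation quoted above can be sketched in a simplified one-dimensional form. Real γ analysis operates on 3D dose grids with sub-voxel interpolation; the toy version below only illustrates the combined dose-difference / distance-to-agreement metric, with tolerance values mirroring those in the abstract:

```python
import math

# Simplified 1D sketch of a local 3% / 3 mm gamma evaluation: for each
# measured point, search reference points for the minimum combined
# dose-difference / distance-to-agreement metric. Illustrative only.

def gamma_index(ref, meas, positions, dose_tol=0.03, dist_tol=3.0):
    """Return per-point gamma values (local dose normalization)."""
    gammas = []
    for x_m, d_m in zip(positions, meas):
        best = float("inf")
        for x_r, d_r in zip(positions, ref):
            if d_r == 0.0:
                continue
            dose_term = ((d_m - d_r) / (dose_tol * d_r)) ** 2
            dist_term = ((x_m - x_r) / dist_tol) ** 2
            best = min(best, math.sqrt(dose_term + dist_term))
        gammas.append(best)
    return gammas

def passing_rate(gammas):
    """Fraction of points with gamma <= 1 (the quoted passing rate)."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)

positions = [0.0, 1.0, 2.0]          # mm, hypothetical measurement grid
ref = [1.0, 2.0, 1.0]                # reference (simulated) dose, Gy
meas = [1.0, 2.02, 1.0]              # measured dose with a 1% deviation
print(passing_rate(gamma_index(ref, meas, positions)))  # -> 1.0
```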
Cárdenas-García, Maura; González-Pérez, Pedro Pablo
2013-03-01
Apoptotic cell death plays a crucial role in development and homeostasis. This process is driven by mitochondrial permeabilization and activation of caspases. In this paper we adopt a tuple spaces-based modelling and simulation approach, and show how it can be applied to the simulation of this intracellular signalling pathway. Specifically, we are working to explore and to understand the complex interaction patterns of the apoptotic caspases and the role of the mitochondria. As a first approximation, using the tuple spaces-based in silico approach, we model and simulate both the extrinsic and intrinsic apoptotic signalling pathways and the interactions between them. During apoptosis, mitochondrial proteins released from the mitochondria to the cytosol are decisively involved in the process. If the decision is to die, there is normally no return from this point; cancer cells, however, offer resistance to the mitochondrial induction.
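The tuple-space coordination model named in the abstract above can be sketched minimally: molecular species live as tuples in a shared space, and a reaction rule fires by taking its reactant tuples and putting product tuples. The species names and the simplified caspase-9 activation rule below are illustrative, not the cited model:

```python
# Minimal sketch of a tuple-space coordination model for a signalling
# step. Species are tuples (name, compartment); None acts as a wildcard
# in take() patterns. The rule shown is a simplified, hypothetical
# version of cytochrome-c-mediated caspase-9 activation.

class TupleSpace:
    def __init__(self):
        self.tuples = []

    def put(self, t):
        self.tuples.append(t)

    def take(self, pattern):
        """Remove and return the first tuple matching the pattern."""
        for t in list(self.tuples):
            if len(t) == len(pattern) and all(
                p is None or p == v for p, v in zip(pattern, t)
            ):
                self.tuples.remove(t)
                return t
        return None

space = TupleSpace()
space.put(("cytochrome_c", "cytosol"))
space.put(("procaspase9", "cytosol"))

# Rule: released cytochrome c activates caspase-9 (highly simplified).
if space.take(("cytochrome_c", None)) and space.take(("procaspase9", None)):
    space.put(("caspase9_active", "cytosol"))

print(space.tuples)  # -> [('caspase9_active', 'cytosol')]
```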
Evaluation of a grid based molecular dynamics approach for polypeptide simulations.
Merelli, Ivan; Morra, Giulia; Milanesi, Luciano
2007-09-01
Molecular dynamics is very important for biomedical research because it makes possible the simulation of the behavior of a biological macromolecule in silico. However, molecular dynamics is computationally rather expensive: the simulation of some nanoseconds of dynamics for a large macromolecule such as a protein takes a very long time, due to the high number of operations needed to solve Newton's equations for a system of thousands of atoms. In order to obtain biologically significant data, it is desirable to use high-performance computation resources to perform these simulations. Recently, a distributed computing approach based on replacing a single long simulation with many independent short trajectories has been introduced, which in many cases provides valuable results. This study concerns the development of an infrastructure to run molecular dynamics simulations on a grid platform in a distributed way. The implemented software allows the parallel submission of different simulations that are individually short but together bring important biological information. Moreover, each simulation is divided into a chain of jobs to avoid data loss in case of system failure and to contain the size of each data transfer from the grid. The results confirm that the distributed approach on grid computing is particularly suitable for molecular dynamics simulations thanks to its high scalability.
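The job-chaining scheme described above, where a long trajectory is split into short restartable segments so a failure loses at most one segment, can be sketched as follows. The `propagate` function is a stand-in for a real MD engine call, and all step counts are hypothetical:

```python
# Illustrative sketch of chaining an MD trajectory into short grid jobs.
# Each segment restarts from the previous checkpoint, bounding both the
# data lost on failure and the size of each transfer from the grid.

def propagate(state, n_steps):
    # Placeholder dynamics: a real implementation would invoke the MD
    # engine here and return its restart file / final coordinates.
    return state + n_steps

def run_chained(total_steps, segment_steps, state=0):
    checkpoints = [state]
    done = 0
    while done < total_steps:
        n = min(segment_steps, total_steps - done)
        state = propagate(state, n)   # one short grid job
        checkpoints.append(state)     # persist restart data after each job
        done += n
    return state, checkpoints

final, ckpts = run_chained(total_steps=10, segment_steps=4)
print(final, ckpts)  # -> 10 [0, 4, 8, 10]
```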
Manned remote work station development article
NASA Technical Reports Server (NTRS)
1978-01-01
The two prime objectives of the Manned Remote Work Station (MRWS) Development Article Study are, first, to evaluate the MRWS flight article roles and associated design concepts for fundamental requirements and embody key technology developments into a simulation program; and, second, to provide detailed manufacturing drawings and schedules for a simulator development test article. An approach is outlined which establishes flight article requirements based on past studies of the Solar Power Satellite, orbital construction support equipment, construction bases and near-term shuttle operations. Simulation objectives are established for those technology issues that can best be addressed on a simulator. Concepts for full-scale and sub-scale simulators are then studied to establish an overall approach to studying MRWS requirements. Emphasis then shifts to design and specification of a full-scale development test article.
Ishikawa, Sohta A; Inagaki, Yuji; Hashimoto, Tetsuo
2012-01-01
In phylogenetic analyses of nucleotide sequences, 'homogeneous' substitution models, which assume the stationarity of base composition across a tree, are widely used, although individual sequences may bear distinctive base frequencies. In the worst-case scenario, a homogeneous model-based analysis can yield an artifactual union of two distantly related sequences that achieved similar base frequencies in parallel. Such potential difficulty can be countered by two approaches, 'RY-coding' and 'non-homogeneous' models. The former approach converts the four bases into purine and pyrimidine to normalize base frequencies across a tree, while the heterogeneity in base frequency is explicitly incorporated in the latter approach. The two approaches have been applied to real-world sequence data; however, their basic properties have not been fully examined in simulation studies. Here, we assessed the performances of maximum-likelihood analyses incorporating RY-coding and a non-homogeneous model (RY-coding and non-homogeneous analyses) on simulated data with parallel convergence to similar base composition. Both RY-coding and non-homogeneous analyses showed superior performances compared with homogeneous model-based analyses. Curiously, the performance of the RY-coding analysis appeared to be significantly affected by the setting of the substitution process used for sequence simulation, relative to that of the non-homogeneous analysis. The performance of a non-homogeneous analysis was also validated by analyzing a real-world sequence data set with significant base heterogeneity.
Micromotors to capture and destroy anthrax simulant spores.
Orozco, Jahir; Pan, Guoqing; Sattayasamitsathit, Sirilak; Galarnyk, Michael; Wang, Joseph
2015-03-07
Towards addressing the need for detecting and eliminating biothreats, we describe a micromotor-based approach for screening, capturing, isolating and destroying anthrax simulant spores in a simple and rapid manner with minimal sample processing. The B. globigii antibody-functionalized micromotors can recognize, capture and transport B. globigii spores in environmental matrices, while showing no interactions with an excess of non-target bacteria. Efficient destruction of the anthrax simulant spores is demonstrated via the micromotor-induced mixing of a mild oxidizing solution. The new micromotor-based approach paves a way to dynamic multifunctional systems that rapidly recognize, isolate, capture and destroy biological threats.
SimZones: An Organizational Innovation for Simulation Programs and Centers.
Roussin, Christopher J; Weinstock, Peter
2017-08-01
The complexity and volume of simulation-based learning programs have increased dramatically over the last decade, presenting several major challenges for those who lead and manage simulation programs and centers. The authors present five major issues affecting the organization of simulation programs: (1) supporting both single- and double-loop learning experiences; (2) managing the training of simulation teaching faculty; (3) optimizing the participant mix, including individuals, professional groups, teams, and other role-players, to ensure learning; (4) balancing in situ, node-based, and center-based simulation delivery; and (5) organizing simulation research and measuring value. They then introduce the SimZones innovation, a system of organization for simulation-based learning, and explain how it can alleviate the problems associated with these five issues.Simulations are divided into four zones (Zones 0-3). Zone 0 simulations include autofeedback exercises typically practiced by solitary learners, often using virtual simulation technology. Zone 1 simulations include hands-on instruction of foundational clinical skills. Zone 2 simulations include acute situational instruction, such as clinical mock codes. Zone 3 simulations involve authentic, native teams of participants and facilitate team and system development.The authors also discuss the translation of debriefing methods from Zone 3 simulations to real patient care settings (Zone 4), and they illustrate how the SimZones approach can enable the development of longitudinal learning systems in both teaching and nonteaching hospitals. The SimZones approach was initially developed in the context of the Boston Children's Hospital Simulator Program, which the authors use to illustrate this innovation in action.
An ultra-low-cost moving-base driving simulator
DOT National Transportation Integrated Search
2001-11-04
A novel approach to driving simulation is described, one that potentially overcomes the limitations of both motion fidelity and cost. It has become feasible only because of recent advances in computer-based image generation speed and fidelity and in ...
NASA Astrophysics Data System (ADS)
Zhang, Rong-Hua; Tao, Ling-Jiang; Gao, Chuan
2017-09-01
Large uncertainties exist in real-time predictions of the 2015 El Niño event, which have systematic intensity biases that are strongly model-dependent. It is critically important to characterize those model biases so they can be reduced appropriately. In this study, the conditional nonlinear optimal perturbation (CNOP)-based approach was applied to an intermediate coupled model (ICM) equipped with a four-dimensional variational data assimilation technique. The CNOP-based approach was used to quantify prediction errors that can be attributed to initial conditions (ICs) and model parameters (MPs). Two key MPs were considered in the ICM: one represents the intensity of the thermocline effect, and the other represents the relative coupling intensity between the ocean and atmosphere. Two experiments were performed to illustrate the effects of error corrections, one with a standard simulation and another with an optimized simulation in which errors in the ICs and MPs derived from the CNOP-based approach were optimally corrected. The results indicate that simulations of the 2015 El Niño event can be effectively improved by using CNOP-derived error corrections. In particular, the El Niño intensity in late 2015 was adequately captured when simulations were started from early 2015. Quantitatively, the Niño3.4 SST index simulated in Dec. 2015 increased to 2.8 °C in the optimized simulation, compared with only 1.5 °C in the standard simulation. The feasibility and effectiveness of using the CNOP-based technique to improve ENSO simulations are demonstrated in the context of the 2015 El Niño event. The limitations and further applications are also discussed.
Flight simulator fidelity assessment in a rotorcraft lateral translation maneuver
NASA Technical Reports Server (NTRS)
Hess, R. A.; Malsbury, T.; Atencio, A., Jr.
1992-01-01
A model-based methodology for assessing flight simulator fidelity in closed-loop fashion is exercised in analyzing a rotorcraft low-altitude maneuver for which flight test and simulation results were available. The addition of a handling qualities sensitivity function to a previously developed model-based assessment criteria allows an analytical comparison of both performance and handling qualities between simulation and flight test. Model predictions regarding the existence of simulator fidelity problems are corroborated by experiment. The modeling approach is used to assess analytically the effects of modifying simulator characteristics on simulator fidelity.
ERIC Educational Resources Information Center
Chadli, Abdelhafid; Bendella, Fatima; Tranvouez, Erwan
2015-01-01
In this paper we present an Agent-based evaluation approach in a context of Multi-agent simulation learning systems. Our evaluation model is based on a two stage assessment approach: (1) a Distributed skill evaluation combining agents and fuzzy sets theory; and (2) a Negotiation based evaluation of students' performance during a training…
Enhancing Student Engagement through Simulation in Programming Sessions
ERIC Educational Resources Information Center
Isiaq, Sakirulai Olufemi; Jamil, Md Golam
2018-01-01
Purpose: The purpose of this paper is to explore the use of a simulator for teaching programming to foster student engagement and meaningful learning. Design/methodology/approach: An exploratory mixed-method research approach was adopted in a classroom-based environment at a UK university. A rich account of student engagement dimensions…
A novel approach for connecting temporal-ontologies with blood flow simulations.
Weichert, F; Mertens, C; Walczak, L; Kern-Isberner, G; Wagner, M
2013-06-01
In this paper an approach for developing a temporal domain ontology for biomedical simulations is introduced. The ideas are presented in the context of simulations of blood flow in aneurysms using the Lattice Boltzmann Method. The advantages of using ontologies are manifold: On the one hand, ontologies have been proven able to provide specialized medical knowledge, e.g., key parameters for simulations. On the other hand, based on a set of rules and the usage of a reasoner, a system for checking the plausibility as well as tracking the outcome of medical simulations can be constructed. Likewise, results of simulations, including data derived from them, can be stored and communicated in a way that can be understood by computers. Later on, this set of results can be analyzed. At the same time, the ontologies provide a way to exchange knowledge between researchers. Lastly, this approach can also be seen as a black-box abstraction of the internals of the simulation for the biomedical researcher. The approach is able to provide the complete parameter sets for simulations, part of the corresponding results and part of their analysis, as well as, e.g., geometry and boundary conditions. These inputs can be transferred to different simulation methods for comparison. Variations on the provided parameters can be automatically used to drive these simulations. Using a rule base, unphysical inputs or outputs of the simulation can be detected and communicated to the physician in a suitable and familiar way. An example for an instantiation of the blood flow simulation ontology and exemplary rules for plausibility checking are given.
A methodological, task-based approach to Procedure-Specific Simulations training.
Setty, Yaki; Salzman, Oren
2016-12-01
Procedure-Specific Simulations (PSS) are realistic 3D simulations that provide a platform to practice complete surgical procedures in a virtual-reality environment. While PSS have the potential to improve surgeons' proficiency, there are no existing standards or guidelines for PSS development in a structured manner. We employ a unique platform inspired by game design to develop three-dimensional virtual-reality simulations of urethrovesical anastomosis during radical prostatectomy. 3D visualization is supported by stereo vision, providing a fully realistic view of the simulation. The software can be executed on any robotic surgery platform. Specifically, we tested the simulation under a Windows environment on the RobotiX Mentor. Using the urethrovesical anastomosis during radical prostatectomy simulation as a representative example, we present a task-based methodological approach to PSS training. The methodology provides tasks in increasing levels of difficulty, from a novice level of basic anatomy identification to an expert level that permits testing new surgical approaches. The modular methodology presented here can be easily extended to support more complex tasks. We foresee this methodology as a tool used to integrate PSS as a complementary training process for surgical procedures.
Integration of visual and motion cues for simulator requirements and ride quality investigation
NASA Technical Reports Server (NTRS)
Young, L. R.
1976-01-01
Practical tools which can extend the state of the art of moving-base flight simulation for research and training are developed. Main approaches in this research effort include: (1) application of the vestibular model for perception of orientation based on motion cues and optimum simulator motion controls; and (2) visual cues in landing.
Tennant, Marc; Kruger, Estie
2013-02-01
This study developed a Monte Carlo simulation approach to examining the prevalence and incidence of dental decay, using Australian children as a test environment. Monte Carlo simulation has been used for half a century in particle physics (and elsewhere); put simply, population-level outcome probabilities are randomly seeded to drive the production of individual-level data. A total of five runs of the simulation model for all 275,000 12-year-olds in Australia were completed based on 2005-2006 data. Measured on average decayed/missing/filled teeth (DMFT) and the DMFT of the highest 10% of the sample (SiC10), the runs did not differ from each other by more than 2%, and the outcome was within 5% of the reported sampled population data. The simulations rested on the population probabilities that are known to be strongly linked to dental decay, namely socio-economic status and Indigenous heritage. Testing the simulated population found that the DMFT of all cases where DMFT<>0 was 2.3 (n = 128,609) and the DMFT for Indigenous cases only was 1.9 (n = 13,749). In the simulation population the SiC25 was 3.3 (n = 68,750). Monte Carlo simulations were created in particle physics as a computational mathematical approach to unknown individual-level effects by resting a simulation on known population-level probabilities. In this study a Monte Carlo simulation approach to childhood dental decay was built, tested and validated. © 2013 FDI World Dental Federation.
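The Monte Carlo idea described above, known population-level probabilities seeding random draws that generate individual-level data, can be sketched as follows. The group sizes and per-tooth decay risks below are purely hypothetical stand-ins, not the study's parameters:

```python
import random

# Hedged sketch: population-level probabilities (hypothetical per-tooth
# decay risks by group, e.g. socio-economic stratum) seed random draws
# that produce individual-level DMFT counts over 28 permanent teeth.

def simulate_population(groups, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    dmft = []
    for n_children, p_decay_per_tooth in groups:
        for _ in range(n_children):
            # Each of 28 teeth decays independently with the group risk.
            dmft.append(sum(rng.random() < p_decay_per_tooth
                            for _ in range(28)))
    return dmft

# Hypothetical strata: (number of children, per-tooth risk).
groups = [(1000, 0.02), (200, 0.08)]
population = simulate_population(groups)
mean_dmft = sum(population) / len(population)
print(len(population))  # -> 1200
```

Population summaries such as the mean DMFT or the mean of the worst decile (a SiC-style index) can then be read off the simulated individuals.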
Models and Methods for Adaptive Management of Individual and Team-Based Training Using a Simulator
NASA Astrophysics Data System (ADS)
Lisitsyna, L. S.; Smetyuh, N. P.; Golikov, S. P.
2017-05-01
Analysis of research on adaptive individual and team-based training shows that, both in Russia and abroad, individual and team-based training and retraining of AASTM operators usually includes: production training, training of general computer and office equipment skills, and simulator training, including virtual simulators which use computers to simulate real-world manufacturing situations; as a rule, the evaluation of AASTM operators' knowledge is determined by the completeness and adequacy of their actions under the simulated conditions. Such an approach to training and retraining of AASTM operators provides only for technical training of operators and testing their knowledge by assessing their actions in a simulated environment.
NASA Astrophysics Data System (ADS)
Dörr, Dominik; Faisst, Markus; Joppich, Tobias; Poppe, Christian; Henning, Frank; Kärger, Luise
2018-05-01
Finite Element (FE) forming simulation offers the possibility of a detailed analysis of thermoforming processes by means of constitutive modelling of intra- and inter-ply deformation mechanisms, which makes manufacturing defects predictable. Inter-ply slippage is a deformation mechanism which influences the forming behaviour and which has so far usually been assumed to be isotropic in FE forming simulation. Thus, the relative (fibre) orientation between the slipping plies is neglected in modelling of frictional behaviour. Characterization results, however, reveal a dependency of frictional behaviour on the relative orientation of the slipping plies. In this work, an anisotropic model for inter-ply slippage is presented, which is based on an FE forming simulation approach implemented within several user subroutines of the commercially available FE solver Abaqus. This approach accounts for the relative orientation between the slipping plies in modelling frictional behaviour. For this purpose, the relative orientation of the slipping plies is consecutively evaluated, since it changes during forming due to inter-ply slipping and intra-ply shearing. The presented approach is parametrized based on characterization results with and without relative orientation for a thermoplastic UD-tape (PA6-CF) and applied to forming simulation of a generic geometry. Forming simulation results reveal that accounting for relative fibre orientation influences the predicted results. This influence, however, is small for the considered geometry.
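One simple way to realize the orientation dependence described above is to interpolate the inter-ply friction coefficient between characterized values at parallel (0°) and perpendicular (90°) relative fibre orientation. The interpolation form and both coefficients below are hypothetical illustrations, not the cited model:

```python
import math

# Sketch of an orientation-dependent inter-ply friction coefficient:
# a smooth (cosine) interpolation between a characterized parallel
# value and a perpendicular value as the relative fibre angle varies.
# Both coefficients and the interpolation form are hypothetical.

def interply_friction(theta_deg, mu_parallel=0.25, mu_perpendicular=0.40):
    """Friction coefficient for relative fibre angle theta (degrees)."""
    # Weight is 1 at 0 deg, 0 at 90 deg, and periodic in 180 deg.
    w = 0.5 * (1.0 + math.cos(math.radians(2.0 * theta_deg)))
    return w * mu_parallel + (1.0 - w) * mu_perpendicular

print(round(interply_friction(0.0), 2))   # -> 0.25
print(round(interply_friction(90.0), 2))  # -> 0.4
```

In a forming simulation, the relative angle between slipping plies would be re-evaluated each increment (as the abstract notes, it changes through slipping and intra-ply shearing) and fed into such a coefficient law.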
NASA Technical Reports Server (NTRS)
Wing, David J.; Prevot, Thomas; Murdoch, Jennifer L.; Cabrall, Christopher D.; Homola, Jeffrey R.; Martin, Lynne H.; Mercer, Joey S.; Hoadley, Sherwood T.; Wilson, Sara R.; Hubbs, Clay E.;
2010-01-01
This paper presents an air/ground functional allocation experiment conducted by the National Aeronautics and Space Administration (NASA) using two human-in-the-loop simulations to compare airborne and ground-based approaches to NextGen separation assurance. The approaches under investigation are two trajectory-based four-dimensional (4D) concepts; one referred to as "airborne trajectory management with self-separation" (airborne), the other as "ground-based automated separation assurance" (ground-based). In coordinated simulations at NASA's Ames and Langley Research Centers, the primary operational participants - controllers for the ground-based concept and pilots for the airborne concept - manage the same traffic scenario using the two different 4D concepts. The common scenarios are anchored in traffic problems that require a significant increase in airspace capacity - on average, double, and in some local areas, close to 250% over current day levels - in order to enable aircraft to safely and efficiently traverse the test airspace. The simulations vary common independent variables such as traffic density, sequencing and scheduling constraints, and timing of trajectory change events. A set of common metrics is collected to enable a direct comparison of relevant results. The simulations will be conducted in spring 2010. If accepted, this paper will be the first publication of the experimental approach and early results. An initial comparison of safety and efficiency as well as operator acceptability under the two concepts is expected.
NASA Astrophysics Data System (ADS)
Deng, Shaohui; Wang, Xiaoling; Yu, Jia; Zhang, Yichi; Liu, Zhen; Zhu, Yushan
2018-06-01
Grouting plays a crucial role in dam safety. Due to the concealment of grouting activities, complexity of fracture distribution in rock masses and rheological properties of cement grout, it is difficult to analyze the effects of grouting. In this paper, a computational fluid dynamics (CFD) simulation approach of dam foundation grouting based on a 3D fracture network model is proposed. In this approach, the 3D fracture network model, which is based on an improved bootstrap sampling method and established by VisualGeo software, can provide a reliable and accurate geometric model for CFD simulation of dam foundation grouting. Based on the model, a CFD simulation is performed, in which the Papanastasiou regularized model is used to express the grout rheological properties, and the volume of fluid technique is utilized to capture the grout fronts. Two sets of tests are performed to verify the effectiveness of the Papanastasiou regularized model. When applying the CFD simulation approach for dam foundation grouting, three technical issues can be solved: (1) collapsing potential of the fracture samples, (2) inconsistencies in the geometric model in actual fractures under complex geological conditions, and (3) inappropriate method of characterizing the rheological properties of cement grout. The applicability of the proposed approach is demonstrated by an illustrative case study—a hydropower station dam foundation in southwestern China.
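The Papanastasiou regularization mentioned above replaces the unbounded viscosity of an ideal Bingham fluid at vanishing shear rate with a finite value, which is what makes the grout rheology tractable in a CFD solver. A minimal sketch with illustrative parameters (not the values fitted for the cement grout in the study):

```python
import math

def papanastasiou_viscosity(gamma_dot, mu=0.05, tau_y=10.0, m=100.0):
    """Effective viscosity of a Bingham-type grout under the
    Papanastasiou regularization:

        eta = mu + tau_y * (1 - exp(-m * gamma_dot)) / gamma_dot

    with plastic viscosity mu (Pa s), yield stress tau_y (Pa), and
    regularization parameter m (s). The gamma_dot -> 0 limit is finite
    (mu + tau_y * m), unlike the ideal Bingham model, so the solver
    never divides by zero. Parameter values are illustrative."""
    if gamma_dot < 1e-12:
        return mu + tau_y * m
    return mu + tau_y * (1.0 - math.exp(-m * gamma_dot)) / gamma_dot
```

At high shear rates the expression recovers the Bingham behaviour eta ~ mu + tau_y / gamma_dot, so the regularization only alters the near-zero-shear region where the ideal model is singular.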
Martínez-Fernández, L; Pepino, A J; Segarra-Martí, J; Banyasz, A; Garavelli, M; Improta, R
2016-09-13
The optical spectra of 5-methylcytidine in three different solvents (tetrahydrofuran, acetonitrile, and water) are measured, showing that both the absorption and the emission maxima in water are significantly blue-shifted (0.08 eV). The absorption spectra are simulated based on CAM-B3LYP/TD-DFT calculations, including solvent effects with three different approaches: (i) a hybrid implicit/explicit full quantum mechanical approach, (ii) a mixed QM/MM static approach, and (iii) a QM/MM method exploiting structures issuing from classical molecular dynamics simulations. Ab initio molecular dynamics simulations based on the CAM-B3LYP functional have also been performed. The adopted approaches all reproduce the main features of the experimental spectra, giving insights into the chemical-physical effects responsible for the solvent shifts in the spectra of 5-methylcytidine and providing the basis for discussing advantages and limitations of the adopted solvation models.
Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru
2009-04-27
Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. have proved the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be indispensable parts of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. Our key motivation in this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the use of a model checking approach. A novel method of modeling and simulating biological systems with a model checking approach is proposed, based on hybrid functional Petri net with extension (HFPNe) as a framework dealing with both discrete and continuous events. Firstly, we construct a quantitative VPC fate model with 1761 components by using HFPNe. Secondly, we apply two major biological fate determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully done by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. Such hybrid lineages are hard to capture with a discrete model, because a hybrid lineage occurs when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986.
Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II owing to its higher coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). Further insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and better understandings of biological systems and observation data that may be hard to capture with the qualitative approach.
Virtual Learning. A Revolutionary Approach to Building a Highly Skilled Workforce.
ERIC Educational Resources Information Center
Schank, Roger
This book offers trainers and human resource managers an alternative approach to train people more effectively and capitalize on multimedia-based tools. The approach is based on computer-based training and virtual learning theory. Chapter 1 discusses how to remedy problems caused by bad training. Chapter 2 focuses on simulating work and creating…
NASA Astrophysics Data System (ADS)
Sothmann, T.; Gauer, T.; Wilms, M.; Werner, R.
2017-12-01
The purpose of this study is to introduce a novel approach to incorporate patient-specific breathing variability information into 4D dose simulation of volumetric arc therapy (VMAT)-based stereotactic body radiotherapy (SBRT) of extracranial metastases. Feasibility of the approach is illustrated by application to treatment planning and motion data of lung and liver metastasis patients. The novel 4D dose simulation approach makes use of a regression-based correspondence model that allows representing patient motion variability by breathing signal-steered interpolation and extrapolation of deformable image registration motion fields. To predict the internal patient motion during treatment with only external breathing signal measurements being available, the patients’ internal motion information and external breathing signals acquired during 4D CT imaging were correlated. Combining the correspondence model, patient-specific breathing signal measurements during treatment and time-resolved information about dose delivery, reconstruction of a motion variability-affected dose becomes possible. As a proof of concept, the proposed approach is illustrated by a retrospective 4D simulation of VMAT-based SBRT treatment of ten patients with 15 treated lung and liver metastases and known clinical endpoints for the individual metastases (local metastasis recurrence yes/no). Resulting 4D-simulated dose distributions were compared to motion-affected dose distributions estimated by standard 4D CT-only dose accumulation and the originally (i.e. statically) planned dose distributions by means of GTV D98 indices (dose to 98% of the GTV volume). A potential linkage of metastasis-specific endpoints to differences between GTV D98 indices of planned and 4D-simulated dose distributions was analyzed.
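The core of such a correspondence model is a regression of internal motion (extracted from the 4D CT phases by deformable registration) on the external breathing signal, which can then inter- and extrapolate internal displacements for signal values measured during treatment. A deliberately simplified one-dimensional sketch with synthetic numbers:

```python
import numpy as np

# Illustrative correspondence model: regress an internal motion-field
# displacement (one voxel, one direction, in mm) on the external
# breathing signal sampled at the 4D CT phases. All values synthetic.
signal = np.array([0.0, 0.25, 0.5, 0.75, 1.0])        # external surrogate
displacement = np.array([0.1, 2.6, 5.1, 7.6, 10.1])   # internal motion (mm)

# Least-squares fit of d = a * s + b (design matrix with intercept)
A = np.column_stack([signal, np.ones_like(signal)])
coef, *_ = np.linalg.lstsq(A, displacement, rcond=None)

def predict(s):
    """Inter-/extrapolate the internal displacement for a breathing-signal
    value s, including values outside the range seen during 4D CT."""
    return coef[0] * s + coef[1]
```

The actual model regresses entire deformable registration motion fields rather than a single scalar, and the extrapolation to signal amplitudes beyond the 4D CT range is exactly what lets it represent breathing variability during treatment.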
Hulme, Adam; Thompson, Jason; Nielsen, Rasmus Oestergaard; Read, Gemma J M; Salmon, Paul M
2018-06-18
There have been recent calls for the application of the complex systems approach in sports injury research. However, beyond theoretical description and static models of complexity, little progress has been made towards formalising this approach in a way that is practical for sports injury scientists and clinicians. Therefore, our objective was to use a computational modelling method to develop a dynamic simulation in sports injury research. Agent-based modelling (ABM) was used to model the occurrence of sports injury in a synthetic athlete population. The ABM was developed based on sports injury causal frameworks and was applied in the context of distance running-related injury (RRI). Using the acute:chronic workload ratio (ACWR), we simulated the dynamic relationship between changes in weekly running distance and RRI through the manipulation of various 'athlete management tools'. The findings confirmed that building weekly running distances over time, even within the reported ACWR 'sweet spot', will eventually result in RRI as athletes reach and surpass their individual physical workload limits. Introducing training-related error into the simulation and modelling a 'hard ceiling' dynamic resulted in a higher RRI incidence proportion across the population at higher absolute workloads. The presented simulation offers a practical starting point for applying more sophisticated computational models that can account for the complex nature of sports injury aetiology. Alongside traditional forms of scientific inquiry, the use of ABM and other simulation-based techniques could be considered as a complementary and alternative methodological approach in sports injury research. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
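The ACWR used to steer the simulated workloads is commonly computed as the mean workload of the most recent 7 days divided by the mean of the most recent 28 days; a minimal sketch of that rolling-average variant (illustrative only, not the authors' ABM code):

```python
def acwr(daily_km):
    """Acute:chronic workload ratio from a list of daily running
    distances (most recent day last): mean of the last 7 days divided
    by the mean of the last 28 days. The commonly cited 'sweet spot'
    is roughly 0.8-1.3; requires at least 28 days of history."""
    if len(daily_km) < 28:
        raise ValueError("need at least 28 days of workload history")
    acute = sum(daily_km[-7:]) / 7.0
    chronic = sum(daily_km[-28:]) / 28.0
    return acute / chronic
```

In an agent-based model each synthetic athlete carries its own workload history, and a rule of this kind is evaluated every simulated day to decide whether the athlete's training spike pushes it past its individual injury threshold.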
NASA Astrophysics Data System (ADS)
Persano Adorno, Dominique; Pizzolato, Nicola; Fazio, Claudio
2015-09-01
Within the context of higher education for science or engineering undergraduates, we present an inquiry-driven learning path aimed at developing a more meaningful conceptual understanding of the electron dynamics in semiconductors in the presence of applied electric fields. The electron transport in a nondegenerate n-type indium phosphide bulk semiconductor is modelled using a multivalley Monte Carlo approach. The main characteristics of the electron dynamics are explored under different values of the driving electric field, lattice temperature and impurity density. Simulation results are presented by following a question-driven path of exploration, starting from the validation of the model and moving up to reasoned inquiries about the observed characteristics of electron dynamics. Our inquiry-driven learning path, based on numerical simulations, represents a viable example of how to integrate a traditional lecture-based teaching approach with effective learning strategies, providing science or engineering undergraduates with practical opportunities to enhance their comprehension of the physics governing the electron dynamics in semiconductors. Finally, we present a general discussion about the advantages and disadvantages of using an inquiry-based teaching approach within a learning environment based on semiconductor simulations.
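The free-flight/scattering loop at the heart of an ensemble Monte Carlo transport simulation can be illustrated with a toy single-valley model with a constant, isotropic scattering rate; a multivalley InP model adds energy-dependent inter-valley scattering on top of this same structure. All parameter values below are illustrative:

```python
import random

def mc_drift_velocity(e_field, n_electrons=2000, n_flights=200,
                      gamma=1e13, q_over_m=1.76e11, seed=1):
    """Toy single-valley ensemble Monte Carlo: each electron free-flies
    under the field for an exponentially distributed time (constant
    scattering rate gamma, in 1/s), then scatters isotropically, which
    on average resets its velocity. The drift velocity is the
    time-average of v(t) over all flights and should approach
    q_over_m * e_field / gamma for this toy model. Parameters are
    illustrative, not an InP parametrization."""
    rng = random.Random(seed)
    a = q_over_m * e_field              # acceleration (m/s^2)
    v_time_integral, total_time = 0.0, 0.0
    for _ in range(n_electrons):
        for _ in range(n_flights):
            tau = rng.expovariate(gamma)   # free-flight duration (s)
            # integral of v(t) = a*t over one flight, starting from rest
            v_time_integral += 0.5 * a * tau * tau
            total_time += tau
    return v_time_integral / total_time
```

An inquiry-driven exercise of the kind the paper describes would then vary the field, the scattering rate, or the temperature-dependent parameters and ask students to explain the resulting drift-velocity trends.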
Simulation-Based Bronchoscopy Training
Kennedy, Cassie C.; Maldonado, Fabien
2013-01-01
Background: Simulation-based bronchoscopy training is increasingly used, but effectiveness remains uncertain. We sought to perform a comprehensive synthesis of published work on simulation-based bronchoscopy training. Methods: We searched MEDLINE, EMBASE, CINAHL, PsycINFO, ERIC, Web of Science, and Scopus for eligible articles through May 11, 2011. We included all original studies involving health professionals that evaluated, in comparison with no intervention or an alternative instructional approach, simulation-based training for flexible or rigid bronchoscopy. Study selection and data abstraction were performed independently and in duplicate. We pooled results using random effects meta-analysis. Results: From an initial pool of 10,903 articles, we identified 17 studies evaluating simulation-based bronchoscopy training. In comparison with no intervention, simulation training was associated with large benefits on skills and behaviors (pooled effect size, 1.21 [95% CI, 0.82-1.60]; n = 8 studies) and moderate benefits on time (0.62 [95% CI, 0.12-1.13]; n = 7). In comparison with clinical instruction, behaviors with real patients showed nonsignificant effects favoring simulation for time (0.61 [95% CI, −1.47 to 2.69]) and process (0.33 [95% CI, −1.46 to 2.11]) outcomes (n = 2 studies each), although variation in training time might account for these differences. Four studies compared alternate simulation-based training approaches. Inductive analysis to inform instructional design suggested that longer or more structured training is more effective, authentic clinical context adds value, and animal models and plastic part-task models may be superior to more costly virtual-reality simulators. Conclusions: Simulation-based bronchoscopy training is effective in comparison with no intervention. Comparative effectiveness studies are few. PMID:23370487
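Pooling effect sizes under a random-effects model, as in the meta-analysis above, is commonly done with the DerSimonian-Laird estimator; a compact sketch (an illustrative implementation, not the authors' analysis code):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled effect size (DerSimonian-Laird).
    `effects` are per-study effect sizes (e.g. standardized mean
    differences), `variances` their within-study variances.
    Returns the pooled estimate and its 95% confidence interval."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q heterogeneity statistic and between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with between-study variance added to each study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

The random-effects choice matters here: with heterogeneous training interventions, the between-study variance widens the confidence interval relative to a fixed-effect pooling.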
A motion-constraint logic for moving-base simulators based on variable filter parameters
NASA Technical Reports Server (NTRS)
Miller, G. K., Jr.
1974-01-01
A motion-constraint logic for moving-base simulators has been developed that is a modification to the linear second-order filters generally employed in conventional constraints. In the modified constraint logic, the filter parameters are not constant but vary with the instantaneous motion-base position to increase the constraint as the system approaches the positional limits. With the modified constraint logic, accelerations larger than originally expected are limited while conventional linear filters would result in automatic shutdown of the motion base. In addition, the modified washout logic has frequency-response characteristics that are an improvement over conventional linear filters with braking for low-frequency pilot inputs. During simulated landing approaches of an externally blown flap short take-off and landing (STOL) transport using decoupled longitudinal controls, the pilots were unable to detect much difference between the modified constraint logic and the logic based on linear filters with braking.
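The idea of a filter whose parameters vary with the instantaneous motion-base position can be sketched as a second-order constraint that stiffens as the platform nears its positional limit. The stiffening law and the gains below are illustrative guesses, not the constraint logic of the report:

```python
def washout_step(acc_cmd, x, v, dt=0.01, x_limit=1.0,
                 omega0=0.5, zeta0=0.7):
    """One integration step of a second-order motion constraint whose
    natural frequency increases as the motion base approaches its
    positional limit, pulling the platform back toward neutral harder.
    The quadratic stiffening law and all gains are illustrative
    assumptions. Returns the updated (position, velocity)."""
    frac = min(abs(x) / x_limit, 1.0)
    omega = omega0 * (1.0 + 4.0 * frac ** 2)   # stiffen near the limit
    a = acc_cmd - 2.0 * zeta0 * omega * v - omega ** 2 * x
    v_new = v + a * dt
    x_new = x + v_new * dt
    return x_new, v_new
```

A constant-parameter filter tuned for the same commanded acceleration would either under-constrain (and hit the stops, forcing an automatic shutdown) or over-constrain throughout; varying the parameters with position applies the extra constraint only where it is needed.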
Simulation Models for Socioeconomic Inequalities in Health: A Systematic Review
Speybroeck, Niko; Van Malderen, Carine; Harper, Sam; Müller, Birgit; Devleesschauwer, Brecht
2013-01-01
Background: The emergence and evolution of socioeconomic inequalities in health involves multiple factors interacting with each other at different levels. Simulation models are suitable for studying such complex and dynamic systems and have the ability to test the impact of policy interventions in silico. Objective: To explore how simulation models were used in the field of socioeconomic inequalities in health. Methods: An electronic search of studies assessing socioeconomic inequalities in health using a simulation model was conducted. Characteristics of the simulation models were extracted and distinct simulation approaches were identified. As an illustration, a simple agent-based model of the emergence of socioeconomic differences in alcohol abuse was developed. Results: We found 61 studies published between 1989 and 2013. Ten different simulation approaches were identified. The agent-based model illustration showed that multilevel, reciprocal and indirect effects of social determinants on health can be modeled flexibly. Discussion and Conclusions: Based on the review, we discuss the utility of using simulation models for studying health inequalities, and refer to good modeling practices for developing such models. The review and the simulation model example suggest that the use of simulation models may enhance the understanding and debate about existing and new socioeconomic inequalities of health frameworks. PMID:24192788
The Simplified Aircraft-Based Paired Approach With the ALAS Alerting Algorithm
NASA Technical Reports Server (NTRS)
Perry, Raleigh B.; Madden, Michael M.; Torres-Pomales, Wilfredo; Butler, Ricky W.
2013-01-01
This paper presents the results of an investigation of a proposed concept for closely spaced parallel runways called the Simplified Aircraft-based Paired Approach (SAPA). This procedure depends upon a new alerting algorithm called the Adjacent Landing Alerting System (ALAS). This study used both low fidelity and high fidelity simulations to validate the SAPA procedure and test the performance of the new alerting algorithm. The low fidelity simulation enabled a determination of minimum approach distance for the worst case over millions of scenarios. The high fidelity simulation enabled an accurate determination of timings and minimum approach distance in the presence of realistic trajectories, communication latencies, and total system error for 108 test cases. The SAPA procedure and the ALAS alerting algorithm were applied to the 750-ft parallel spacing (e.g., SFO 28L/28R) approach problem. With the SAPA procedure as defined in this paper, this study concludes that a 750-ft application does not appear to be feasible, but preliminary results for 1000-ft parallel runways look promising.
Jason M. Forthofer; Bret W. Butler; Charles W. McHugh; Mark A. Finney; Larry S. Bradshaw; Richard D. Stratton; Kyle S. Shannon; Natalie S. Wagenbrenner
2014-01-01
The effect of fine-resolution wind simulations on fire growth simulations is explored. The wind models are (1) a wind field consisting of constant speed and direction applied everywhere over the area of interest; (2) a tool based on the solution of the conservation of mass only (termed mass-conserving model) and (3) a tool based on a solution of conservation of mass...
ERIC Educational Resources Information Center
Chang, S. -H.; Chen, M. -L.; Kuo, Y. -K.; Shen, Y. -C.
2011-01-01
In response to the growing industrial demand for light-emitting diode (LED) design professionals, based on industry-university collaboration in Taiwan, this paper develops a novel instructional approach: a simulation-based learning course with peer assessment to develop students' professional skills in LED design as required by industry as well as…
NASA Astrophysics Data System (ADS)
Nunes, João Pedro; Catarina Simões Vieira, Diana; Keizer, Jan Jacob
2017-04-01
Fires impact soil hydrological properties, enhancing soil water repellency and therefore increasing the potential for surface runoff generation and soil erosion. In consequence, the successful application of hydrological models to post-fire conditions requires the appropriate simulation of the effects of soil water repellency on soil hydrology. This work compared three approaches to model soil water repellency impacts on soil hydrology in burnt eucalypt and pine forest slopes in central Portugal: 1) Daily approach, simulating repellency as a function of soil moisture, and influencing the maximum soil available water holding capacity. It is based on the Thornthwaite-Mather soil water modelling approach, and is parameterized with the soil's wilting point and field capacity, and a parameter relating soil water repellency with water holding capacity. It was tested with soil moisture data from burnt and unburnt hillslopes. This approach was able to simulate post-fire soil moisture patterns, which the model without repellency was unable to do. However, model parameters were different between the burnt and unburnt slopes, indicating that more research is needed to derive standardized parameters from commonly measured soil and vegetation properties. 2) Seasonal approach, pre-determining repellency at the seasonal scale (3 months) in four classes (from none to extreme). It is based on the Morgan-Morgan-Finney (MMF) runoff and erosion model, applied at the seasonal scale and is parameterized with a parameter relating repellency class with field capacity. It was tested with runoff and erosion data from several experimental plots, and led to important improvements on runoff prediction over an approach with constant field capacity for all seasons (calibrated for repellency effects), but only slight improvements in erosion predictions. 
In contrast with the daily approach, the parameters could be reproduced between different sites. 3) Constant approach, specifying values for soil water repellency for the three years after the fire and keeping them constant throughout the year. It is based on a daily Curve Number (CN) approach, was incorporated directly in the Soil and Water Assessment Tool (SWAT) model, and was tested with erosion data from a burnt hillslope. This approach was able to successfully reproduce soil erosion. The results indicate that simplified approaches can be used to adapt existing models for post-fire simulation, taking repellency into account. Taking into account the seasonality of repellency seems more important for simulating surface runoff than erosion, possibly because simulating the larger runoff rates correctly is sufficient for erosion simulation. The constant approach can be applied directly in the parameterization of existing runoff and erosion models for soil loss and sediment yield prediction, while the seasonal approach can readily be developed as a next step, with further work needed to assess whether the approach and associated parameters can be applied in multiple post-fire environments.
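The constant approach builds on the standard SCS Curve Number runoff equations, in which post-fire repellency can be represented by a fixed, elevated CN; a minimal sketch in metric units (illustrative, and not claimed to match SWAT's daily CN implementation in detail):

```python
def scs_runoff_mm(precip_mm, cn):
    """Daily surface runoff from the SCS Curve Number method, metric
    form:
        S  = 25400 / CN - 254      (potential retention, mm)
        Ia = 0.2 * S               (initial abstraction, mm)
        Q  = (P - Ia)^2 / (P + 0.8 * S)   for P > Ia, else 0
    A burnt, water-repellent soil is represented simply by raising CN,
    which lowers retention and increases runoff for the same storm."""
    s = 25400.0 / cn - 254.0
    ia = 0.2 * s
    if precip_mm <= ia:
        return 0.0
    return (precip_mm - ia) ** 2 / (precip_mm + 0.8 * s)
```

For example, a 50 mm storm on a moderate soil (CN = 75) yields roughly 9 mm of runoff, while the same storm with a repellency-elevated CN = 90 yields roughly 27 mm, which is the kind of post-fire contrast the constant approach encodes.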
Equation-oriented specification of neural models for simulations
Stimberg, Marcel; Goodman, Dan F. M.; Benichoux, Victor; Brette, Romain
2013-01-01
Simulating biological neuronal networks is a core method of research in computational neuroscience. A full specification of such a network model includes a description of the dynamics and state changes of neurons and synapses, as well as the synaptic connectivity patterns and the initial values of all parameters. A standard approach in neuronal modeling software is to build network models based on a library of pre-defined components and mechanisms; if a model component does not yet exist, it has to be defined in a special-purpose or general low-level language and potentially be compiled and linked with the simulator. Here we propose an alternative approach that allows flexible definition of models by writing textual descriptions based on mathematical notation. We demonstrate that this approach allows the definition of a wide range of models with minimal syntax. Furthermore, such explicit model descriptions allow the generation of executable code for various target languages and devices, since the description is not tied to an implementation. Finally, this approach also has advantages for readability and reproducibility, because the model description is fully explicit, and because it can be automatically parsed and transformed into formatted descriptions. The presented approach has been implemented in the Brian2 simulator. PMID:24550820
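The idea of equation-oriented specification (turning a textual description of the dynamics into executable update code) can be illustrated with a toy parser; this is not Brian2's syntax or its code-generation pipeline, just the underlying concept:

```python
import math

def make_integrator(equation, params):
    """Turn a textual model description like 'dv/dt = (I - v) / tau'
    into a forward-Euler update function. A toy illustration of
    equation-oriented specification, not Brian2's parser or its
    code-generation machinery."""
    lhs, rhs = equation.split("=")
    var = lhs.strip().split("/")[0][1:]  # 'dv/dt' -> state variable 'v'
    def step(state, dt):
        ns = dict(params)
        ns[var] = state
        ns["exp"] = math.exp
        # evaluate the right-hand side in a restricted namespace
        return state + dt * eval(rhs, {"__builtins__": {}}, ns)
    return step

# Leaky integrator relaxing toward I with time constant tau
step = make_integrator("dv/dt = (I - v) / tau", {"I": 1.0, "tau": 0.01})
v = 0.0
for _ in range(1000):
    v = step(v, 0.0001)  # v relaxes toward I = 1.0
```

Because the model lives in the description string rather than in compiled library components, the same description could in principle be translated into C, GPU code, or formatted documentation, which is the portability argument the paper makes.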
Reusable Component Model Development Approach for Parallel and Distributed Simulation
Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng
2014-01-01
Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diversiform interfaces, couple tightly, and bind with simulation platforms closely. As a result, they are difficult to be reused across different simulation platforms and applications. To address the problem, this paper first proposed a reusable component model framework. Based on this framework, then our reusable model development approach is elaborated, which contains two phases: (1) domain experts create simulation computational modules observing three principles to achieve their independence; (2) model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. The case study of a radar model indicates that the model developed using our approach has good reusability and it is easy to be used in different simulation platforms and applications. PMID:24729751
NASA Astrophysics Data System (ADS)
Bertin, N.; Upadhyay, M. V.; Pradalier, C.; Capolungo, L.
2015-09-01
In this paper, we propose a novel full-field approach based on the fast Fourier transform (FFT) technique to compute mechanical fields in periodic discrete dislocation dynamics (DDD) simulations for anisotropic materials: the DDD-FFT approach. By coupling the FFT-based approach to the discrete continuous model, the present approach benefits from the high computational efficiency of the FFT algorithm while allowing for a discrete representation of dislocation lines. It is demonstrated that the computational time associated with the new DDD-FFT approach is significantly lower than that of current DDD approaches when large numbers of dislocation segments are involved, for both isotropic and anisotropic elasticity. Furthermore, for fine Fourier grids, the treatment of anisotropic elasticity comes at a computational cost similar to that of an isotropic simulation. Thus, the proposed approach paves the way towards achieving scale transition from DDD to mesoscale plasticity, especially due to the method's ability to incorporate inhomogeneous elasticity.
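The Fourier-space solve at the heart of FFT-based micromechanics can be illustrated with a scalar periodic analogue: solving u'' = f costs one forward FFT, a pointwise division by -k^2, and one inverse FFT. The real DDD-FFT method solves the (anisotropic) elastic equilibrium equations the same way, which is why the cost scales like N log N rather than with the square of the number of segments:

```python
import numpy as np

def poisson_fft_1d(f, length=2 * np.pi):
    """Solve u'' = f with periodic boundary conditions via FFT: a
    scalar stand-in for the Fourier-space elastic solve used by
    FFT-based micromechanics methods. Returns the zero-mean solution."""
    n = f.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=length / n)  # wavenumbers
    f_hat = np.fft.fft(f)
    u_hat = np.zeros_like(f_hat)
    nonzero = k != 0
    u_hat[nonzero] = -f_hat[nonzero] / k[nonzero] ** 2  # (ik)^2 = -k^2
    return np.real(np.fft.ifft(u_hat))

x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
u = poisson_fft_1d(-np.sin(x))  # exact periodic solution: sin(x)
```

In the anisotropic elastic case the scalar division by -k^2 becomes, at each Fourier point, the inversion of a small acoustic-tensor system built from the stiffness tensor, so anisotropy adds only a per-point constant cost, matching the paper's observation for fine grids.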
2006-09-01
The aim of the two parts of the experiment was identical: To explore concepts and supporting tools for Effects Based Approach to Operations (EBAO...feedback on the PMESII factors over time and the degree of achievement of the Operational Endstate. Modelling & Simulation Support to the Effects ...specific situation depends also on his interests. GAMMA provides two different methods: 1. The importance for different PMESII factors (ie potential
Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems
NASA Technical Reports Server (NTRS)
Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris
2010-01-01
Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.
Concepts, challenges, and successes in modeling thermodynamics of metabolism.
Cannon, William R
2014-01-01
The modeling of the chemical reactions involved in metabolism is a daunting task. Ideally, the modeling of metabolism would use kinetic simulations, but these simulations require knowledge of the thousands of rate constants involved in the reactions. The measurement of rate constants is very labor intensive, and hence rate constants for most enzymatic reactions are not available. Consequently, constraint-based flux modeling has been the method of choice because it does not require the use of the rate constants of the law of mass action. However, this convenience also limits the predictive power of constraint-based approaches in that the law of mass action is used only as a constraint, making it difficult to predict metabolite levels or energy requirements of pathways. An alternative to both of these approaches is to model metabolism using simulations of states rather than simulations of reactions, in which the state is defined as the set of all metabolite counts or concentrations. While kinetic simulations model reactions based on the likelihood of the reaction derived from the law of mass action, states are modeled based on likelihood ratios of mass action. Both approaches provide information on the energy requirements of metabolic reactions and pathways. However, modeling states rather than reactions has the advantage that the parameters needed to model states (chemical potentials) are much easier to determine than the parameters needed to model reactions (rate constants). Herein, we discuss recent results, assumptions, and issues in using simulations of state to model metabolism.
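The state-based view can be sketched as weighting states by their chemical potentials: the likelihood ratio of two states follows a Boltzmann factor, which for a simple A <-> B interconversion reduces to the familiar equilibrium constant. A minimal illustration (not the authors' implementation):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def state_likelihood_ratio(delta_mu, temperature=298.15):
    """Likelihood ratio of two metabolic states separated by a change
    in total chemical potential delta_mu (J/mol), P2/P1 =
    exp(-delta_mu / RT). For A <-> B at standard conditions this is
    the equilibrium constant K = exp(-delta_G0 / RT), tying the
    state-based picture back to the law of mass action."""
    return math.exp(-delta_mu / (R * temperature))
```

The practical advantage argued in the abstract is visible here: the only parameters are chemical potentials (thermodynamic quantities that can often be estimated), whereas a reaction-based kinetic model of the same system would need the rate constants of every enzymatic step.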
Virtual reality-based simulators for spine surgery: a systematic review.
Pfandler, Michael; Lazarovici, Marc; Stefan, Philipp; Wucherer, Patrick; Weigl, Matthias
2017-09-01
Virtual reality (VR)-based simulators offer numerous benefits and are very useful in assessing and training surgical skills. Virtual reality-based simulators are standard in some surgical subspecialties, but their actual use in spinal surgery remains unclear. Currently, only technical reviews of VR-based simulators are available for spinal surgery. Thus, we performed a systematic review that examined the existing research on VR-based simulators in spinal procedures. We also assessed the quality of current studies evaluating VR-based training in spinal surgery. Moreover, we wanted to provide a guide for future studies evaluating VR-based simulators in this field. This is a systematic review of the current scientific literature regarding VR-based simulation in spinal surgery. Five data sources were systematically searched to identify relevant peer-reviewed articles regarding virtual, mixed, or augmented reality-based simulators in spinal surgery. A qualitative data synthesis was performed with particular attention to evaluation approaches and outcomes. Additionally, all included studies were appraised for their quality using the Medical Education Research Study Quality Instrument (MERSQI) tool. The initial review identified 476 abstracts and 63 full texts were then assessed by two reviewers. Finally, 19 studies that examined simulators for the following procedures were selected: pedicle screw placement, vertebroplasty, posterior cervical laminectomy and foraminotomy, lumbar puncture, facet joint injection, and spinal needle insertion and placement. These studies had a low-to-medium methodological quality with a MERSQI mean score of 11.47 out of 18 (standard deviation=1.81). This review described the current state and applications of VR-based simulator training and assessment approaches in spinal procedures. Limitations, strengths, and future advancements of VR-based simulators for training and assessment in spinal surgery were explored. 
Higher-quality studies with patient-related outcome measures are needed. To establish further adaptation of VR-based simulators in spinal surgery, future evaluations need to improve the study quality, apply long-term study designs, and examine non-technical skills, as well as multidisciplinary team training. Copyright © 2017 Elsevier Inc. All rights reserved.
Guidance law simulation studies for complex approaches using the Microwave Landing System (MLS)
NASA Technical Reports Server (NTRS)
Feather, J. B.
1986-01-01
This report documents results for MLS guidance algorithm development conducted by DAC for NASA under the Advanced Transport Operating Systems (ATOPS) Technology Studies program (NAS1-18028). The study consisted of evaluating guidance laws for vertical and lateral path control, as well as speed control, by simulating an MLS approach for the Washington National Airport. This work is an extension and generalization of a previous ATOPS contract (NAS1-16202) completed by DAC in 1985. The Washington river approach was simulated by six waypoints and one glideslope change and consisted of an eleven nautical mile approach path. Tracking performance was generated for 10 cases representing several different conditions, which included MLS noise, steady wind, turbulence, and windshear. Results of this simulation phase are suitable for use in future fixed-base simulator evaluations employing actual hardware (autopilot and a performance management system), as well as crew procedures and information requirements for MLS.
2006-06-01
dynamic programming approach known as a "rolling horizon" approach. This method accounts for state transitions within the simulation rather than modeling ... model is based on the framework developed for Dynamic Allocation of Fires and Sensors, used to evaluate factors associated with networking assets in the ... of UAVs required by all types of maneuver and support brigades. (Witsken, 2004) The Modeling, Virtual Environments, and Simulations Institute
Effects of web-based electrocardiography simulation on strategies and learning styles.
Granero-Molina, José; Fernández-Sola, Cayetano; López-Domene, Esperanza; Hernández-Padilla, José Manuel; Preto, Leonel São Romão; Castro-Sánchez, Adelaida María
2015-08-01
To identify the association between the use of web-based electrocardiography simulation and the learning approaches, strategies and styles of nursing degree students. A descriptive and correlational design with a one-group pretest-posttest measurement was used. The study sample included 246 students in a Basic and Advanced Cardiac Life Support class of a nursing degree programme. No significant differences between genders were found in any dimension of learning styles or approaches to learning. After the introduction of web-based electrocardiography simulation, significant differences were found in some item scores of learning styles (theorist, p < 0.040; pragmatic, p < 0.010) and in approaches to learning. The use of a web electrocardiogram (ECG) simulation is associated with the development of active and reflexive learning styles, improving motivation and a deep approach to learning in nursing students.
NASA Astrophysics Data System (ADS)
Petsev, Nikolai D.; Leal, L. Gary; Shell, M. Scott
2017-12-01
Hybrid molecular-continuum simulation techniques afford a number of advantages for problems in the rapidly burgeoning area of nanoscale engineering and technology, though they are typically quite complex to implement and limited to single-component fluid systems. We describe an approach for modeling multicomponent hydrodynamic problems spanning multiple length scales when using particle-based descriptions for both the finely resolved (e.g., molecular dynamics) and coarse-grained (e.g., continuum) subregions within an overall simulation domain. This technique is based on the multiscale methodology previously developed for mesoscale binary fluids [N. D. Petsev, L. G. Leal, and M. S. Shell, J. Chem. Phys. 144, 084115 (2016)], simulated using a particle-based continuum method known as smoothed dissipative particle dynamics. An important application of this approach is the ability to perform coupled molecular dynamics (MD) and continuum modeling of molecularly miscible binary mixtures. In order to validate this technique, we investigate multicomponent hybrid MD-continuum simulations at equilibrium, as well as non-equilibrium cases featuring concentration gradients.
Next Generation Extended Lagrangian Quantum-based Molecular Dynamics
NASA Astrophysics Data System (ADS)
Negre, Christian
2017-06-01
A new framework for extended Lagrangian first-principles molecular dynamics simulations is presented, which overcomes shortcomings of regular, direct Born-Oppenheimer molecular dynamics while maintaining important advantages of the unified extended Lagrangian formulation of density functional theory pioneered by Car and Parrinello three decades ago. The new framework allows, for the first time, energy-conserving, linear-scaling Born-Oppenheimer molecular dynamics simulations, which is necessary to study larger and more realistic systems over longer simulation times than previously possible. Expensive self-consistent-field optimizations are avoided, and the normal integration time steps of regular, direct Born-Oppenheimer molecular dynamics can be used. Linear-scaling electronic structure theory is presented using a graph-based approach that is ideal for parallel calculations on hybrid computer platforms. For the first time, quantum-based Born-Oppenheimer molecular dynamics simulation is becoming a practically feasible approach for simulations of 100,000+ atoms, representing a competitive alternative to classical polarizable force field methods. In collaboration with: Anders Niklasson, Los Alamos National Laboratory.
ERIC Educational Resources Information Center
Chabalengula, Vivien; Fateen, Rasheta; Mumba, Frackson; Ochs, Laura Kathryn
2016-01-01
This study investigated the effect of an inquiry-based computer simulation modeling (ICoSM) instructional approach on pre-service science teachers' understanding of homeostasis and its related concepts, and their perceived design features of the ICoSM and simulation that enhanced their conceptual understanding of these concepts. Fifty pre-service…
PSYCHE: An Object-Oriented Approach to Simulating Medical Education
Mullen, Jamie A.
1990-01-01
Traditional approaches to computer-assisted instruction (CAI) do not provide realistic simulations of medical education, in part because they do not utilize heterogeneous knowledge bases for their source of domain knowledge. PSYCHE, a CAI program designed to teach hypothetico-deductive psychiatric decision-making to medical students, uses an object-oriented implementation of an intelligent tutoring system (ITS) to model the student, domain expert, and tutor. It models the transactions between the participants in complex transaction chains, and uses heterogeneous knowledge bases to represent both domain and procedural knowledge in clinical medicine. This object-oriented approach is a flexible and dynamic approach to modeling, and represents a potentially valuable tool for the investigation of medical education and decision-making.
Agent-based modeling: a new approach for theory building in social psychology.
Smith, Eliot R; Conrey, Frederica R
2007-02-01
Most social and psychological phenomena occur not as the result of isolated decisions by individuals but rather as the result of repeated interactions between multiple individuals over time. Yet the theory-building and modeling techniques most commonly used in social psychology are less than ideal for understanding such dynamic and interactive processes. This article describes an alternative approach to theory building, agent-based modeling (ABM), which involves simulation of large numbers of autonomous agents that interact with each other and with a simulated environment and the observation of emergent patterns from their interactions. The authors believe that the ABM approach is better able than prevailing approaches in the field, variable-based modeling (VBM) techniques such as causal modeling, to capture types of complex, dynamic, interactive processes so important in the social world. The article elaborates several important contrasts between ABM and VBM and offers specific recommendations for learning more and applying the ABM approach.
Applications of Agent Based Approaches in Business (A Three Essay Dissertation)
ERIC Educational Resources Information Center
Prawesh, Shankar
2013-01-01
The goal of this dissertation is to investigate the enabling role that agent based simulation plays in business and policy. The aforementioned issue has been addressed in this dissertation through three distinct, but related essays. The first essay is a literature review of different research applications of agent based simulation in various…
Fast Photon Monte Carlo for Water Cherenkov Detectors
NASA Astrophysics Data System (ADS)
Latorre, Anthony; Seibert, Stanley
2012-03-01
We present Chroma, a high-performance optical photon simulation for large particle physics detectors, such as the water Cherenkov far detector option for LBNE. This software takes advantage of the CUDA parallel computing platform to propagate photons using modern graphics processing units. In a computer model of a 200 kiloton water Cherenkov detector with 29,000 photomultiplier tubes, Chroma can propagate 2.5 million photons per second, around 200 times faster than the same simulation with Geant4. Chroma uses a surface-based approach to modeling geometry, which offers many benefits over the solid-based approach used in other simulations such as Geant4.
Efficient rejection-based simulation of biochemical reactions with stochastic noise and delays
NASA Astrophysics Data System (ADS)
Thanh, Vo Hong; Priami, Corrado; Zunino, Roberto
2014-10-01
We propose a new exact stochastic rejection-based simulation algorithm for biochemical reactions and extend it to systems with delays. Our algorithm accelerates the simulation by pre-computing reaction propensity bounds to select the next reaction to perform. Exploiting such bounds, we are able to avoid recomputing propensities every time a (delayed) reaction is initiated or finished, as is typically necessary in standard approaches. Propensity updates in our approach are still performed, but only infrequently and limited to a small number of reactions, saving computation time without sacrificing exactness. We evaluate the performance improvement of our algorithm by experimenting with concrete biological models.
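The bound-based selection described above can be sketched in a few lines of Python. This is a minimal illustration of the rejection idea only: the propensity representation, function names, and data layout are assumptions for illustration, not the authors' algorithm.

```python
import random

def rejection_step(state, propensities, bounds, rng=random.Random(0)):
    """One selection step of a rejection-based SSA sketch.

    propensities: list of callables state -> exact propensity (possibly costly)
    bounds: precomputed upper bounds on each propensity (cheap to sum)
    Returns the index of the accepted next reaction.
    """
    total = sum(bounds)
    while True:
        # pick a candidate reaction proportionally to its upper bound
        r = rng.random() * total
        acc = 0.0
        for idx, b in enumerate(bounds):
            acc += b
            if r <= acc:
                break
        # accept with probability a_idx(state) / bound_idx; the exact
        # propensity is evaluated only for the candidate, not for all reactions
        if rng.random() * bounds[idx] <= propensities[idx](state):
            return idx
```

Because a rejected candidate costs only one exact propensity evaluation, updates to the full propensity list can be deferred until a bound is violated, which is the source of the speed-up the abstract describes.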
A simplified DEM-CFD approach for pebble bed reactor simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Y.; Ji, W.
In pebble bed reactors (PBRs), the pebble flow and the coolant flow are coupled with each other through coolant-pebble interactions. Approaches with different fidelities have been proposed to simulate similar phenomena. Coupled Discrete Element Method-Computational Fluid Dynamics (DEM-CFD) approaches are widely studied and applied to these problems owing to their good balance between efficiency and accuracy. In this work, based on the symmetry of the PBR geometry, a simplified 3D-DEM/2D-CFD approach is proposed to speed up the DEM-CFD simulation without significant loss of accuracy. Pebble flow is simulated by a full 3-D DEM, while the coolant flow field is calculated with a 2-D CFD simulation by averaging variables along the annular direction in the cylindrical geometry. Results show that this simplification can greatly enhance the efficiency for a cylindrical core, which enables further inclusion of other physics, such as thermal and neutronic effects, in multi-physics simulations for PBRs.
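The averaging along the annular direction that collapses the 3-D DEM domain onto a 2-D (r, z) CFD grid can be illustrated with a simple binning sketch. The grid layout and the per-particle representation below are assumptions for illustration, not the paper's implementation.

```python
import math

def annular_average(particles, nr, nz, rmax, zmax):
    """Average a per-particle quantity over the azimuthal direction,
    collapsing 3-D (x, y, z) samples onto a 2-D (r, z) grid.

    particles: list of (x, y, z, value) tuples
    Returns an nr x nz grid of averaged values (None where no samples fall).
    """
    sums = [[0.0] * nz for _ in range(nr)]
    counts = [[0] * nz for _ in range(nr)]
    for x, y, z, v in particles:
        r = math.hypot(x, y)                    # radial coordinate; angle discarded
        i = min(int(r / rmax * nr), nr - 1)
        j = min(int(z / zmax * nz), nz - 1)
        sums[i][j] += v
        counts[i][j] += 1
    return [[sums[i][j] / counts[i][j] if counts[i][j] else None
             for j in range(nz)] for i in range(nr)]
```

All particles at the same radius and height contribute to the same 2-D cell regardless of azimuth, which is what makes the coolant field effectively axisymmetric in the simplified coupling.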
NASA Astrophysics Data System (ADS)
Li, Huidong; Wolter, Michael; Wang, Xun; Sodoudi, Sahar
2017-09-01
Urban-rural difference of land cover is the key determinant of the urban heat island (UHI). In order to evaluate the impact of land cover data on the simulation of UHI, a comparative study between up-to-date CORINE land cover (CLC) and Urban Atlas (UA) data with fine resolution (100 and 10 m) and old US Geological Survey (USGS) data with coarse resolution (30 s) was conducted using the Weather Research and Forecasting model (WRF) coupled with the bulk approach of Noah-LSM for Berlin. The comparison between old and new data partly reveals the effect of urbanization on UHI and the historical evolution of UHI, while the comparison between data of different resolutions reveals the impact of land cover resolution on the simulation of UHI. Given the high heterogeneity of the urban surface and the fine-resolution land cover data, the mosaic approach was implemented in this study to calculate the sub-grid variability in land cover compositions. Results showed that the simulations using UA and CLC data perform better than the simulation using USGS data for both air and land surface temperatures. The USGS-based simulation underestimates the temperature, especially in rural areas. The longitudinal variations of both air temperature and land surface temperature show good agreement with urban fraction for all three simulations. To better study the comprehensive characteristics of UHI over Berlin, UHI curves (UHICs) are developed for all three simulations based on the relationship between temperature and urban fraction. The CLC- and UA-based simulations show smoother UHICs than the USGS-based simulation. The simulation with old USGS data clearly underestimates the extent of UHI, while the up-to-date CLC and UA data better reflect the real urbanization and simulate the spatial distribution of UHI more accurately. However, the intensity of UHI simulated with CLC and UA data is not higher than that simulated with USGS data.
The simulated air temperature is not dominated by the land cover as much as the land surface temperature, as air temperature is also affected by air advection.
Improved Satellite-based Crop Yield Mapping by Spatially Explicit Parameterization of Crop Phenology
NASA Astrophysics Data System (ADS)
Jin, Z.; Azzari, G.; Lobell, D. B.
2016-12-01
Field-scale mapping of crop yields with satellite data often relies on the use of crop simulation models. However, these approaches can be hampered by inaccuracies in the simulation of crop phenology. Here we present and test an approach that uses dense time series of Landsat 7 and 8 acquisitions to calibrate various parameters related to crop phenology simulation, such as leaf number and leaf appearance rates. These parameters are then mapped across the Midwestern United States for maize and soybean, and for two different simulation models. We then implement our recently developed Scalable satellite-based Crop Yield Mapper (SCYM) with simulations reflecting the improved phenology parameterizations, and compare to prior estimates based on default phenology routines. Our preliminary results show that the proposed method can effectively alleviate the underestimation of early-season LAI by the default Agricultural Production Systems sIMulator (APSIM), and that spatially explicit parameterization of the phenology model substantially improves SCYM performance in capturing the spatiotemporal variation in maize and soybean yield. The scheme presented in our study thus preserves the scalability of SCYM while significantly reducing its uncertainty.
Nonlinear relaxation algorithms for circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saleh, R.A.
Circuit simulation is an important Computer-Aided Design (CAD) tool in the design of Integrated Circuits (ICs). However, the standard techniques used in programs such as SPICE result in very long run times when applied to large problems. In order to reduce the overall run time, a number of new approaches to circuit simulation were developed and are described. These methods are based on nonlinear relaxation techniques and exploit the relative inactivity of large circuits. Simple waveform-processing techniques are described to determine the maximum possible speed improvement that can be obtained by exploiting this property of large circuits. Three simulation algorithms are described, two of which are based on the Iterated Timing Analysis (ITA) method and a third based on the Waveform-Relaxation Newton (WRN) method. New programs that incorporate these techniques were developed and used to simulate a variety of industrial circuits. The results from these simulations are provided. The techniques are shown to be much faster than the standard approach. In addition, a number of parallel aspects of these algorithms are described, and a general space-time model of parallel-task scheduling is developed.
An intersubject variable regional anesthesia simulator with a virtual patient architecture.
Ullrich, Sebastian; Grottke, Oliver; Fried, Eduard; Frommen, Thorsten; Liao, Wei; Rossaint, Rolf; Kuhlen, Torsten; Deserno, Thomas M
2009-11-01
The main purpose is to provide an intuitive VR-based training environment for regional anesthesia (RA). The research question is how to process subject-specific datasets, organize them in a meaningful way and how to perform the simulation for peripheral regions. We propose a flexible virtual patient architecture and methods to process datasets. Image acquisition, image processing (especially segmentation), interactive nerve modeling and permutations (nerve instantiation) are described in detail. The simulation of electric impulse stimulation and according responses are essential for the training of peripheral RA and solved by an approach based on the electric distance. We have created an XML-based virtual patient database with several subjects. Prototypes of the simulation are implemented and run on multimodal VR hardware (e.g., stereoscopic display and haptic device). A first user pilot study has confirmed our approach. The virtual patient architecture enables support for arbitrary scenarios on different subjects. This concept can also be used for other simulators. In future work, we plan to extend the simulation and conduct further evaluations in order to provide a tool for routine training for RA.
Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations
NASA Astrophysics Data System (ADS)
Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying
2010-09-01
Computing optimal stochastic portfolio execution strategies under appropriate risk considerations presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing the coefficients of a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can be handled with appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and Conditional Value-at-Risk (CVaR).
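As a rough illustration of the parametric idea, the expected cost of a one-parameter execution rule can be estimated by Monte Carlo and the parameter chosen by static optimization. The linear-impact, random-walk price model below is a hypothetical toy, not the authors' market model or strategy class.

```python
import random
import statistics

def simulate_cost(theta, shares=100.0, periods=5, n_paths=2000, impact=0.01,
                  vol=0.5, rng=random.Random(1)):
    """Monte Carlo estimate of expected execution cost for a parametric
    strategy that trades a fraction theta of the remaining position each
    period (toy model: linear temporary impact, zero-drift random walk)."""
    costs = []
    for _ in range(n_paths):
        remaining, price, cost = shares, 0.0, 0.0
        for t in range(periods):
            # last period liquidates whatever is left
            trade = remaining * theta if t < periods - 1 else remaining
            cost += trade * (price + impact * trade)  # pay impact on each slice
            remaining -= trade
            price += vol * rng.gauss(0.0, 1.0)        # exogenous price shock
        costs.append(cost)
    return statistics.mean(costs)

# static optimization over the single strategy parameter (grid search sketch)
best_cost, best_theta = min((simulate_cost(th), th) for th in [0.2, 0.4, 0.6, 0.8])
```

A CVaR term or constraint penalties would simply be added to the per-path cost before averaging, leaving the outer static optimization unchanged.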
A hardware-in-the-loop simulation program for ground-based radar
NASA Astrophysics Data System (ADS)
Lam, Eric P.; Black, Dennis W.; Ebisu, Jason S.; Magallon, Julianna
2011-06-01
A radar system created using an embedded computer system needs testing. The way to test an embedded computer system differs from the debugging approaches used on desktop computers. One way to test a radar system is to feed it artificial inputs and analyze the outputs of the radar. Often, not all of the building blocks of the radar system are available for testing, which requires the engineer to test parts of the radar system using a "black box" approach. A common way to test software code in a desktop simulation is to use breakpoints so that it pauses after each cycle through its calculations. The outputs are compared against the expected values. This requires the engineer to use valid test scenarios. We present a hardware-in-the-loop simulator that allows the embedded system to think it is operating with real-world inputs and outputs. From the embedded system's point of view, it is operating in real time. The hardware-in-the-loop simulation is based on our Desktop PC Simulation (PCS) testbed. In the past, PCS was used for ground-based radars. This embedded simulation, called Embedded PCS, allows rapid simulated evaluation of ground-based radar performance in a laboratory environment.
An EQT-cDFT approach to determine thermodynamic properties of confined fluids.
Mashayak, S Y; Motevaselian, M H; Aluru, N R
2015-06-28
We present a continuum-based approach to predict the structure and thermodynamic properties of confined fluids at multiple length-scales, ranging from a few angstroms to macro-meters. The continuum approach is based on the empirical potential-based quasi-continuum theory (EQT) and classical density functional theory (cDFT). EQT is a simple and fast approach to predict inhomogeneous density and potential profiles of confined fluids. We use EQT potentials to construct a grand potential functional for cDFT. The EQT-cDFT-based grand potential can be used to predict various thermodynamic properties of confined fluids. In this work, we demonstrate the EQT-cDFT approach by simulating Lennard-Jones fluids, namely, methane and argon, confined inside slit-like channels of graphene. We show that the EQT-cDFT can accurately predict the structure and thermodynamic properties, such as density profiles, adsorption, local pressure tensor, surface tension, and solvation force, of confined fluids as compared to the molecular dynamics simulation results.
External Aiding Methods for IMU-Based Navigation
2016-11-26
Monte Carlo simulation and particle filtering. This approach allows for the utilization of highly complex systems in a black box configuration with minimal ... An alternative method, which has the advantage of being less computationally demanding, is to use a Kalman filtering-based approach. The particular Kalman filtering-based approach used here is known as linear covariance analysis. In linear covariance analysis, the nonlinear systems describing the
NASA Astrophysics Data System (ADS)
Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.
2018-02-01
Estimating the efficiency and sustainability of geological subsurface utilization, i.e., Carbon Capture and Storage (CCS), requires an integrated risk assessment approach that considers the coupled processes occurring, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Because of the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is directly implemented into a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. Hereby, the iterative parameter exchange between the multiphase and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure is capable of reducing the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.
Simplified and advanced modelling of traction control systems of heavy-haul locomotives
NASA Astrophysics Data System (ADS)
Spiryagin, Maksym; Wolfs, Peter; Szanto, Frank; Cole, Colin
2015-05-01
Improving tractive effort is a very complex task in locomotive design. It requires the development of not only mechanical systems but also power systems, traction machines and traction algorithms. At the initial design stage, traction algorithms can be verified by means of a simulation approach. A simple single wheelset simulation approach is not sufficient because all locomotive dynamics are not fully taken into consideration. Given that many traction control strategies exist, the best solution is to use more advanced approaches for such studies. This paper describes the modelling of a locomotive with a bogie traction control strategy based on a co-simulation approach in order to deliver more accurate results. The simplified and advanced modelling approaches of a locomotive electric power system are compared in this paper in order to answer a fundamental question. What level of modelling complexity is necessary for the investigation of the dynamic behaviours of a heavy-haul locomotive running under traction? The simulation results obtained provide some recommendations on simulation processes and the further implementation of advanced and simplified modelling approaches.
Simulation-Based Evaluation of Learning Sequences for Instructional Technologies
ERIC Educational Resources Information Center
McEneaney, John E.
2016-01-01
Instructional technologies critically depend on systematic design, and learning hierarchies are a commonly advocated tool for designing instructional sequences. But hierarchies routinely allow numerous sequences and choosing an optimal sequence remains an unsolved problem. This study explores a simulation-based approach to modeling learning…
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Bowles, R. L.
1983-01-01
This paper addresses the issues of motion/visual cueing fidelity requirements for vortex encounters during simulated transport visual approaches and landings. Four simulator configurations were utilized to provide objective performance measures during simulated vortex penetrations, and subjective comments from pilots were collected. The configurations used were as follows: fixed base with visual degradation (delay), fixed base with no visual degradation, moving base with visual degradation (delay), and moving base with no visual degradation. The statistical comparisons of the objective measures and the subjective pilot opinions indicated that although both minimum visual delay and motion cueing are recommended for the vortex penetration task, the visual-scene delay characteristics were not as significant a fidelity factor as was the presence of motion cues. However, this indication was applicable to a restricted task, and to transport aircraft. Although they were statistically significant, the effects of visual delay and motion cueing on the touchdown-related measures were considered to be of no practical consequence.
DOT National Transportation Integrated Search
2012-06-01
A small team of university-based transportation system experts and simulation experts has been : assembled to develop, test, and apply an approach to assessing road infrastructure capacity using : micro traffic simulation supported by publically avai...
Design of Accelerator Online Simulator Server Using Structured Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, Guobao; Chu, Chungming
2012-07-06
Model-based control plays an important role for a modern accelerator during beam commissioning, beam studies, and even daily operation. With a realistic model, beam behaviour can be predicted and therefore effectively controlled. The approach used by most current high-level application environments is to use a built-in simulation engine and feed a realistic model into that simulation engine. Instead of this traditional monolithic structure, a new approach using a client-server architecture is under development. An on-line simulator server is accessed via network-accessible structured data. With this approach, a user can easily access multiple simulation codes. This paper describes the design, implementation, and current status of PVData, which defines the structured data, and PVAccess, which provides network access to the structured data.
Quantum simulation of an ultrathin body field-effect transistor with channel imperfections
NASA Astrophysics Data System (ADS)
Vyurkov, V.; Semenikhin, I.; Filippov, S.; Orlikovsky, A.
2012-04-01
An efficient program for the all-quantum simulation of nanometer field-effect transistors is elaborated. The model is based on the Landauer-Buttiker approach. Our calculation of transmission coefficients employs a transfer-matrix technique involving arbitrary-precision (multiprecision) arithmetic to cope with evanescent modes. Modified in such a way, the transfer-matrix technique turns out to be much faster in practical simulations than the scattering-matrix technique. Results of the simulation demonstrate the impact of realistic channel imperfections (random charged centers and wall roughness) on transistor characteristics. The Landauer-Buttiker approach is extended to incorporate calculation of the noise at an arbitrary temperature. We also validate the ballistic Landauer-Buttiker approach for the usual situation when heavily doped contacts are necessarily included in the simulation region.
Structure-based multiscale approach for identification of interaction partners of PDZ domains.
Tiwari, Garima; Mohanty, Debasisa
2014-04-28
PDZ domains are peptide recognition modules which mediate specific protein-protein interactions and are known to have a complex specificity landscape. We have developed a novel structure-based multiscale approach which identifies crucial specificity-determining residues (SDRs) of PDZ domains from explicit solvent molecular dynamics (MD) simulations on PDZ-peptide complexes and uses these SDRs in combination with knowledge-based scoring functions for proteome-wide identification of their interaction partners. Multiple explicit solvent simulations ranging from 5 to 50 ns duration have been carried out on 28 PDZ-peptide complexes with known binding affinities. MM/PBSA binding energy values calculated from these simulations show a correlation coefficient of 0.755 with the experimental binding affinities. On the basis of the SDRs of PDZ domains identified by MD simulations, we have developed a simple scoring scheme for evaluating binding energies of PDZ-peptide complexes using residue-based statistical pair potentials. This multiscale approach has been benchmarked on a mouse PDZ proteome array data set by calculating the binding energies for 217 different substrate peptides in the binding pockets of 64 different mouse PDZ domains. Receiver operating characteristic (ROC) curve analysis indicates that the area under the curve (AUC) value for binder vs nonbinder classification by our structure-based method is 0.780. Our structure-based method does not require experimental PDZ-peptide binding data for training.
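The reported AUC of 0.780 comes from ROC analysis of binder vs. nonbinder scores. The AUC can be computed directly via the rank-sum identity: the probability that a randomly chosen binder scores higher than a randomly chosen nonbinder, with ties counting half. This is a generic sketch of that identity, not the authors' analysis code.

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the rank-sum identity: P(random positive outscores a
    random negative), ties counted as 0.5. O(n*m), fine for small sets."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else (0.5 if p == n else 0.0)
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.5 corresponds to random classification, so 0.780 indicates the pair-potential scores separate binders from nonbinders substantially better than chance.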
Simulation in interprofessional education for patient-centred collaborative care.
Baker, Cynthia; Pulling, Cheryl; McGraw, Robert; Dagnone, Jeffrey Damon; Hopkins-Rosseel, Diana; Medves, Jennifer
2008-11-01
This paper is a report of preliminary evaluations of an interprofessional education through simulation project by focusing on learner and teacher reactions to the pilot modules. Approaches to interprofessional education vary widely. Studies indicate, however, that active, experiential learning facilitates it. Patient simulators require learners to incorporate knowing, being and doing in action. A theoretically based competency framework was developed to guide interprofessional education using simulation. The framework includes a typology of shared, complementary and profession-specific competencies. Each competency type is associated with an intraprofessional, multiprofessional, or interprofessional teaching modality and with the professional composition of learner groups. The project is guided by an action research approach in which ongoing evaluation generates knowledge to modify and further develop it. Preliminary evaluations of the first pilot module, cardiac resuscitation rounds, among 101 nursing students, 42 medical students and 70 junior medical residents were conducted in 2005-2007 using a questionnaire with rating scales and open-ended questions. Another 20 medical students, 7 junior residents and 45 nursing students completed a questionnaire based on the Interdisciplinary Education Perception scale. Simulation-based learning provided students with interprofessional activities they saw as relevant for their future as practitioners. They embraced both the interprofessional and simulation components enthusiastically. Attitudinal scores and responses were consistently positive among both medical and nursing students. Interprofessional education through simulation offers a promising approach to preparing future healthcare professionals for the collaborative models of healthcare delivery being developed internationally.
Gao, Yuan; Zhang, Chuanrong; He, Qingsong; Liu, Yaolin
2017-06-15
Ecological security is an important research topic, especially urban ecological security. As highly populated ecosystems, cities tend to have more fragile ecological environments. However, most of the research on urban ecological security in the literature has focused on evaluating the current or past status of the ecological environment. Very little literature has carried out simulation or prediction of future ecological security. In addition, there is even less literature exploring the urban ecological environment at a fine scale. To fill in this literature gap, in this study we simulated and predicted urban ecological security at a fine scale (district level) using an improved Cellular Automata (CA) approach. First we used the pressure-state-response (PSR) method based on grid-scale data to evaluate urban ecological security. Then, based on the evaluation results, we imported the geographically weighted regression (GWR) concept into the CA model to simulate and predict urban ecological security. We applied the improved CA approach in a case study, simulating and predicting urban ecological security for the city of Wuhan in Central China. By comparing the simulated ecological security values for 2010 from the improved CA model with the actual ecological security values of 2010, we obtained a relatively high value of the kappa coefficient, which indicates that this CA model can simulate or predict well the future development of ecological security in Wuhan. Based on the prediction results for 2020, we made some policy recommendations for each district in Wuhan.
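The agreement between simulated and observed security classes is summarized with the kappa coefficient. Cohen's kappa for two equal-length label sequences (e.g., simulated vs. actual class per grid cell) can be computed as follows; this is a generic sketch of the standard statistic, since the study does not specify which kappa variant it used.

```python
def cohens_kappa(a, b, labels):
    """Cohen's kappa: agreement between label sequences a and b,
    corrected for the agreement expected by chance."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                      # observed agreement
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)   # chance agreement
    return (po - pe) / (1 - pe)
```

Kappa is 1 for perfect agreement, 0 for chance-level agreement, and negative when agreement is worse than chance, which is why a "relatively high" kappa supports the CA model's predictive skill.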
The Development of a 3D LADAR Simulator Based on a Fast Target Impulse Response Generation Approach
NASA Astrophysics Data System (ADS)
Al-Temeemy, Ali Adnan
2017-09-01
A new laser detection and ranging (LADAR) simulator has been developed, using MATLAB and its graphical user interface, to simulate direct detection time-of-flight LADAR systems and to produce 3D simulated scanning images under a wide variety of conditions. This simulator models each stage from the laser source to data generation and can be considered an efficient simulation tool to use when developing LADAR systems and their data processing algorithms. The novel approach proposed for this simulator is to generate the actual target impulse response. This approach is fast and able to handle demanding scanning requirements without the loss of fidelity that usually accompanies increases in speed. This leads to a more efficient LADAR simulator and opens up the possibility of simulating LADAR beam propagation more accurately by using a large number of laser footprint samples. The approach is to select only the parts of the target that lie in the laser beam's angular field, by mathematically deriving the required equations and calculating the target angular ranges. The performance of the new simulator has been evaluated under different scanning conditions, the results showing significant increases in processing speed in comparison to conventional approaches, which are also used in this study as a point of comparison. The results also show the simulator's ability to reproduce phenomena related to the scanning process, such as noise type, scanning resolution and laser beam width.
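The core of the speed-up described above — testing which target samples fall inside the beam's angular field — reduces to an angle check against the beam axis. A minimal sketch, with invented geometry and beam half-angle (the simulator's actual equations and angular-range bookkeeping are not reproduced here):

```python
import numpy as np

def points_in_beam(points, beam_dir, half_angle_rad):
    """Keep only the target samples inside the laser beam's angular field:
    a point (seen from the sensor at the origin) is kept if the angle
    between its line of sight and the beam axis is below the half-angle."""
    beam_dir = beam_dir / np.linalg.norm(beam_dir)
    los = points / np.linalg.norm(points, axis=1, keepdims=True)
    return points[los @ beam_dir >= np.cos(half_angle_rad)]

# toy target: three samples, beam along +z with a 2 degree half-angle
pts = np.array([[0.0, 0.0, 10.0],     # on axis            -> inside
                [0.1, 0.0, 10.0],     # ~0.57 deg off axis -> inside
                [2.0, 0.0, 10.0]])    # ~11.3 deg off axis -> outside
inside = points_in_beam(pts, np.array([0.0, 0.0, 1.0]), np.radians(2.0))
print(len(inside))  # -> 2
```

Restricting the impulse-response computation to `inside` rather than the full target is what keeps cost independent of overall target size.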
Simulation of an enzyme-based glucose sensor
NASA Astrophysics Data System (ADS)
Sha, Xianzheng; Jablecki, Michael; Gough, David A.
2001-09-01
An important biosensor application is the continuous monitoring of blood or tissue fluid glucose concentration in people with diabetes. Our research focuses on the development of a glucose sensor based on potentiostatic oxygen electrodes and immobilized glucose oxidase for long-term application as an implant in tissues. As the sensor signal depends on many design variables, a trial-and-error approach to sensor optimization can be time-consuming. Here, the properties of an implantable glucose sensor are optimized by a systematic computational simulation approach.
Bao, Jie; Hou, Zhangshuan; Huang, Maoyi; ...
2015-12-04
Here, effective sensitivity analysis approaches are needed to identify important parameters or factors and their uncertainties in complex Earth system models composed of multi-phase multi-component phenomena and multiple biogeophysical-biogeochemical processes. In this study, the impacts of 10 hydrologic parameters in the Community Land Model on simulations of runoff and latent heat flux are evaluated using data from a watershed. Different metrics, including residual statistics, the Nash-Sutcliffe coefficient, and log mean square error, are used as alternative measures of the deviations between the simulated and field-observed values. Four sensitivity analysis (SA) approaches, including analysis of variance based on the generalized linear model, generalized cross validation based on the multivariate adaptive regression splines model, standardized regression coefficients based on a linear regression model, and analysis of variance based on support vector machine, are investigated. Results suggest that these approaches show consistent measurement of the impacts of major hydrologic parameters on response variables, but with differences in the relative contributions, particularly for the secondary parameters. The convergence behaviors of the SA with respect to the number of sampling points are also examined with different combinations of input parameter sets and output response variables and their alternative metrics. This study helps identify the optimal SA approach, provides guidance for the calibration of the Community Land Model parameters to improve the model simulations of land surface fluxes, and approximates the magnitudes to be adjusted in the parameter values during parametric model optimization.
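One of the four SA approaches named above, standardized regression coefficients, is simple enough to sketch: z-score inputs and response, fit a linear model, and rank parameters by absolute coefficient. The synthetic "runoff" model and parameter count below are invented stand-ins, not Community Land Model data.

```python
import numpy as np

def standardized_regression_coefficients(X, y):
    """Sensitivity via standardized regression coefficients (SRC): z-score
    the input parameters and the response, fit a linear model by least
    squares, and use the absolute coefficients as relative sensitivities
    (appropriate when the response is approximately linear)."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    coef, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
    return np.abs(coef)

# synthetic stand-in for a land-model response: parameter 0 dominates,
# parameter 1 is secondary, parameter 2 has no real effect
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + 0.05 * rng.normal(size=500)
src = standardized_regression_coefficients(X, y)
print(np.argsort(src)[::-1])          # parameter indices ordered by influence
```

The study's observation that methods agree on major parameters but differ on secondary ones is exactly where a linear method like SRC is weakest.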
Comparison of NMR simulations of porous media derived from analytical and voxelized representations.
Jin, Guodong; Torres-Verdín, Carlos; Toumelin, Emmanuel
2009-10-01
We develop and compare two formulations of the random-walk method, grain-based and voxel-based, to simulate the nuclear-magnetic-resonance (NMR) response of fluids contained in various models of porous media. The grain-based approach uses a spherical grain pack as input, where the solid surface is analytically defined without an approximation. In the voxel-based approach, the input is a computer-tomography or computer-generated image of reconstructed porous media. Implementation of the two approaches is largely the same, except for the representation of porous media. For comparison, both approaches are applied to various analytical and digitized models of porous media: isolated spherical pore, simple cubic packing of spheres, and random packings of monodisperse and polydisperse spheres. We find that spin magnetization decays much faster in the digitized models than in their analytical counterparts. The difference in decay rate relates to the overestimation of surface area due to the discretization of the sample; it cannot be eliminated even if the voxel size decreases. However, once considering the effect of surface-area increase in the simulation of surface relaxation, good quantitative agreement is found between the two approaches. Different grain or pore shapes entail different rates of increase of surface area, whereupon we emphasize that the value of the "surface-area-corrected" coefficient may not be universal. Using an example of X-ray-CT image of Fontainebleau rock sample, we show that voxel size has a significant effect on the calculated surface area and, therefore, on the numerically simulated magnetization response.
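The grain-based random-walk idea — diffusing spins inside an analytically defined solid surface with probabilistic surface relaxation — can be sketched for a single spherical pore. All parameter values (walker count, step size, kill probability) are illustrative, not taken from the paper.

```python
import numpy as np

def magnetization_decay(n_walkers=2000, n_steps=400, radius=1.0,
                        step=0.05, kill_prob=0.2, seed=1):
    """Random-walk sketch of NMR surface relaxation in one analytically
    defined spherical pore: walkers diffuse inside the sphere and, on each
    wall contact, relax (are removed) with probability kill_prob, otherwise
    they are reflected (stay in place).  The surviving fraction mimics the
    decay of the spin magnetization."""
    rng = np.random.default_rng(seed)
    d = rng.normal(size=(n_walkers, 3))
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    pos = d * radius * rng.random((n_walkers, 1)) ** (1.0 / 3.0)  # uniform fill
    alive = np.ones(n_walkers, dtype=bool)
    decay = []
    for _ in range(n_steps):
        d = rng.normal(size=(n_walkers, 3))
        d *= step / np.linalg.norm(d, axis=1, keepdims=True)
        trial = pos + d
        hit = (np.linalg.norm(trial, axis=1) > radius) & alive
        alive &= ~(hit & (rng.random(n_walkers) < kill_prob))
        moved = alive & ~hit               # survivors of a wall hit stay put
        pos[moved] = trial[moved]
        decay.append(alive.mean())
    return np.array(decay)

m = magnetization_decay()
print(m[-1])   # surviving magnetization fraction at the end of the walk
```

In the voxel-based variant, the wall test against `radius` would be replaced by a lookup into a discretized image — which is precisely where the surface-area overestimation discussed in the abstract enters.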
Moussa, Ahmed; Loye, Nathalie; Charlin, Bernard; Audétat, Marie-Claude
2016-01-01
Background Helping trainees develop appropriate clinical reasoning abilities is a challenging goal in an environment where clinical situations are marked by high levels of complexity and unpredictability. The benefit of simulation-based education to assess clinical reasoning skills has rarely been reported. More specifically, it is unclear if clinical reasoning is better acquired if the instructor's input occurs entirely after or is integrated during the scenario. Based on educational principles of the dual-process theory of clinical reasoning, a new simulation approach called simulation with iterative discussions (SID) is introduced. The instructor interrupts the flow of the scenario at three key moments of the reasoning process (data gathering, integration, and confirmation). After each stop, the scenario is continued where it was interrupted. Finally, a brief general debriefing ends the session. The System-1 process of clinical reasoning is assessed by verbalization during management of the case, and System-2 during the iterative discussions, without providing feedback. Objective The aim of this study is to evaluate the effectiveness of Simulation with Iterative Discussions versus the classical approach of simulation in developing the reasoning skills of General Pediatrics and Neonatal-Perinatal Medicine residents. Methods This will be a prospective, exploratory, randomized study conducted at Sainte-Justine hospital in Montreal, QC, between January and March 2016. All postgraduate year (PGY) 1 to 6 residents will be invited to complete one 30-minute, audio-video-recorded, complex high-fidelity simulation (either SID or classical) covering a similar neonatology topic. Pre- and post-simulation questionnaires will be completed and a semistructured interview will be conducted after each simulation. Data analyses will use SPSS and NVivo software. Results This study is in its preliminary stages and the results are expected to be made available by April, 2016.
Conclusions This will be the first study to explore a new simulation approach designed to enhance clinical reasoning. By assessing reasoning processes more closely throughout a simulation session, we believe that Simulation with Iterative Discussions will be an interesting and more effective approach for students. The findings of the study will benefit medical educators, education programs, and medical students. PMID:26888076
Pennaforte, Thomas; Moussa, Ahmed; Loye, Nathalie; Charlin, Bernard; Audétat, Marie-Claude
2016-02-17
Helping trainees develop appropriate clinical reasoning abilities is a challenging goal in an environment where clinical situations are marked by high levels of complexity and unpredictability. The benefit of simulation-based education to assess clinical reasoning skills has rarely been reported. More specifically, it is unclear if clinical reasoning is better acquired if the instructor's input occurs entirely after or is integrated during the scenario. Based on educational principles of the dual-process theory of clinical reasoning, a new simulation approach called simulation with iterative discussions (SID) is introduced. The instructor interrupts the flow of the scenario at three key moments of the reasoning process (data gathering, integration, and confirmation). After each stop, the scenario is continued where it was interrupted. Finally, a brief general debriefing ends the session. The System-1 process of clinical reasoning is assessed by verbalization during management of the case, and System-2 during the iterative discussions, without providing feedback. The aim of this study is to evaluate the effectiveness of Simulation with Iterative Discussions versus the classical approach of simulation in developing the reasoning skills of General Pediatrics and Neonatal-Perinatal Medicine residents. This will be a prospective, exploratory, randomized study conducted at Sainte-Justine hospital in Montreal, QC, between January and March 2016. All postgraduate year (PGY) 1 to 6 residents will be invited to complete one 30-minute, audio-video-recorded, complex high-fidelity simulation (either SID or classical) covering a similar neonatology topic. Pre- and post-simulation questionnaires will be completed and a semistructured interview will be conducted after each simulation. Data analyses will use SPSS and NVivo software. This study is in its preliminary stages and the results are expected to be made available by April, 2016.
This will be the first study to explore a new simulation approach designed to enhance clinical reasoning. By assessing reasoning processes more closely throughout a simulation session, we believe that Simulation with Iterative Discussions will be an interesting and more effective approach for students. The findings of the study will benefit medical educators, education programs, and medical students.
Snyder, Christopher W; Vandromme, Marianne J; Tyra, Sharon L; Hawn, Mary T
2009-01-01
Virtual reality (VR) simulators for laparoscopy and endoscopy may be valuable tools for resident education. However, the cost of such training in terms of trainee and instructor time may vary depending upon whether an independent or proctored approach is employed. We performed a randomized controlled trial to compare independent and proctored methods of proficiency-based VR simulator training. Medical students were randomized to independent or proctored training groups. Groups were compared with respect to the number of training hours and task repetitions required to achieve expert level proficiency on laparoscopic and endoscopic simulators. Cox regression modeling was used to compare time to proficiency between groups, with adjustment for appropriate covariates. Thirty-six medical students (18 independent, 18 proctored) were enrolled. Achievement of overall simulator proficiency required a median of 11 hours of training (range, 6-21 hours). Laparoscopic and endoscopic proficiency were achieved after a median of 11 (range, 6-32) and 10 (range, 5-27) task repetitions, respectively. The number of repetitions required to achieve proficiency was similar between groups. After adjustment for covariates, trainees in the independent group achieved simulator proficiency with significantly fewer hours of training (hazard ratio, 2.62; 95% confidence interval, 1.01-6.85; p = 0.048). Our study quantifies the cost, in instructor and trainee hours, of proficiency-based laparoscopic and endoscopic VR simulator training, and suggests that proctored instruction does not offer any advantages to trainees. The independent approach may be preferable for surgical residency programs desiring to implement VR simulator training.
Direct evaluation of free energy for large system through structure integration approach.
Takeuchi, Kazuhito; Tanaka, Ryohei; Yuge, Koretaka
2015-09-30
We propose a new approach, 'structure integration', enabling direct evaluation of the configurational free energy for large systems. The present approach is based on statistical information about the lattice. Through first-principles-based simulation, we find that the present method evaluates the configurational free energy accurately in disordered states above the critical temperature.
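The target quantity of such a direct evaluation can be illustrated for a toy enumeration: given the energies and degeneracies of all configuration classes, the configurational free energy follows from the partition sum. The spectrum below is invented (arbitrary units); the paper's method is about reaching this quantity for systems far too large to enumerate.

```python
import math

def configurational_free_energy(energies, degeneracies, kT):
    """F = -kT * ln( sum_i g_i * exp(-E_i / kT) ), evaluated with a
    log-sum-exp shift so that very low temperatures do not underflow."""
    e_min = min(energies)
    z_shifted = sum(g * math.exp(-(e - e_min) / kT)
                    for e, g in zip(energies, degeneracies))
    return e_min - kT * math.log(z_shifted)

E = [0.0, 0.5, 1.2]   # energies of three configuration classes (toy values)
g = [1, 10, 100]      # number of configurations in each class
print(configurational_free_energy(E, g, kT=0.01))  # -> ~ ground-state energy
print(configurational_free_energy(E, g, kT=1.0))   # entropy lowers F
```

As kT -> 0 the free energy approaches the ground-state energy; at high temperature the highly degenerate excited classes dominate and pull F down.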
Understanding Cryptic Pocket Formation in Protein Targets by Enhanced Sampling Simulations.
Oleinikovas, Vladimiras; Saladino, Giorgio; Cossins, Benjamin P; Gervasio, Francesco L
2016-11-02
Cryptic pockets, that is, sites on protein targets that only become apparent when drugs bind, provide a promising alternative to classical binding sites for drug development. Here, we investigate the nature and dynamical properties of cryptic sites in four pharmacologically relevant targets, while comparing the efficacy of various simulation-based approaches in discovering them. We find that the studied cryptic sites do not correspond to local minima in the computed conformational free energy landscape of the unliganded proteins. They thus promptly close in all of the molecular dynamics simulations performed, irrespective of the force-field used. Temperature-based enhanced sampling approaches, such as Parallel Tempering, do not improve the situation, as the entropic term does not help in the opening of the sites. The use of fragment probes helps, as in long simulations it occasionally leads to the opening of, and binding to, the cryptic sites. Our observed mechanism of cryptic site formation is suggestive of an interplay between two classical mechanisms: induced-fit and conformational selection. Employing this insight, we developed a novel Hamiltonian Replica Exchange-based method "SWISH" (Sampling Water Interfaces through Scaled Hamiltonians), which combined with probes resulted in a promising general approach for cryptic site discovery. We also addressed the issue of "false-positives" and propose a simple approach to distinguish them from druggable cryptic pockets. Our simulations, whose cumulative sampling time was more than 200 μs, help in clarifying the molecular mechanism of pocket formation, providing a solid basis for the choice of an efficient computational method.
Approaches to Classroom-Based Computational Science.
ERIC Educational Resources Information Center
Guzdial, Mark
Computational science includes the use of computer-based modeling and simulation to define and test theories about scientific phenomena. The challenge for educators is to develop techniques for implementing computational science in the classroom. This paper reviews some previous work on the use of simulation alone (without modeling), modeling…
Similar negative impacts of temperature on global wheat yield estimated by three independent methods
USDA-ARS?s Scientific Manuscript database
The potential impact of global temperature change on global wheat production has recently been assessed with different methods, scaling and aggregation approaches. Here we show that grid-based simulations, point-based simulations, and statistical regressions produce similar estimates of temperature ...
High fidelity case-based simulation debriefing: everything you need to know.
Hart, Danielle; McNeil, Mary Ann; Griswold-Theodorson, Sharon; Bhatia, Kriti; Joing, Scott
2012-09-01
In this 30-minute talk, the authors take an in-depth look at how to debrief high-fidelity case-based simulation sessions, including discussion on debriefing theory, goals, approaches, and structure, as well as ways to create a supportive and safe learning environment, resulting in successful small group learning and self-reflection. Emphasis is placed on the "debriefing with good judgment" approach. Video clips of sample debriefing attempts, highlighting the "dos and don'ts" of simulation debriefing, are included. The goal of this talk is to provide you with the necessary tools and information to develop a successful and effective debriefing approach. There is a bibliography and a quick reference guide in Data Supplements S1 and S2 (available as supporting information in the online version of this paper). © 2012 by the Society for Academic Emergency Medicine.
Simulation in Higher Education: A Sociomaterial View
ERIC Educational Resources Information Center
Hopwood, Nick; Rooney, Donna; Boud, David; Kelly, Michelle
2016-01-01
This article presents a sociomaterial account of simulation in higher education. Sociomaterial approaches change the ontological and epistemological bases for understanding learning and offer valuable tools for addressing important questions about relationships between university education and professional practices. Simulation has grown in many…
Rezaeian, Sanaz; Hartzell, Stephen; Sun, Xiaodan; Mendoza, Carlos
2017-01-01
Earthquake ground‐motion recordings are scarce in the central and eastern United States (CEUS) for large‐magnitude events and at close distances. We use two different simulation approaches, a deterministic physics‐based method and a site‐based stochastic method, to simulate ground motions over a wide range of magnitudes. Drawing on previous results for the modeling of recordings from the 2011 Mw 5.8 Mineral, Virginia, earthquake and using the 2001 Mw 7.6 Bhuj, India, earthquake as a tectonic analog for a large magnitude CEUS event, we are able to calibrate the two simulation methods over this magnitude range. Both models show a good fit to the Mineral and Bhuj observations from 0.1 to 10 Hz. Model parameters are then adjusted to obtain simulations for Mw 6.5, 7.0, and 7.6 events in the CEUS. Our simulations are compared with the 2014 U.S. Geological Survey weighted combination of existing ground‐motion prediction equations in the CEUS. The physics‐based simulations show comparable response spectral amplitudes and a fairly similar attenuation with distance. The site‐based stochastic simulations suggest a slightly faster attenuation of the response spectral amplitudes with distance for larger magnitude events and, as a result, slightly lower amplitudes at distances greater than 200 km. Both models are plausible alternatives and, given the few available data points in the CEUS, can be used to represent the epistemic uncertainty in modeling of postulated CEUS large‐magnitude events.
Schwenke, Michael; Georgii, Joachim; Preusser, Tobias
2017-07-01
Focused ultrasound (FUS) is rapidly gaining clinical acceptance for several target tissues in the human body. Yet, treating liver targets is not clinically applied due to the high complexity of the procedure (noninvasiveness, target motion, complex anatomy, blood cooling effects, shielding by ribs, and limited image-based monitoring). To reduce the complexity, numerical FUS simulations can be utilized for both treatment planning and execution. These use cases demand highly accurate and computationally efficient simulations. We propose a numerical method for the simulation of abdominal FUS treatments during respiratory motion of the organs and target. In particular, a novel approach is proposed to simulate the heating during motion by solving Pennes' bioheat equation in a computational reference space, i.e., the equation is mathematically transformed to the reference. The approach allows for motion discontinuities, e.g., the sliding of the liver along the abdominal wall. Implementing the solver completely on the graphics processing unit and combining it with an atlas-based ultrasound simulation approach yields a simulation performance faster than real time (less than 50 s of computing time for 100 s of treatment time) on a modern off-the-shelf laptop. The simulation method is incorporated into a treatment planning demonstration application that allows simulation of real patient cases, including respiratory motion. The high performance of the presented simulation method opens the door to clinical applications. The methods bear the potential to enable the application of FUS for moving organs.
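The reference-space transformation is beyond a short sketch, but the underlying Pennes bioheat equation on a fixed grid is straightforward. Below is a minimal explicit 1D solve, rho*c*dT/dt = k*d2T/dx2 - w_b*c_b*(T - T_a) + Q, with typical soft-tissue coefficients and an assumed focal heat source; all values are illustrative, not the paper's.

```python
import numpy as np

def pennes_bioheat_1d(n=101, dx=1e-3, dt=0.05, t_end=20.0,
                      k=0.5, rho=1050.0, c=3600.0,
                      w_b=5.0, c_b=3800.0, T_a=37.0, q_focus=5e5):
    """Explicit 1D Pennes bioheat solve with a point-like FUS heat source
    at the grid center and body-temperature Dirichlet boundaries.
    Coefficients are typical soft-tissue values; the source is assumed."""
    T = np.full(n, T_a)
    Q = np.zeros(n)
    Q[n // 2] = q_focus                  # absorbed power density, W/m^3
    for _ in range(int(t_end / dt)):
        lap = np.zeros(n)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
        T = T + dt * (k * lap - w_b * c_b * (T - T_a) + Q) / (rho * c)
        T[0] = T[-1] = T_a               # body-temperature boundaries
    return T

T = pennes_bioheat_1d()
print(T.max() - 37.0)   # focal temperature rise in deg C
```

The paper's contribution is solving this equation in a reference space that moves with the tissue, so the perfusion and source terms stay attached to anatomy while the liver slides along the abdominal wall.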
Learner-Adaptive Educational Technology for Simulation in Healthcare: Foundations and Opportunities.
Lineberry, Matthew; Dev, Parvati; Lane, H Chad; Talbot, Thomas B
2018-06-01
Despite evidence that learners vary greatly in their learning needs, practical constraints tend to favor "one-size-fits-all" educational approaches, in simulation-based education as elsewhere. Adaptive educational technologies - devices and/or software applications that capture and analyze relevant data about learners to select and present individually tailored learning stimuli - are a promising aid in learners' and educators' efforts to provide learning experiences that meet individual needs. In this article, we summarize and build upon the 2017 Society for Simulation in Healthcare Research Summit panel discussion on adaptive learning. First, we consider the role of adaptivity in learning broadly. We then outline the basic functions that adaptive learning technologies must implement and the unique affordances and challenges of technology-based approaches for those functions, sharing an illustrative example from healthcare simulation. Finally, we consider future directions for accelerating research, development, and deployment of effective adaptive educational technology and techniques in healthcare simulation.
Kopyt, Paweł; Celuch, Małgorzata
2007-01-01
A practical implementation of a hybrid simulation system capable of modeling coupled electromagnetic-thermodynamic problems typical in microwave heating is described. The paper presents two approaches to modeling such problems. Both are based on an FDTD-based commercial electromagnetic solver coupled to an external thermodynamic analysis tool required for calculations of heat diffusion. The first approach utilizes a simple FDTD-based thermal solver while in the second it is replaced by a universal commercial CFD solver. The accuracy of the two modeling systems is verified against the original experimental data as well as the measurement results available in literature.
ERIC Educational Resources Information Center
Mawdesley, M.; Long, G.; Al-jibouri, S.; Scott, D.
2011-01-01
Computer based simulations and games can be useful tools in teaching aspects of construction project management that are not easily transmitted through traditional lecture based approaches. However, it can be difficult to quantify their utility and it is essential to ensure that students are achieving the learning outcomes required rather than…
Simulations and Games: Use and Barriers in Higher Education
ERIC Educational Resources Information Center
Lean, Jonathan; Moizer, Jonathan; Towler, Michael; Abbey, Caroline
2006-01-01
This article explores the use of simulations and games in tertiary education. It examines the extent to which academics use different simulation-based teaching approaches and how they perceive the barriers to adopting such techniques. Following a review of the extant literature, a typology of simulations is constructed. A staff survey within a UK…
Optimization Model for Web Based Multimodal Interactive Simulations.
Halic, Tansel; Ahn, Woojin; De, Suvranu
2015-07-15
This paper presents a technique for optimizing the performance of web based multimodal interactive simulations. For such applications where visual quality and the performance of simulations directly influence user experience, overloading of hardware resources may result in unsatisfactory reduction in the quality of the simulation and user satisfaction. However, optimization of simulation performance on individual hardware platforms is not practical. Hence, we present a mixed integer programming model to optimize the performance of graphical rendering and simulation performance while satisfying application specific constraints. Our approach includes three distinct phases: identification, optimization and update. In the identification phase, the computing and rendering capabilities of the client device are evaluated using an exploratory proxy code. This data is utilized in conjunction with user specified design requirements in the optimization phase to ensure best possible computational resource allocation. The optimum solution is used for rendering (e.g. texture size, canvas resolution) and simulation parameters (e.g. simulation domain) in the update phase. Test results are presented on multiple hardware platforms with diverse computing and graphics capabilities to demonstrate the effectiveness of our approach.
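As a toy stand-in for the mixed integer program (the paper's actual formulation and constraints are not reproduced here), exhaustive search over a small discrete design space shows the idea: pick the rendering/simulation settings with the best quality score whose assumed client-side cost fits the budget measured in the identification phase. The levels and the cost/quality models below are invented.

```python
from itertools import product

# hypothetical discrete design levels (illustrative, not from the paper)
texture_levels = [256, 512, 1024]     # texture size (px)
canvas_levels = [480, 720, 1080]      # canvas resolution (lines)
domain_levels = [16, 32, 64]          # simulation grid size per axis

def cost(tex, canvas, dom):
    """Assumed client load model: rendering plus simulation cost."""
    return 0.002 * tex + 0.01 * canvas + 0.0005 * dom ** 2

def quality(tex, canvas, dom):
    """Assumed user-experience score; richer settings score higher."""
    return tex / 1024 + canvas / 1080 + dom / 64

def optimize(budget):
    """Exhaustive search over the (small) integer design space, standing
    in for the mixed integer programming model of the abstract."""
    feasible = [(quality(t, c, d), (t, c, d))
                for t, c, d in product(texture_levels, canvas_levels, domain_levels)
                if cost(t, c, d) <= budget]
    return max(feasible)[1] if feasible else None

print(optimize(budget=10.0))   # weaker client -> constrained settings
print(optimize(budget=20.0))   # stronger client -> richest settings
```

With only a few dozen integer combinations, enumeration suffices; a real MIP solver becomes necessary once the design space and constraint set grow.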
Viscous and thermal modelling of thermoplastic composites forming process
NASA Astrophysics Data System (ADS)
Guzman, Eduardo; Liang, Biao; Hamila, Nahiene; Boisse, Philippe
2016-10-01
Thermoforming thermoplastic prepregs is a fast manufacturing process. It is suitable for automotive composite parts manufacturing. The simulation of thermoplastic prepreg forming is achieved by alternate thermal and mechanical analyses. The thermal properties are obtained from a mesoscopic analysis and a homogenization procedure. The forming simulation is based on a viscous-hyperelastic approach. The thermal simulations define the coefficients of the mechanical model that depend on the temperature. The forming simulations modify the boundary conditions and the internal geometry of the thermal analyses. The comparison of the simulation with an experimental thermoforming of a part representative of automotive applications shows the efficiency of the approach.
Mirrored continuum and molecular scale simulations of the ignition of high-pressure phases of RDX
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Kibaek; Stewart, D. Scott, E-mail: santc@illinois.edu, E-mail: dss@illinois.edu; Joshi, Kaushik
2016-05-14
We present a mirrored atomistic and continuum framework that is used to describe the ignition of energetic materials, and a high-pressure phase of RDX in particular. The continuum formulation uses meaningful averages of thermodynamic properties obtained from the atomistic simulation and a simplification of enormously complex reaction kinetics. In particular, components are identified based on molecular weight bin averages and our methodology assumes that both the averaged atomistic and continuum simulations are represented on the same time and length scales. The atomistic simulations of thermally initiated ignition of RDX are performed using reactive molecular dynamics (RMD). The continuum model is based on multi-component thermodynamics and uses a kinetics scheme that describes observed chemical changes of the averaged atomistic simulations. Thus the mirrored continuum simulations mimic the rapid change in pressure, temperature, and average molecular weight of species in the reactive mixture. This mirroring enables a new technique to simplify the chemistry obtained from reactive MD simulations while retaining the observed features and spatial and temporal scales from both the RMD and continuum model. The primary benefit of this approach is a potentially powerful, but familiar way to interpret the atomistic simulations and understand the chemical events and reaction rates. The approach is quite general and thus can provide a way to model chemistry based on atomistic simulations and extend the reach of those simulations.
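The molecular-weight-bin averaging used to define continuum pseudo-components can be sketched simply: lump each species from an RMD snapshot into a bin and report mole fractions. The species, counts, and bin edges below are invented for illustration, not taken from the RDX simulations.

```python
from bisect import bisect_right

def bin_mole_fractions(species_mw, counts, bin_edges):
    """Lump individual species from an atomistic (RMD) snapshot into
    continuum pseudo-components by molecular-weight bins; returns the
    mole fraction of each bin (len(bin_edges) + 1 bins)."""
    totals = [0] * (len(bin_edges) + 1)
    for mw, n in zip(species_mw, counts):
        totals[bisect_right(bin_edges, mw)] += n
    total = sum(totals)
    return [t / total for t in totals]

# toy decomposition snapshot: molecular weights and molecule counts
mw = [222.1, 46.0, 30.0, 44.0, 28.0]
counts = [10, 25, 40, 15, 10]
fracs = bin_mole_fractions(mw, counts, bin_edges=[40.0, 100.0])
print(fracs)   # light / medium / heavy pseudo-component fractions
```

Tracking these few bin fractions over time, rather than hundreds of individual species, is what makes the mirrored continuum kinetics tractable.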
NASA Astrophysics Data System (ADS)
Dörr, Dominik; Schirmaier, Fabian J.; Henning, Frank; Kärger, Luise
2017-10-01
Finite Element (FE) forming simulation offers the possibility of a detailed analysis of the deformation behavior of multilayered thermoplastic blanks during forming, considering material behavior and process conditions. Rate-dependent bending behavior is a material characteristic which is so far not considered in FE forming simulation of pre-impregnated, continuously fiber-reinforced polymers (CFRPs). Therefore, an approach for modeling viscoelastic bending behavior in FE composite forming simulation is presented in this work. The presented approach accounts for the distinct rate-dependent bending behavior of, e.g., thermoplastic CFRPs at process conditions. The approach is based on a Voigt-Kelvin (VK) and a generalized Maxwell (GM) approach, implemented within an FE forming simulation framework realized in several user subroutines of the commercially available FE solver Abaqus. The VK, GM, as well as purely elastic bending modeling approaches are parameterized according to dynamic bending characterization results for a PA6-CF UD-tape. It is found that only the GM approach is capable of representing the bending deformation characteristic for all of the considered bending deformation rates. The parameterized bending modeling approaches are applied to a hemisphere test and to a generic geometry. A comparison of the forming simulation results for the generic geometry to experimental tests shows good agreement between simulation and experiments. Furthermore, the simulation results reveal that especially a correct modeling of the initial bending stiffness is relevant for the prediction of wrinkling behavior, as a similar onset of wrinkles is observed for the GM, the VK and an elastic approach fitted to the stiffness observed in the dynamic rheometer test for low curvatures. Hence, characterization and modeling of rate-dependent bending behavior is crucial for FE forming simulation of thermoplastic CFRPs.
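The rate dependence that makes the GM approach necessary can be demonstrated with a scalar generalized Maxwell (Prony series) model: at the same strain, a faster loading rate leaves less time for the branches to relax and therefore yields a higher stress. The moduli and relaxation times below are illustrative, not the PA6-CF fit.

```python
def gm_stress_at_strain(strain_rate, strain_target, dt=1e-4,
                        E_inf=1.0, branches=((5.0, 0.05), (2.0, 0.5))):
    """Stress of a generalized Maxwell (Prony series) model loaded at a
    constant strain rate, integrated explicitly up to a target strain.
    E_inf is the long-term modulus; each branch is (E_i, tau_i).
    All parameter values are illustrative assumptions."""
    s = [0.0] * len(branches)
    eps = 0.0
    while eps < strain_target:
        eps += strain_rate * dt
        for i, (E_i, tau_i) in enumerate(branches):
            # branch evolution: dS_i/dt = E_i * eps_dot - S_i / tau_i
            s[i] += (E_i * strain_rate - s[i] / tau_i) * dt
    return E_inf * eps + sum(s)

slow = gm_stress_at_strain(strain_rate=0.1, strain_target=0.05)
fast = gm_stress_at_strain(strain_rate=10.0, strain_target=0.05)
print(slow, fast)   # same strain, but the faster rate carries higher stress
```

A single Voigt-Kelvin element cannot reproduce this behavior across widely separated rates, which matches the paper's finding that only the GM approach fits all considered bending rates.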
Winkler, Daniel; Zischg, Jonatan; Rauch, Wolfgang
2018-01-01
For communicating urban flood risk to authorities and the public, a realistic three-dimensional visual display is frequently more suitable than detailed flood maps. Virtual reality could also serve to plan short-term flooding interventions. We introduce here an alternative approach for simulating three-dimensional flooding dynamics in large- and small-scale urban scenes by drawing on methods from computer graphics. This approach, denoted 'particle in cell', is a particle-based CFD method that is used to predict physically plausible results rather than accurate flow dynamics. We exemplify the approach using the real flooding event of July 2016 in Innsbruck.
Kyriakou, Adamos; Neufeld, Esra; Werner, Beat; Székely, Gábor; Kuster, Niels
2015-01-01
Transcranial focused ultrasound (tcFUS) is an attractive noninvasive modality for neurosurgical interventions. The presence of the skull, however, compromises the efficiency of tcFUS therapy, as its heterogeneous nature and acoustic characteristics induce significant distortion of the acoustic energy deposition, focal shifts, and thermal gain decrease. Phased-array transducers allow for partial compensation of skull-induced aberrations by application of precalculated phase and amplitude corrections. An integrated numerical framework allowing for 3D full-wave, nonlinear acoustic and thermal simulations has been developed and applied to tcFUS. Simulations were performed to investigate the impact of skull aberrations, the possibility of extending the treatment envelope, and adverse secondary effects. The simulated setup comprised an idealized model of the ExAblate Neuro and a detailed MR-based anatomical head model. Four different approaches were employed to calculate aberration corrections (analytical calculation of the aberration corrections disregarding tissue heterogeneities; a semi-analytical ray-tracing approach compensating for the presence of the skull; and two simulation-based time-reversal approaches, with and without pressure amplitude corrections, which account for the entire anatomy). The impact of these approaches on the pressure and temperature distributions was evaluated for 22 brain targets. While the (semi-)analytical approaches failed to induce high pressure or ablative temperatures in any but the targets in the close vicinity of the geometric focus, the simulation-based approaches indicate the possibility of considerably extending the treatment envelope (including targets below the transducer level and locations several centimeters off the geometric focus), generation of sharper foci, and increased targeting accuracy.
While the prediction of achievable aberration correction appears to be unaffected by the detailed bone structure, proper consideration of inhomogeneity is required to predict the pressure distribution for given steering parameters. Simulation-based approaches to calculate aberration corrections may aid in the extension of the tcFUS treatment envelope as well as predict and avoid secondary effects (standing waves, skull heating). Due to their superior performance, simulation-based techniques may prove invaluable in the amelioration of skull-induced aberration effects in tcFUS therapy. The next steps are to investigate shear-wave-induced effects in order to reliably exclude secondary hot-spots, and to develop comprehensive uncertainty assessment and validation procedures.
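The phase-conjugation principle behind such time-reversal corrections can be sketched for a homogeneous medium (a toy stand-in for the full-wave simulations; the element layout and wavenumber in the usage below are illustrative, and a real skull correction would replace the straight-line phase k·d with a simulated or ray-traced delay):

```python
import cmath
import math

def time_reversal_phases(element_positions, target, wavenumber):
    """Phase conjugation in a homogeneous medium: drive each array element
    with the negated phase accumulated along the straight-line path to the
    target, so all contributions arrive in phase at the intended focus."""
    return [-(wavenumber * math.dist(p, target)) % (2 * math.pi)
            for p in element_positions]

def focal_gain(element_positions, target, wavenumber):
    """Coherence of the summed field at the focus: 1.0 means perfectly in phase."""
    phases = time_reversal_phases(element_positions, target, wavenumber)
    total = sum(cmath.exp(1j * (phi + wavenumber * math.dist(p, target)))
                for phi, p in zip(phases, element_positions))
    return abs(total) / len(element_positions)
```

For any element arrangement the conjugated phases cancel the propagation delays exactly, giving a focal gain of 1.0; skull heterogeneity is precisely what breaks this cancellation and motivates the simulation-based corrections above.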
DAMS: A Model to Assess Domino Effects by Using Agent-Based Modeling and Simulation.
Zhang, Laobing; Landucci, Gabriele; Reniers, Genserik; Khakzad, Nima; Zhou, Jianfeng
2017-12-19
Historical data analysis shows that escalation accidents, so-called domino effects, have an important role in disastrous accidents in the chemical and process industries. In this study, an agent-based modeling and simulation approach is proposed to study the propagation of domino effects in the chemical and process industries. Different from the analytical or Monte Carlo simulation approaches, which normally study the domino effect at probabilistic network levels, the agent-based modeling technique explains the domino effects from a bottom-up perspective. In this approach, the installations involved in a domino effect are modeled as agents whereas the interactions among the installations (e.g., by means of heat radiation) are modeled via the basic rules of the agents. Application of the developed model to several case studies demonstrates the ability of the model not only in modeling higher-level domino effects and synergistic effects but also in accounting for temporal dependencies. The model can readily be applied to large-scale complicated cases. © 2017 Society for Risk Analysis.
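The bottom-up flavor of this agent-based view can be conveyed with a toy sketch (not the authors' DAMS model; the inverse-square radiation rule, threshold, and layout are invented): each installation is an agent that accumulates heat flux from burning neighbours and fails once a threshold is exceeded.

```python
def simulate_domino(positions, failure_threshold, radiant_power, steps):
    """Toy agent-based domino propagation. Each installation (agent) receives
    heat from every burning installation, attenuated as 1/d^2, and starts
    burning once the received flux exceeds its failure threshold.
    Installation 0 is the primary accident. Returns the burning state per step."""
    n = len(positions)
    burning = [False] * n
    burning[0] = True
    history = [tuple(burning)]
    for _ in range(steps):
        flux = [0.0] * n
        for i in range(n):
            if burning[i]:
                for j in range(n):
                    if i != j and not burning[j]:
                        dx = positions[i][0] - positions[j][0]
                        dy = positions[i][1] - positions[j][1]
                        flux[j] += radiant_power / (dx * dx + dy * dy)
        for j in range(n):
            if not burning[j] and flux[j] >= failure_threshold:
                burning[j] = True
        history.append(tuple(burning))
    return history
```

Running this on a row of closely spaced tanks with one distant tank shows the escalation and synergistic effects described above: nearby tanks ignite in sequence (later with combined flux from several fires), while the remote tank survives.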
Agent-Based Simulations for Project Management
NASA Technical Reports Server (NTRS)
White, J. Chris; Sholtes, Robert M.
2011-01-01
Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
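The duration-as-input scheme that the abstract criticizes is easy to make concrete: the CPM forward pass simply propagates fixed task durations through the precedence graph. A minimal sketch (task names and durations are invented):

```python
def critical_path(tasks):
    """Forward-pass Critical Path Method: computes each task's earliest
    finish from fixed input durations and returns the project duration.
    tasks: dict mapping name -> (duration, [predecessor names])."""
    earliest_finish = {}

    def finish(name):
        if name not in earliest_finish:
            duration, preds = tasks[name]
            # a task can start only after all predecessors finish
            earliest_finish[name] = duration + max(
                (finish(p) for p in preds), default=0.0)
        return earliest_finish[name]

    return max(finish(name) for name in tasks)
```

Note that durations enter as givens; a resource-based simulation of the kind described above would instead derive them from resource assignments and productivity, which is exactly the paradigm shift the article proposes.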
Optimization of space system development resources
NASA Astrophysics Data System (ADS)
Kosmann, William J.; Sarkani, Shahram; Mazzuchi, Thomas
2013-06-01
NASA has had a decades-long problem with cost growth during the development of space science missions. Numerous agency-sponsored studies have produced average mission level cost growths ranging from 23% to 77%. A new study of 26 historical NASA Science instrument set developments using expert judgment to reallocate key development resources has an average cost growth of 73.77%. Twice in history, a barter-based mechanism has been used to reallocate key development resources during instrument development. The mean instrument set development cost growth was -1.55%. Performing a bivariate inference on the means of these two distributions, there is statistical evidence to support the claim that using a barter-based mechanism to reallocate key instrument development resources will result in a lower expected cost growth than using the expert judgment approach. Agent-based discrete event simulation is the natural way to model a trade environment. A NetLogo agent-based barter-based simulation of science instrument development was created. The agent-based model was validated against the Cassini historical example, as the starting and ending instrument development conditions are available. The resulting validated agent-based barter-based science instrument resource reallocation simulation was used to perform 300 instrument development simulations, using barter to reallocate development resources. The mean cost growth was -3.365%. A bivariate inference on the means was performed to determine that additional significant statistical evidence exists to support a claim that using barter-based resource reallocation will result in lower expected cost growth, with respect to the historical expert judgment approach. Barter-based key development resource reallocation should work on spacecraft development as well as it has worked on instrument development. A new study of 28 historical NASA science spacecraft developments has an average cost growth of 46.04%. 
As barter-based key development resource reallocation has never been tried in a spacecraft development, no historical results exist, and a simulation of using that approach must be developed. The instrument development simulation should be modified to account for spacecraft development market participant differences. The resulting agent-based barter-based spacecraft resource reallocation simulation would then be used to determine if significant statistical evidence exists to prove a claim that using barter-based resource reallocation will result in lower expected cost growth.
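The "bivariate inference on the means" used to compare expert-judgment and barter-based cost growth can be illustrated with a simple nonparametric stand-in (a permutation test rather than whatever specific test the study used; the sample values below are invented, not the study's data):

```python
import random
import statistics

def mean_diff_pvalue(sample_a, sample_b, n_perm=10000, seed=0):
    """One-sided permutation test for H1: mean(a) > mean(b). Returns the
    fraction of random label permutations whose mean difference is at
    least as large as the observed one."""
    rng = random.Random(seed)
    observed = statistics.mean(sample_a) - statistics.mean(sample_b)
    pooled = list(sample_a) + list(sample_b)
    n_a = len(sample_a)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = statistics.mean(pooled[:n_a]) - statistics.mean(pooled[n_a:])
        if diff >= observed:
            hits += 1
    return hits / n_perm
```

With hypothetical cost-growth percentages such as those below (expert judgment clustered near +75%, barter near 0%), the p-value is far below conventional significance thresholds, which is the shape of evidence the study reports.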
Bryce, Richard A
2011-04-01
The ability to accurately predict the interaction of a ligand with its receptor is a key limitation in computer-aided drug design approaches such as virtual screening and de novo design. In this article, we examine current strategies for a physics-based approach to scoring of protein-ligand affinity, as well as outlining recent developments in force fields and quantum chemical techniques. We also consider advances in the development and application of simulation-based free energy methods to study protein-ligand interactions. Fuelled by recent advances in computational algorithms and hardware, there is the opportunity for increased integration of physics-based scoring approaches at earlier stages in computationally guided drug discovery. Specifically, we envisage increased use of implicit solvent models and simulation-based scoring methods as tools for computing the affinities of large virtual ligand libraries. Approaches based on end point simulations and reference potentials allow the application of more advanced potential energy functions to prediction of protein-ligand binding affinities. Comprehensive evaluation of polarizable force fields and quantum mechanical (QM)/molecular mechanical and QM methods in scoring of protein-ligand interactions is required, particularly in their ability to address challenging targets such as metalloproteins and other proteins that make highly polar interactions. Finally, we anticipate increasingly quantitative free energy perturbation and thermodynamic integration methods that are practical for optimization of hits obtained from screened ligand libraries.
Predicting pedestrian flow: a methodology and a proof of concept based on real-life data.
Davidich, Maria; Köster, Gerta
2013-01-01
Building a reliable predictive model of pedestrian motion is very challenging: Ideally, such models should be based on observations made in both controlled experiments and in real-world environments. De facto, models are rarely based on real-world observations due to the lack of available data; instead, they are largely based on intuition and, at best, literature values and laboratory experiments. Such an approach is insufficient for reliable simulations of complex real-life scenarios: For instance, our analysis of pedestrian motion under natural conditions at a major German railway station reveals that the values for free-flow velocities and the flow-density relationship differ significantly from widely used literature values. It is thus necessary to calibrate and validate the model against relevant real-life data to make it capable of reproducing and predicting real-life scenarios. In this work we aim at constructing such realistic pedestrian stream simulation. Based on the analysis of real-life data, we present a methodology that identifies key parameters and interdependencies that enable us to properly calibrate the model. The success of the approach is demonstrated for a benchmark model, a cellular automaton. We show that the proposed approach significantly improves the reliability of the simulation and hence the potential prediction accuracy. The simulation is validated by comparing the local density evolution of the measured data to that of the simulated data. We find that for our model the most sensitive parameters are: the source-target distribution of the pedestrian trajectories, the schedule of pedestrian appearances in the scenario and the mean free-flow velocity. Our results emphasize the need for real-life data extraction and analysis to enable predictive simulations.
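The benchmark model class used here, a cellular automaton, can be sketched in a few lines (a toy with a 4-neighbourhood and greedy target-seeking, not the authors' calibrated model; grid size and update rule are invented):

```python
def step_pedestrians(grid_size, peds, target):
    """One synchronous update of a toy pedestrian cellular automaton: each
    pedestrian moves to a free neighbouring cell that reduces its Manhattan
    distance to the target; conflicts are resolved by iteration order."""
    occupied = set(peds)
    new_peds = []
    for (x, y) in peds:
        best = (x, y)
        best_d = abs(x - target[0]) + abs(y - target[1])
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < grid_size and 0 <= ny < grid_size and (nx, ny) not in occupied:
                d = abs(nx - target[0]) + abs(ny - target[1])
                if d < best_d:
                    best, best_d = (nx, ny), d
        occupied.discard((x, y))
        occupied.add(best)
        new_peds.append(best)
    return new_peds
```

The calibration parameters the study identifies as most sensitive (source-target distribution, appearance schedule, mean free-flow velocity) correspond here to where pedestrians are injected, when, and how many cells they may advance per step.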
An Evolutionary Optimization of the Refueling Simulation for a CANDU Reactor
NASA Astrophysics Data System (ADS)
Do, Q. B.; Choi, H.; Roh, G. H.
2006-10-01
This paper presents a multi-cycle and multi-objective optimization method for the refueling simulation of a 713 MWe Canada deuterium uranium (CANDU-6) reactor based on a genetic algorithm, an elitism strategy and a heuristic rule. The proposed algorithm searches for the optimal refueling patterns for a single cycle that maximize the average discharge burnup, minimize the maximum channel power and minimize the change in the zone controller unit water fills while satisfying the most important safety-related neutronic parameters of the reactor core. The heuristic rule generates an initial population of individuals very close to a feasible solution, which reduces the computing time of the optimization process. The multi-cycle optimization is carried out based on a single-cycle refueling simulation. The proposed approach was verified by a refueling simulation of a natural uranium CANDU-6 reactor for an operation period of 6 months at an equilibrium state and compared with the experience-based automatic refueling simulation and the generalized perturbation theory. The comparison has shown that the simulation results are consistent with each other and that the proposed approach is a reasonable optimization method for the refueling simulation, controlling all the safety-related parameters of the reactor core during the simulation.
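The genetic-algorithm-with-elitism skeleton underlying such an optimizer can be sketched generically (this is not the paper's refueling-pattern encoding or its heuristic seeding rule; individuals here are simply lists of floats and the fitness is a stand-in):

```python
import random

def evolve(fitness, init_pop, generations, elite=2, mut_rate=0.1, seed=1):
    """Minimal genetic algorithm with elitism: the best `elite` individuals
    survive unchanged; the rest of the next generation is filled with
    Gaussian-mutated copies of tournament-selected parents."""
    rng = random.Random(seed)
    pop = [list(ind) for ind in init_pop]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        next_pop = [list(ind) for ind in pop[:elite]]  # elitism
        while len(next_pop) < len(pop):
            a, b = rng.sample(pop, 2)  # size-2 tournament selection
            parent = a if fitness(a) >= fitness(b) else b
            next_pop.append([g + rng.gauss(0.0, mut_rate) for g in parent])
        pop = next_pop
    return max(pop, key=fitness)
```

In the paper's setting the fitness would aggregate the multiple objectives (discharge burnup, channel power, zone-controller fills) subject to the safety constraints, and the heuristic rule would supply `init_pop` with near-feasible individuals, which is what shortens the search.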
Rapid simulation of X-ray transmission imaging for baggage inspection via GPU-based ray-tracing
NASA Astrophysics Data System (ADS)
Gong, Qian; Stoian, Razvan-Ionut; Coccarelli, David S.; Greenberg, Joel A.; Vera, Esteban; Gehm, Michael E.
2018-01-01
We present a pipeline that rapidly simulates X-ray transmission imaging for arbitrary system architectures using GPU-based ray-tracing techniques. The purpose of the pipeline is to enable statistical analysis of threat detection in the context of airline baggage inspection. As a faster alternative to Monte Carlo methods, we adopt a deterministic approach for simulating photoelectric absorption-based imaging. The highly-optimized NVIDIA OptiX API is used to implement ray-tracing, greatly speeding code execution. In addition, we implement the first hierarchical representation structure to determine the interaction path length of rays traversing heterogeneous media described by layered polygons. The accuracy of the pipeline has been validated by comparing simulated data with experimental data collected using a heterogeneous phantom and a laboratory X-ray imaging system. On a single computer, our approach allows us to generate over 400 2D transmission projections (125 × 125 pixels per frame) per hour for a bag packed with hundreds of everyday objects. By implementing our approach on cloud-based GPU computing platforms, we find that the same 2D projections of approximately 3.9 million bags can be obtained in a single day using 400 GPU instances, at a cost of only 0.001 per bag.
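The core deterministic step, attenuating a ray by its per-material path lengths, is just Beer-Lambert integration. A minimal 1D-slab sketch (not the OptiX implementation; geometry is reduced to axis-aligned slabs along x, and the ray must not be parallel to them):

```python
import math

def slab_path_lengths(origin, direction, slabs):
    """Path length of a ray inside each axis-aligned slab along x.
    slabs: list of (x_min, x_max, mu) with mu the linear attenuation
    coefficient. The direction need not be unit length; its x component
    must be nonzero."""
    dx = direction[0]
    norm = math.sqrt(sum(c * c for c in direction))
    lengths = []
    for x_min, x_max, mu in slabs:
        t0 = (x_min - origin[0]) / dx
        t1 = (x_max - origin[0]) / dx
        t_in, t_out = max(min(t0, t1), 0.0), max(t0, t1)
        lengths.append((max(t_out - t_in, 0.0) * norm, mu))
    return lengths

def transmission(origin, direction, slabs):
    """Beer-Lambert transmission I/I0 = exp(-sum(mu_i * L_i))."""
    return math.exp(-sum(L * mu for L, mu in slab_path_lengths(origin, direction, slabs)))
```

The pipeline's hierarchical representation serves exactly to compute these per-material path lengths efficiently for layered-polygon geometry instead of simple slabs.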
Calculation for simulation of archery goal value using a web camera and ultrasonic sensor
NASA Astrophysics Data System (ADS)
Rusjdi, Darma; Abdurrasyid, Wulandari, Dewi Arianti
2017-08-01
Development of a digital indoor archery simulator based on embedded systems is a solution to the lack of adequate fields or open spaces, especially in big cities. Development of the device requires simulations that calculate the score achieved on the target, based on a parabolic-motion approach defined by the initial velocity and the direction of motion of the arrow as it reaches the target. The simulator device should be complemented with an initial-velocity measuring device using ultrasonic sensors and a device measuring the direction to the target using a digital camera. The methodology uses research and development of application software following a modeling and simulation approach. The research objective is to create simulation applications that calculate the score achieved by the arrows, as a preliminary stage in the development of the archery simulator device. Implementing the score calculation in an application program yields an archery simulation game that can serve as a reference for developing a digital indoor archery simulator with embedded systems using ultrasonic sensors and web cameras. The simulation calculation was developed by comparison against the outer radius of the target circle captured by a camera from a distance of three meters.
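The parabolic-motion calculation described above reduces to drag-free projectile kinematics. A minimal sketch (the function name and the drag-free assumption are ours, not the paper's code):

```python
import math

def arrow_drop(v0, angle_deg, distance, g=9.81):
    """Vertical position of the arrow when it crosses the target plane,
    relative to the launch height, for initial speed v0 (m/s), launch angle
    in degrees, and horizontal distance in meters. Assumes drag-free
    parabolic motion, as in the simulation approach described above."""
    theta = math.radians(angle_deg)
    vx = v0 * math.cos(theta)
    vy = v0 * math.sin(theta)
    t = distance / vx            # time of flight to the target plane
    return vy * t - 0.5 * g * t * t
```

Scoring would then compare this vertical offset (and the lateral offset estimated from the camera image) against the ring radii of the target face.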
NASA Astrophysics Data System (ADS)
Barnes, Brian C.; Leiter, Kenneth W.; Becker, Richard; Knap, Jaroslaw; Brennan, John K.
2017-07-01
We describe the development, accuracy, and efficiency of an automation package for molecular simulation, the large-scale atomic/molecular massively parallel simulator (LAMMPS) integrated materials engine (LIME). Heuristics and algorithms employed for equation of state (EOS) calculation using a particle-based model of a molecular crystal, hexahydro-1,3,5-trinitro-s-triazine (RDX), are described in detail. The simulation method for the particle-based model is energy-conserving dissipative particle dynamics, but the techniques used in LIME are generally applicable to molecular dynamics simulations with a variety of particle-based models. The newly created tool set is tested through use of its EOS data in plate impact and Taylor anvil impact continuum simulations of solid RDX. The coarse-grain model results from LIME provide an approach to bridge the scales from atomistic simulations to continuum simulations.
Experimental measurements of motion cue effects on STOL approach tasks
NASA Technical Reports Server (NTRS)
Ringland, R. F.; Stapleford, R. L.
1972-01-01
An experimental program to investigate the effects of motion cues on STOL approach is presented. The simulator used was the Six-Degrees-of-Freedom Motion Simulator (S.01) at Ames Research Center of NASA which has ±2.7 m travel longitudinally and laterally and ±2.5 m travel vertically. Three major experiments, characterized as tracking tasks, were conducted under fixed and moving base conditions: (1) A simulated IFR approach of the Augmentor Wing Jet STOL Research Aircraft (AWJSRA), (2) a simulated VFR task with the same aircraft, and (3) a single-axis task having only linear acceleration as the motion cue. Tracking performance was measured in terms of the variances of several motion variables, pilot vehicle describing functions, and pilot commentary.
Engaging Workers in Simulation-Based E-Learning
ERIC Educational Resources Information Center
Slotte, Virpi; Herbert, Anne
2008-01-01
Purpose: The purpose of this paper is to evaluate learners' attitudes to the use of simulation-based e-learning as part of workplace learning when socially situated interaction and blended learning are specifically included in the instructional design. Design/methodology/approach: Responses to a survey questionnaire of 298 sales personnel were…
A parallel algorithm for switch-level timing simulation on a hypercube multiprocessor
NASA Technical Reports Server (NTRS)
Rao, Hariprasad Nannapaneni
1989-01-01
The parallel approach to speeding up simulation is studied, specifically the simulation of digital LSI MOS circuitry on the Intel iPSC/2 hypercube. The simulation algorithm is based on RSIM, an event driven switch-level simulator that incorporates a linear transistor model for simulating digital MOS circuits. Parallel processing techniques based on the concepts of Virtual Time and rollback are utilized so that portions of the circuit may be simulated on separate processors, in parallel for as large an increase in speed as possible. A partitioning algorithm is also developed in order to subdivide the circuit for parallel processing.
GPU-based efficient realistic techniques for bleeding and smoke generation in surgical simulators.
Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu
2010-12-01
In actual surgery, smoke and bleeding due to cauterization processes provide important visual cues to the surgeon, which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated the effects of bleeding and smoke generation, they are not realistic due to the requirement of real-time performance. To be interactive, visual update must be performed at at least 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Simulation of smoke and bleeding is, therefore, either ignored or simulated using highly simplified techniques, since other computationally intensive processes compete for the available Central Processing Unit (CPU) resources. In this study we developed a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators, which outsources the computations to the graphical processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed using 20 subjects to determine the visual quality of the simulations compared to real surgical videos. The smoke and bleeding simulation were implemented as part of a laparoscopic adjustable gastric banding (LAGB) simulator. For the bleeding simulation, the original implementation using the shader did not incur noticeable overhead. However, for smoke generation, an input/output (I/O) bottleneck was observed and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale).
Based on the performance results and subject study, both bleeding and smoke simulations were concluded to be efficient, highly realistic and well suited to VR-based surgical simulators. Copyright © 2010 John Wiley & Sons, Ltd.
GPU-based Efficient Realistic Techniques for Bleeding and Smoke Generation in Surgical Simulators
Halic, Tansel; Sankaranarayanan, Ganesh; De, Suvranu
2010-01-01
Background In actual surgery, smoke and bleeding due to cautery processes provide important visual cues to the surgeon which have been proposed as factors in surgical skill assessment. While several virtual reality (VR)-based surgical simulators have incorporated effects of bleeding and smoke generation, they are not realistic due to the requirement of real time performance. To be interactive, visual update must be performed at at least 30 Hz and haptic (touch) information must be refreshed at 1 kHz. Simulation of smoke and bleeding is, therefore, either ignored or simulated using highly simplified techniques since other computationally intensive processes compete for the available CPU resources. Methods In this work, we develop a novel low-cost method to generate realistic bleeding and smoke in VR-based surgical simulators which outsources the computations to the graphical processing unit (GPU), thus freeing up the CPU for other time-critical tasks. This method is independent of the complexity of the organ models in the virtual environment. User studies were performed using 20 subjects to determine the visual quality of the simulations compared to real surgical videos. Results The smoke and bleeding simulation were implemented as part of a Laparoscopic Adjustable Gastric Banding (LAGB) simulator. For the bleeding simulation, the original implementation using the shader did not incur noticeable overhead. However, for smoke generation, an I/O (Input/Output) bottleneck was observed and two different methods were developed to overcome this limitation. Based on our benchmark results, a buffered approach performed better than a pipelined approach and could support up to 15 video streams in real time. Human subject studies showed that the visual realism of the simulations was as good as in real surgery (median rating of 4 on a 5-point Likert scale).
Conclusions Based on the performance results and subject study, both bleeding and smoke simulations were concluded to be efficient, highly realistic and well suited to VR-based surgical simulators. PMID:20878651
NASA Technical Reports Server (NTRS)
Stupl, Jan; Faber, Nicolas; Foster, Cyrus; Yang, Fan Yang; Nelson, Bron; Aziz, Jonathan; Nuttall, Andrew; Henze, Chris; Levit, Creon
2014-01-01
This paper provides an updated efficiency analysis of the LightForce space debris collision avoidance scheme. LightForce aims to prevent collisions on warning by utilizing photon pressure from ground-based, commercial off-the-shelf lasers. Past research has shown that a few ground-based systems consisting of 10-kilowatt-class lasers directed by 1.5 meter telescopes with adaptive optics could lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. Our simulation approach utilizes the entire Two Line Element (TLE) catalogue in LEO for a given day as initial input. Least-squares fitting of a TLE time series is used for an improved orbit estimate. We then calculate the probability of collision for all LEO objects in the catalogue for a time step of the simulation. The conjunctions that exceed a threshold probability of collision are then engaged by a simulated network of laser ground stations. After those engagements, the perturbed orbits are used to re-assess the probability of collision and evaluate the efficiency of the system. This paper describes new simulations with three updated aspects: 1) By utilizing a highly parallel simulation approach employing hundreds of processors, we have extended our analysis to a much broader dataset. The simulation time is extended to one year. 2) We analyze not only the efficiency of LightForce on conjunctions that naturally occur, but also take into account conjunctions caused by orbit perturbations due to LightForce engagements. 3) We use a new simulation approach that is regularly updating the LightForce engagement strategy, as it would be during actual operations. In this paper we present our simulation approach to parallelize the efficiency analysis, its computational performance and the resulting expected efficiency of the LightForce collision avoidance system.
Results indicate that utilizing a network of four LightForce stations with 20 kilowatt lasers, 85% of all conjunctions with a probability of collision Pc > 10^-6 can be mitigated.
Efficient rejection-based simulation of biochemical reactions with stochastic noise and delays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thanh, Vo Hong, E-mail: vo@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; Department of Mathematics, University of Trento
2014-10-07
We propose a new exact stochastic rejection-based simulation algorithm for biochemical reactions and extend it to systems with delays. Our algorithm accelerates the simulation by pre-computing reaction propensity bounds to select the next reaction to perform. Exploiting such bounds, we are able to avoid recomputing propensities every time a (delayed) reaction is initiated or finished, as is typically necessary in standard approaches. Propensity updates in our approach are still performed, but only infrequently and limited to a small number of reactions, saving computation time without sacrificing exactness. We evaluate the performance improvement of our algorithm by experimenting with concrete biological models.
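The rejection idea can be sketched in a simplified form (this is a toy reading of the rejection-based SSA, not the authors' full algorithm with delays and interval bound management; the decay model in the usage is invented): candidates are drawn from fixed propensity upper bounds, and the exact propensity is evaluated only in the acceptance test.

```python
import random

def rssa_step(state, reactions, bounds, rng):
    """One step of a simplified rejection-based SSA. `reactions` is a list
    of (propensity_fn, state_change) pairs; `bounds` holds an upper bound
    for each propensity, assumed valid for the current state. Exact
    propensities are evaluated only on acceptance tests, not after every
    firing. Returns the elapsed time of the step."""
    total = sum(bounds)
    tau = 0.0
    while True:
        tau += rng.expovariate(total)
        # select a candidate in proportion to its propensity *bound*
        r = rng.random() * total
        j, acc = 0, bounds[0]
        while acc < r:
            j += 1
            acc += bounds[j]
        # accept with probability (true propensity) / (bound)
        if rng.random() * bounds[j] <= reactions[j][0](state):
            break
    for species, delta in reactions[j][1].items():
        state[species] += delta
    return tau
```

For a monotonically decreasing propensity (such as first-order decay) the initial bound stays valid for the whole run, so no propensity updates are ever needed; in general the bounds are refreshed only when the state leaves its assumed interval, which is the source of the speedup described above.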
Galactic Cosmic Ray Simulator at the NASA Space Radiation Laboratory
NASA Technical Reports Server (NTRS)
Norbury, John W.; Slaba, Tony C.; Rusek, Adam
2015-01-01
The external Galactic Cosmic Ray (GCR) spectrum is significantly modified when it passes through spacecraft shielding and astronauts. One approach for simulating the GCR space radiation environment is to attempt to reproduce the unmodified, external GCR spectrum at a ground-based accelerator. A possibly better approach would use the modified, shielded tissue spectrum to select accelerator beams impinging on biological targets. NASA plans for implementation of a GCR simulator at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory will be discussed.
NASA Astrophysics Data System (ADS)
Hauptmann, S.; Bülk, M.; Schön, L.; Erbslöh, S.; Boorsma, K.; Grasso, F.; Kühn, M.; Cheng, P. W.
2014-12-01
Design load simulations for wind turbines are traditionally based on the blade-element-momentum theory (BEM). The BEM approach is derived from a simplified representation of the rotor aerodynamics and several semi-empirical correction models. A more sophisticated approach to account for the complex flow phenomena on wind turbine rotors can be found in the lifting-line free vortex wake method. This approach is based on a more physics-based representation, especially for global flow effects. This theory relies on empirical correction models only for the local flow effects, which are associated with the boundary layer of the rotor blades. In this paper the lifting-line free vortex wake method is compared to a state-of-the-art BEM formulation with regard to aerodynamic and aeroelastic load simulations of the 5MW UpWind reference wind turbine. Different aerodynamic load situations as well as standardised design load cases that are sensitive to the aeroelastic modelling are evaluated in detail. This benchmark makes use of the AeroModule developed by ECN, which has been coupled to the multibody simulation code SIMPACK.
Simulation Modeling of Resilience Assessment in Indonesian Fertiliser Industry Supply Networks
NASA Astrophysics Data System (ADS)
Utami, I. D.; Holt, R. J.; McKay, A.
2018-01-01
Supply network resilience is a significant aspect in the performance of the Indonesian fertiliser industry. Decision makers use risk assessment and port management reports to evaluate the availability of infrastructure. An opportunity was identified to incorporate both types of data into an approach for the measurement of resilience. A framework, based on a synthesis of literature and interviews with industry practitioners, covering both social and technical factors is introduced. A simulation model was then built to allow managers to explore implications for resilience and predict levels of risk in different scenarios. Results of interviews with respondents from the Indonesian fertiliser industry indicated that the simulation model could be valuable in the assessment. This paper provides details of the simulation model for decision makers to explore levels of risk in supply networks. For practitioners, the model could be used by government to assess the current condition of supply networks in Indonesian industries. For academia, the approach provides a new application of agent-based models in research on supply network resilience and presents a real example of how agent-based modeling could be used to support the assessment approach.
GRILLIX: a 3D turbulence code based on the flux-coordinate independent approach
NASA Astrophysics Data System (ADS)
Stegmeir, Andreas; Coster, David; Ross, Alexander; Maj, Omar; Lackner, Karl; Poli, Emanuele
2018-03-01
The GRILLIX code is presented, with which plasma turbulence/transport in various geometries can be simulated in 3D. The distinguishing feature of the code is that it is based on the flux-coordinate independent approach (FCI) (Hariri and Ottaviani 2013 Comput. Phys. Commun. 184 2419; Stegmeir et al 2016 Comput. Phys. Commun. 198 139). Cylindrical or Cartesian grids are used, on which perpendicular operators are discretised via standard finite difference methods and parallel operators via a field line tracing and interpolation procedure (field line map). This offers very high flexibility with respect to geometry; in particular, a separatrix with X-point(s) or a magnetic axis can be treated easily, in contrast to approaches based on field-aligned coordinates, which suffer from coordinate singularities. Aiming ultimately at the simulation of edge and scrape-off layer (SOL) turbulence, an isothermal electrostatic drift-reduced Braginskii model (Zeiler et al 1997 Phys. Plasmas 4 2134) has been implemented in GRILLIX. We present the numerical approach, which is based on a toroidally staggered formulation of the FCI, verify the code with the method of manufactured solutions, and show a benchmark based on a TORPEX blob experiment previously performed by several edge/SOL codes (Riva et al 2016 Plasma Phys. Control. Fusion 58 044005). Examples for slab, circular, limiter and diverted geometry are presented. Finally, the results show that the FCI approach in general, and GRILLIX in particular, are viable approaches for tackling the simulation of edge/SOL turbulence in diverted geometry.
An Example-Based Brain MRI Simulation Framework.
He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L
2015-02-21
The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms such as image segmentation, due to the lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition, these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from the hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of the MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on the statistical model of training data. Results show that the example-based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more than simulations produced by a physics-based model.
Biochemical simulations: stochastic, approximate stochastic and hybrid approaches.
Pahle, Jürgen
2009-01-01
Computer simulations have become an invaluable tool to study the sometimes counterintuitive temporal dynamics of (bio-)chemical systems. In particular, stochastic simulation methods have attracted increasing interest recently. In contrast to the well-known deterministic approach based on ordinary differential equations, they can capture effects that occur due to the underlying discreteness of the systems and random fluctuations in molecular numbers. Numerous stochastic, approximate stochastic and hybrid simulation methods have been proposed in the literature. In this article, they are systematically reviewed in order to guide the researcher and help her find the appropriate method for a specific problem.
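The exact stochastic approach that such reviews contrast with ODE models is typically Gillespie's direct method. A minimal self-contained sketch, applied to a hypothetical birth-death system (an illustrative example, not code from the review):

```python
import math
import random

def gillespie_direct(x0, rates, stoich, propensity, t_end, seed=0):
    """Gillespie's direct method (exact SSA) for a well-mixed system.

    stoich[j] is the state change of reaction j; propensity(j, x, rates)
    returns a_j(x).  This is a generic sketch of the algorithm.
    """
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    trajectory = [(t, tuple(x))]
    while t < t_end:
        a = [propensity(j, x, rates) for j in range(len(stoich))]
        a0 = sum(a)
        if a0 == 0.0:
            break  # no reaction can fire
        t += -math.log(1.0 - rng.random()) / a0  # exponential waiting time
        if t > t_end:
            break
        r, acc = rng.random() * a0, 0.0
        for j, aj in enumerate(a):  # pick reaction j with probability a_j / a0
            acc += aj
            if acc >= r:
                break
        for i, d in enumerate(stoich[j]):
            x[i] += d
        trajectory.append((t, tuple(x)))
    return trajectory

# Hypothetical birth-death example: 0 -> X at rate k1, X -> 0 at rate k2*X
stoich = [(+1,), (-1,)]
prop = lambda j, x, k: k[0] if j == 0 else k[1] * x[0]
traj = gillespie_direct((0,), (10.0, 1.0), stoich, prop, t_end=5.0)
```

The approximate and hybrid methods surveyed in the article trade the one-reaction-per-step exactness of this loop for larger time steps.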
On simulations of rarefied vapor flows with condensation
NASA Astrophysics Data System (ADS)
Bykov, Nikolay; Gorbachev, Yuriy; Fyodorov, Stanislav
2018-05-01
Results of the direct simulation Monte Carlo of 1D spherical and 2D axisymmetric expansions into vacuum of condensing water vapor are presented. Two models based on the kinetic approach and the size-corrected classical nucleation theory are employed for simulations. The difference in obtained results is discussed and advantages of the kinetic approach in comparison with the modified classical theory are demonstrated. The impact of clusterization on flow parameters is observed when the volume fraction of clusters in the expansion region exceeds 5%. Comparison of the simulation data with the experimental results demonstrates good agreement.
NASA Astrophysics Data System (ADS)
Fedulov, Boris N.; Safonov, Alexander A.; Sergeichev, Ivan V.; Ushakov, Andrey E.; Klenin, Yuri G.; Makarenko, Irina V.
2016-10-01
Using composites in the construction of subway brackets is a very effective way to extend their lifetime. However, this approach requires preventing process-induced distortions of the bracket due to thermal deformation and chemical shrinkage. In the present study, a process simulation was carried out to support the design of the production tooling. The simulation was based on the application of a viscoelastic model for the resin. Simulation results were verified by comparison with the results of manufacturing experiments. A strength analysis was also carried out to optimize the bracket structure.
2011-12-01
"Task Based Approach to Planning." Paper 08F-SIW-033. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organization. … Paper 06F-SIW-003. In Proceedings of the Fall Simulation Interoperability Workshop. Simulation Interoperability Standards Organization. … "…(MSDL)." Paper 10S-SIW-003. In Proceedings of the Spring Simulation Interoperability Workshop. Simulation Interoperability Standards Organization.
Simulation-based artifact correction (SBAC) for metrological computed tomography
NASA Astrophysics Data System (ADS)
Maier, Joscha; Leinweber, Carsten; Sawall, Stefan; Stoschus, Henning; Ballach, Frederic; Müller, Tobias; Hammer, Michael; Christoph, Ralf; Kachelrieß, Marc
2017-06-01
Computed tomography (CT) is a valuable tool for the metrological assessment of industrial components. However, the application of CT to the investigation of highly attenuating objects or multi-material components is often restricted by the presence of CT artifacts caused by beam hardening, x-ray scatter, off-focal radiation, partial volume effects, or the cone-beam reconstruction itself. In order to overcome this limitation, this paper proposes an approach to calculate a correction term that compensates for the contribution of artifacts and thus enables an appropriate assessment of these components using CT. To do so, we make use of computer simulations of the CT measurement process. Based on an appropriate model of the object, e.g. an initial reconstruction or a CAD model, two simulations are carried out. One simulation considers all physical effects that cause artifacts, using dedicated analytic methods as well as Monte Carlo-based models. The other represents an ideal CT measurement, i.e. a measurement in parallel beam geometry with a monochromatic, point-like x-ray source and no x-ray scattering. The difference between these simulations is thus an estimate of the present artifacts and can be used to correct the acquired projection data or the corresponding CT reconstruction, respectively. The performance of the proposed approach is evaluated using simulated as well as measured data of single and multi-material components. Our approach yields CT reconstructions that are nearly free of artifacts and thereby clearly outperforms commonly used artifact reduction algorithms in terms of image quality. A comparison against tactile reference measurements demonstrates the ability of the proposed approach to increase the accuracy of the metrological assessment significantly.
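The correction principle described in the abstract reduces to a simple difference of two simulations subtracted from the measurement. A toy sketch with hypothetical 1D projection values (illustrative only, not the paper's projection or physics model):

```python
def sbac_correct(measured, sim_with_artifacts, sim_ideal):
    """Simulation-based artifact correction (sketch): the difference
    between an artifact-affected simulation and an ideal simulation of
    the same object model estimates the artifact contribution, which is
    then subtracted from the measured projection data."""
    artifact_estimate = [a - i for a, i in zip(sim_with_artifacts, sim_ideal)]
    return [m - e for m, e in zip(measured, artifact_estimate)]

# Toy 1D "projection": ideal values plus a smooth beam-hardening-like bias
ideal = [1.0, 1.2, 1.4, 1.6, 1.8, 2.0]
bias = [0.00, 0.01, 0.04, 0.09, 0.16, 0.25]
measured = [v + b for v, b in zip(ideal, bias)]
# if the artifact simulation reproduces the bias, the correction recovers the ideal data
corrected = sbac_correct(measured, [v + b for v, b in zip(ideal, bias)], ideal)
```

In practice the two simulations come from full analytic and Monte Carlo forward models, but the correction step itself is this subtraction.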
TU-A-17A-02: In Memoriam of Ben Galkin: Virtual Tools for Validation of X-Ray Breast Imaging Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myers, K; Bakic, P; Abbey, C
2014-06-15
This symposium will explore simulation methods for the preclinical evaluation of novel 3D and 4D x-ray breast imaging systems, the subject of AAPM task group TG234. Given the complex design of modern imaging systems, simulations offer significant advantages over long and costly clinical studies in terms of reproducibility, reduced radiation exposures, a known reference standard, and the capability for studying patient and disease subpopulations through appropriate choice of simulation parameters. Our focus will be on testing the realism of software anthropomorphic phantoms and virtual clinical trials tools developed for the optimization and validation of breast imaging systems. The symposium will review the state-of-the-science, as well as the advantages and limitations of various approaches to testing realism of phantoms and simulated breast images. Approaches based upon the visual assessment of synthetic breast images by expert observers will be contrasted with approaches based upon comparing statistical properties between synthetic and clinical images. The role of observer models in the assessment of realism will be considered. Finally, an industry perspective will be presented, summarizing the role and importance of virtual tools and simulation methods in product development. The challenges and conditions that must be satisfied in order for computational modeling and simulation to play a significantly increased role in the design and evaluation of novel breast imaging systems will be addressed. Learning Objectives: Review the state-of-the-science in testing realism of software anthropomorphic phantoms and virtual clinical trials tools; Compare approaches based upon the visual assessment by expert observers vs. the analysis of statistical properties of synthetic images; Discuss the role of observer models in the assessment of realism; Summarize the industry perspective on virtual methods for breast imaging.
Advanced EUV mask and imaging modeling
NASA Astrophysics Data System (ADS)
Evanschitzky, Peter; Erdmann, Andreas
2017-10-01
The exploration and optimization of image formation in partially coherent EUV projection systems with complex source shapes requires flexible, accurate, and efficient simulation models. This paper reviews advanced mask diffraction and imaging models for the highly accurate and fast simulation of EUV lithography systems, addressing important aspects of the current technical developments. The simulation of light diffraction from the mask employs an extended rigorous coupled wave analysis (RCWA) approach, which is optimized for EUV applications. In order to be able to deal with current EUV simulation requirements, several additional models are included in the extended RCWA approach: a field decomposition and a field stitching technique enable the simulation of larger complex structured mask areas. An EUV multilayer defect model including a database approach makes the fast and fully rigorous defect simulation and defect repair simulation possible. A hybrid mask simulation approach combining real and ideal mask parts allows the detailed investigation of the origin of different mask 3-D effects. The image computation is done with a fully vectorial Abbe-based approach. Arbitrary illumination and polarization schemes and adapted rigorous mask simulations guarantee a high accuracy. A fully vectorial sampling-free description of the pupil with Zernikes and Jones pupils and an optimized representation of the diffraction spectrum enable the computation of high-resolution images with high accuracy and short simulation times. A new pellicle model supports the simulation of arbitrary membrane stacks, pellicle distortions, and particles/defects on top of the pellicle. Finally, an extension for highly accurate anamorphic imaging simulations is included. The application of the models is demonstrated by typical use cases.
NASA Astrophysics Data System (ADS)
Sun, Xiaohang; Lee, Hoon Joo; Michielsen, Stephen; Wilusz, Eugene
2018-05-01
Although profiles of axisymmetric capillary bridges between two cylindrical fibers have been extensively studied, little research has been reported on capillary bridges under external forces such as gravity. This is because external forces add significant complications to the Laplace-Young equation, making it difficult to predict drop profiles analytically. In this paper, capillary bridges between two vertically stacked cylindrical fibers are simulated with gravitational effects taken into consideration. The asymmetrical structure of capillary bridges, which is hard to predict analytically, was studied via a numerical approach based on Surface Evolver (SE). The axial and circumferential spreading of liquids on two identical fibers in the presence of gravity is predicted to determine when gravitational effects are significant and when they can be neglected. The effects of liquid volume, equilibrium contact angle, distance between the two fibers, and fiber radii were also examined. The simulation results were verified by comparison with experimental measurements. Based on SE simulations, curves representing the spreading of capillary bridges along the two cylindrical fibers were obtained, and the gravitational effect was scaled based on the difference in spreading on the upper and lower fibers.
ERIC Educational Resources Information Center
Huet, Michael; Jacobs, David M.; Camachon, Cyril; Missenard, Olivier; Gray, Rob; Montagne, Gilles
2011-01-01
The present study reports two experiments in which a total of 20 participants without prior flight experience practiced the final approach phase in a fixed-base simulator. All participants received self-controlled concurrent feedback during 180 practice trials. Experiment 1 shows that participants learn more quickly under variable practice…
Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability
NASA Astrophysics Data System (ADS)
Thanh, Vo Hong; Priami, Corrado; Zunino, Roberto
2016-06-01
Stochastic simulation of large biochemical reaction networks is often computationally expensive due to disparate reaction rates and high variability in the populations of chemical species. One approach to accelerating the simulation is to allow multiple reaction firings before performing an update, by assuming that reaction propensities change by a negligible amount during a time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and accuracy of this simulation approach, and the problem is worse when these small-population species participate in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds and a rejection mechanism to select the next reaction firings. Each reaction is guaranteed to be selected to fire with an acceptance rate greater than a predefined probability; the selection becomes exact if that probability is set to one. Our new algorithm reduces the computational cost of selecting the next reaction firing and of updating reaction propensities.
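The propensity-bound selection underlying the rejection-based SSA can be sketched as follows; the bounds, the toy system, and the demo values are illustrative assumptions, not the paper's implementation:

```python
import random

def rssa_select(x, lower, upper, propensity, rng):
    """Rejection-based selection of the next reaction (sketch).

    lower[j] <= a_j(x) <= upper[j] are precomputed propensity bounds;
    the exact propensity is evaluated only when the cheap bound test
    is inconclusive, which is the source of the speedup.
    """
    b0 = sum(upper)
    while True:
        # candidate reaction chosen proportionally to its upper bound
        r, acc = rng.random() * b0, 0.0
        for j, bj in enumerate(upper):
            acc += bj
            if acc >= r:
                break
        u = rng.random() * upper[j]
        if u <= lower[j]:
            return j  # accepted without evaluating a_j(x)
        if u <= propensity(j, x):
            return j  # accepted after exact evaluation
        # otherwise rejected: redraw a candidate

# demo: with tight bounds (lower == upper == a_j) the selection is exact
rng = random.Random(42)
props = [1.0, 3.0]
picks = [rssa_select(None, props, props, lambda j, x: props[j], rng)
         for _ in range(10000)]
```

The paper's contribution is to loosen this scheme so that acceptance is only guaranteed above a predefined probability, further reducing exact propensity evaluations.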
NASA Astrophysics Data System (ADS)
Dong, Gangqi; Zhu, Z. H.
2016-04-01
This paper proposes a new incremental inverse-kinematics-based visual servoing approach for robotic manipulators to capture a non-cooperative target autonomously. The target's pose and motion are estimated by a vision system using integrated photogrammetry and an extended Kalman filter (EKF). Based on the estimated pose and motion of the target, the instantaneous desired position of the end-effector is predicted by inverse kinematics, and the robotic manipulator is moved incrementally from its current configuration subject to joint speed limits. This approach effectively eliminates the multiple-solution problem of inverse kinematics and increases the robustness of the control algorithm. The proposed approach is validated by a hardware-in-the-loop simulation, in which the pose and motion of the non-cooperative target are estimated by a real vision system. The simulation results demonstrate the effectiveness and robustness of the proposed estimation approach for the target and of the incremental control strategy for the robotic manipulator.
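The incremental idea, moving the joints a bounded step toward the instantaneous desired end-effector position rather than solving the full IK each cycle, can be sketched for a planar 2-link arm. The Jacobian-transpose update, link lengths, and gains here are illustrative assumptions; the paper's manipulator and update law are not specified in the abstract:

```python
import math

def incremental_ik_step(theta, target, l1=1.0, l2=1.0, gain=0.1, max_dtheta=0.05):
    """One incremental IK step for a planar 2-link arm (sketch).
    Each joint increment is clipped to a speed limit, which keeps the
    motion on one branch of the multiple IK solutions."""
    t1, t2 = theta
    # forward kinematics
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    ex, ey = target[0] - x, target[1] - y
    # Jacobian rows: d(x, y)/d(t1, t2)
    J = [[-l1 * math.sin(t1) - l2 * math.sin(t1 + t2), -l2 * math.sin(t1 + t2)],
         [ l1 * math.cos(t1) + l2 * math.cos(t1 + t2),  l2 * math.cos(t1 + t2)]]
    d1 = gain * (J[0][0] * ex + J[1][0] * ey)  # J^T e, joint 1
    d2 = gain * (J[0][1] * ex + J[1][1] * ey)  # J^T e, joint 2
    clip = lambda d: max(-max_dtheta, min(max_dtheta, d))
    return (t1 + clip(d1), t2 + clip(d2))

# drive the arm incrementally toward a fixed, reachable target
theta, target = (0.3, 0.5), (1.2, 0.8)
for _ in range(500):
    theta = incremental_ik_step(theta, target)
```

In the paper, the target position itself is re-predicted each cycle from the vision estimate, so the step above would track a moving set point.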
A Stigmergy Approach for Open Source Software Developer Community Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Xiaohui; Beaver, Justin M; Potok, Thomas E
2009-01-01
The stigmergy collaboration approach provides a hypothesized explanation of how online groups work together. In this research, we present a stigmergy approach for building an agent-based open source software (OSS) developer community collaboration simulation. We used groups of actors who collaborate on OSS projects as our frame of reference and investigated how the choices actors make in contributing their work to the projects determine the global status of the whole OSS project. In our simulation, forum posts and project code serve as the digital pheromone, and a modified Pierre-Paul Grasse pheromone model is used for computing the developer agents' behavior selection probability.
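The pheromone mechanism can be sketched generically: agents pick projects with probability proportional to accumulated pheromone, which decays over time. This is a standard ant-colony-style weighting assumed for illustration; the paper's modified Grasse model is not fully specified in the abstract:

```python
import random

def choose_project(pheromone, alpha=1.0, rng=random):
    """Pick a project with probability proportional to its digital
    pheromone level (forum posts + code contributions) raised to alpha."""
    weights = [p ** alpha for p in pheromone.values()]
    total = sum(weights)
    r, acc = rng.random() * total, 0.0
    for name, w in zip(pheromone, weights):
        acc += w
        if acc >= r:
            return name
    return name

def evaporate(pheromone, rho=0.1):
    """Pheromone decay between simulation steps, so stale projects
    gradually lose their attractiveness."""
    return {k: (1.0 - rho) * v for k, v in pheromone.items()}

# a project with all the pheromone attracts all the agents
pick = choose_project({"projA": 10.0, "projB": 0.0})
```

The positive feedback (contributions deposit pheromone, pheromone attracts contributions) is what lets global project status emerge from individual choices.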
Understanding the kinetic mechanism of RNA single base pair formation
Xu, Xiaojun; Yu, Tao; Chen, Shi-Jie
2016-01-01
RNA functions are intrinsically tied to folding kinetics. The most elementary step in RNA folding is the closing and opening of a base pair. Understanding this elementary rate process is the basis for RNA folding kinetics studies. Previous studies mostly focused on the unfolding of base pairs. Here, based on a hybrid approach, we investigate the folding process at the level of single base pairing/stacking. The study, which integrates molecular dynamics simulation, kinetic Monte Carlo simulation, and master equation methods, uncovers two alternative dominant pathways: Starting from the unfolded state, the nucleotide backbone first folds to the native conformation, followed by adjustment of the base conformation. During the base conformational rearrangement, the backbone either retains the native conformation or switches to nonnative conformations in order to lower the kinetic barrier for base rearrangement. The method enables quantification of kinetic partitioning among the different pathways. Moreover, the simulation reveals several intriguing ion binding/dissociation signatures for the conformational changes. Our approach may be useful for developing a base pair opening/closing rate model. PMID:26699466
A method for the computational modeling of the physics of heart murmurs
NASA Astrophysics Data System (ADS)
Seo, Jung Hee; Bakhshaee, Hani; Garreau, Guillaume; Zhu, Chi; Andreou, Andreas; Thompson, William R.; Mittal, Rajat
2017-05-01
A computational method for direct simulation of the generation and propagation of blood flow induced sounds is proposed. This computational hemoacoustic method is based on the immersed boundary approach and employs high-order finite difference methods to resolve wave propagation and scattering accurately. The current method employs a two-step, one-way coupled approach for the sound generation and its propagation through the tissue. The blood flow is simulated by solving the incompressible Navier-Stokes equations using the sharp-interface immersed boundary method, and the equations governing the generation and propagation of the three-dimensional elastic waves associated with the murmur are resolved with a high-order, immersed-boundary-based finite-difference method in the time domain. The proposed method is applied to a model problem of aortic stenosis murmur, and the simulation results are verified and validated by comparison with known solutions as well as experimental measurements. The murmur propagation in a realistic model of a human thorax is also simulated using the computational method. The roles of hemodynamics and elastic wave propagation in the murmur are discussed based on the simulation results.
Using cognitive task analysis to develop simulation-based training for medical tasks.
Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette
2013-10-01
Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks: cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
Fault displacement hazard assessment for nuclear installations based on IAEA safety standards
NASA Astrophysics Data System (ADS)
Fukushima, Y.
2016-12-01
In the IAEA Safety Requirements NS-R-3, surface fault displacement hazard assessment (FDHA) is required for the siting of nuclear installations. If any capable faults exist at the candidate site, the IAEA recommends the consideration of alternative sites. However, owing to progress in palaeoseismological investigations, capable faults may be found at an existing site. In such a case, the IAEA recommends evaluating safety using probabilistic FDHA (PFDHA), an empirical approach based on a still quite limited database. A basic and crucial improvement is therefore to enlarge the database. In 2015, the IAEA produced TecDoc-1767 on palaeoseismology as a reference for the identification of capable faults. Another document, IAEA Safety Report 85 on ground motion simulation based on fault rupture modelling, provides an annex introducing recent PFDHAs and fault displacement simulation methodologies. The IAEA expanded the FDHA project to cover both the probabilistic approach and physics-based fault rupture modelling. The first approach needs a refinement of the empirical methods by building a worldwide database, and the second needs to shift from a kinematic to a dynamic scheme. The two approaches can complement each other, since simulated displacements can fill the gaps in a sparse database and geological observations can be used to calibrate the simulations. The IAEA supported a workshop in October 2015 to discuss the existing databases with the aim of creating a common worldwide database, and a consensus on a unified database was reached. The next milestone is to fill the database with as many fault rupture data sets as possible. Another IAEA working group held a workshop in November 2015 to discuss state-of-the-art PFDHA as well as simulation methodologies. The two groups held a joint consultancy meeting in February 2016, shared information, identified issues, discussed goals and outputs, and scheduled future meetings. We may now aim at coordinating activities for the whole set of FDHA tasks jointly.
Improving the performance of surgery-based clinical pathways: a simulation-optimization approach.
Ozcan, Yasar A; Tànfani, Elena; Testi, Angela
2017-03-01
This paper aims to improve the performance of clinical processes using clinical pathways (CPs). The specific goal of this research is to develop a decision support tool, based on a simulation-optimization approach, that identifies the proper adjustment and alignment of resources to achieve better performance for both patients and the health-care facility. When multiple perspectives are present in a decision problem, critical issues arise and often require the balancing of goals. In our approach, we assess the level of resources required to meet patients' clinical needs in a timely manner and to avoid the worsening of clinical conditions. The simulation-optimization model seeks and evaluates alternative resource configurations aimed at balancing the two main objectives: meeting patient needs and optimally utilizing beds and operating rooms. Using primary data collected at the Department of Surgery of a public hospital in Genoa, Italy, the simulation-optimization modelling approach was applied to evaluate thyroid surgical treatment together with the other surgery-based CPs. The low rate of bed utilization and the long elective waiting lists of the specialty under study indicate that the wards were oversized while operating room capacity was the bottleneck of the system. The model enables hospital managers to determine which objective has to be given priority, as well as the corresponding opportunity costs.
Lin, Yu-Pin; Chu, Hone-Jay; Huang, Yu-Long; Tang, Chia-Hsi; Rouhani, Shahrokh
2011-06-01
This study develops a stratified conditional Latin hypercube sampling (scLHS) approach for multiple remotely sensed normalized difference vegetation index (NDVI) images. The objective is to sample, monitor, and delineate spatiotemporal landscape changes, including spatial heterogeneity and variability, in a given area. The scLHS approach, which is based on the variance quadtree technique (VQT) and the conditional Latin hypercube sampling (cLHS) method, selects samples in order to delineate landscape changes from multiple NDVI images. The images are then mapped for calibration and validation by using sequential Gaussian simulation (SGS) with the scLHS-selected samples. Spatial statistical results indicate that, in terms of statistical distribution, spatial distribution, and spatial variation, the statistics and variograms of the scLHS samples resemble those of multiple NDVI images more closely than do those of cLHS and VQT samples. Moreover, the accuracy of simulated NDVI images based on SGS with scLHS samples is significantly better than that of simulated NDVI images based on SGS with cLHS samples or VQT samples. Overall, the proposed approach efficiently monitors the spatial characteristics of landscape changes, including the statistics, spatial variability, and heterogeneity of NDVI images. In addition, SGS with the scLHS samples effectively reproduces spatial patterns and landscape changes in multiple NDVI images.
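The Latin hypercube principle underlying cLHS and scLHS can be sketched in its plain form: every 1/n marginal stratum of every variable is sampled exactly once. This is ordinary LHS for illustration; the paper's scLHS additionally conditions on the NDVI data and uses variance-quadtree stratification, which is not reproduced here:

```python
import random

def latin_hypercube(n, d, rng=None):
    """Plain Latin hypercube sample: n points in the unit cube in d
    dimensions, with one point in each of the n equal-width strata of
    every coordinate."""
    rng = rng or random.Random(0)
    perms = [rng.sample(range(n), n) for _ in range(d)]  # one permutation per dimension
    return [[(perms[k][i] + rng.random()) / n for k in range(d)]
            for i in range(n)]

pts = latin_hypercube(10, 2)
```

The stratified conditional variant replaces the uniform strata with data-driven ones so that samples track the spatial variability of the images.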
NASA Astrophysics Data System (ADS)
Jezzine, Karim; Imperiale, Alexandre; Demaldent, Edouard; Le Bourdais, Florian; Calmon, Pierre; Dominguez, Nicolas
2018-04-01
Models for the simulation of ultrasonic inspections of flat and curved plate-like composite structures, as well as stiffeners, are available in the CIVA-COMPOSITE module released in 2016. A first modelling approach using a ray-based model is able to predict the ultrasonic propagation in an anisotropic effective medium obtained after having homogenized the composite laminate. Fast 3D computations can be performed on configurations featuring delaminations, flat bottom holes or inclusions for example. In addition, computations on ply waviness using this model will be available in CIVA 2017. Another approach is proposed in the CIVA-COMPOSITE module. It is based on the coupling of CIVA ray-based model and a finite difference scheme in time domain (FDTD) developed by AIRBUS. The ray model handles the ultrasonic propagation between the transducer and the FDTD computation zone that surrounds the composite part. In this way, the computational efficiency is preserved and the ultrasound scattering by the composite structure can be predicted. Alternatively, a high order finite element approach is currently developed at CEA but not yet integrated in CIVA. The advantages of this approach will be discussed and first simulation results on Carbon Fiber Reinforced Polymers (CFRP) will be shown. Finally, the application of these modelling tools to the construction of metamodels is discussed.
Mechanics Model for Simulating RC Hinges under Reversed Cyclic Loading
Shukri, Ahmad Azim; Visintin, Phillip; Oehlers, Deric J.; Jumaat, Mohd Zamin
2016-01-01
Describing the moment rotation (M/θ) behavior of reinforced concrete (RC) hinges is essential in predicting the behavior of RC structures under severe loadings, such as under cyclic earthquake motions and blast loading. The behavior of RC hinges is defined by localized slip or partial interaction (PI) behaviors in both the tension and compression region. In the tension region, slip between the reinforcement and the concrete defines crack spacing, crack opening and closing, and tension stiffening. While in the compression region, slip along concrete to concrete interfaces defines the formation and failure of concrete softening wedges. Being strain-based, commonly-applied analysis techniques, such as the moment curvature approach, cannot directly simulate these PI behaviors because they are localized and displacement based. Therefore, strain-based approaches must resort to empirical factors to define behaviors, such as tension stiffening and concrete softening hinge lengths. In this paper, a displacement-based segmental moment rotation approach, which directly simulates the partial interaction behaviors in both compression and tension, is developed for predicting the M/θ response of an RC beam hinge under cyclic loading. Significantly, in order to develop the segmental approach, a partial interaction model to predict the tension stiffening load slip relationship between the reinforcement and the concrete is developed. PMID:28773430
On the use of reverse Brownian motion to accelerate hybrid simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bakarji, Joseph; Tartakovsky, Daniel M., E-mail: tartakovsky@stanford.edu
Multiscale and multiphysics simulations are two rapidly developing fields of scientific computing. Efficient coupling of continuum (deterministic or stochastic) constitutive solvers with their discrete (stochastic, particle-based) counterparts is a common challenge in both kinds of simulations. We focus on interfacial, tightly coupled simulations of diffusion that combine continuum and particle-based solvers. The latter employs the reverse Brownian motion (rBm), a Monte Carlo approach that allows one to enforce inhomogeneous Dirichlet, Neumann, or Robin boundary conditions and is trivially parallelizable. We discuss numerical approaches for improving the accuracy of rBm in the presence of inhomogeneous Neumann boundary conditions and alternative strategies for coupling the rBm solver with its continuum counterpart. Numerical experiments are used to investigate the convergence, stability, and computational efficiency of the proposed hybrid algorithm.
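The reverse Brownian motion solver described above is at heart a Monte Carlo method whose random walks carry Dirichlet boundary data back to interior points. A minimal sketch of that idea, a plain walk-on-grid estimator for the 1-D Laplace equation, is given below; it is not the paper's rBm algorithm, and all names and parameters are illustrative:

```python
import random

def walk_on_grid(start, n, left_val, right_val, n_walks=20000, seed=1):
    """Estimate the solution of the 1-D Laplace equation u'' = 0 on a grid
    of n interior nodes with Dirichlet values left_val and right_val, by
    averaging the boundary value reached by unbiased random walks started
    at interior node `start` (nodes are numbered 0..n+1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        pos = start
        while 0 < pos < n + 1:            # walk until a boundary is hit
            pos += 1 if rng.random() < 0.5 else -1
        total += left_val if pos == 0 else right_val
    return total / n_walks

# For u'' = 0, u(0) = 0, u(1) = 1 the exact solution is u(x) = x,
# so the estimate at the midpoint should be close to 0.5.
u_mid = walk_on_grid(start=5, n=9, left_val=0.0, right_val=1.0)
```

Each walk is independent, which is what makes the method "trivially parallelizable" as the abstract notes.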
New approach to wireless data communication in a propagation environment
NASA Astrophysics Data System (ADS)
Hunek, Wojciech P.; Majewski, Paweł
2017-10-01
This paper presents a new idea of perfect signal reconstruction in multivariable wireless communication systems with different numbers of transmitting and receiving antennas. The proposed approach is based on the polynomial matrix S-inverse associated with Smith factorization. Crucially, the above-mentioned inverse implements so-called degrees of freedom. A simulation study has confirmed that these degrees of freedom minimize the negative impact of the propagation environment, increasing the robustness of the whole signal reconstruction process. The parasitic drawbacks in the form of dynamic ISI and ICI effects can now be eliminated in a framework described by polynomial calculus. The new method therefore not only reduces cost but, more importantly, potentially yields systems with lower energy consumption than classical ones. To show the potential of the new approach, simulation studies were performed with the authors' simulator based on the well-known OFDM technique.
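The abstract's claim that extra antennas supply "degrees of freedom" for robust reconstruction can be illustrated without the polynomial S-inverse itself: in a noise-free flat-fading toy channel, a least-squares pseudo-inverse already recovers the transmitted symbols exactly when there are more receive than transmit antennas. A hedged sketch follows, in which numpy's Moore-Penrose pseudo-inverse stands in for the Smith-factorization-based inverse and the channel matrix is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy flat-fading channel: 4 receive antennas, 2 transmit antennas.
H = rng.standard_normal((4, 2))
x = np.array([1.0, -1.0])          # transmitted symbols
y = H @ x                          # received (noise-free) signal

# Least-squares reconstruction via the Moore-Penrose pseudo-inverse.
# The surplus receive antennas provide the redundancy that makes the
# recovery exact in the noise-free case.
x_hat = np.linalg.pinv(H) @ y
```

In a real OFDM system the channel is frequency-selective (a polynomial matrix in the delay operator), which is why the paper works with Smith factorization rather than a constant-matrix inverse.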
NASA Technical Reports Server (NTRS)
Ranatunga, Vipul; Bednarcyk, Brett A.; Arnold, Steven M.
2010-01-01
A method for performing progressive damage modeling in composite materials and structures based on continuum level interfacial displacement discontinuities is presented. The proposed method enables the exponential evolution of the interfacial compliance, resulting in unloading of the tractions at the interface after delamination or failure occurs. In this paper, the proposed continuum displacement discontinuity model has been used to simulate failure within both isotropic and orthotropic materials efficiently and to explore the possibility of predicting the crack path, therein. Simulation results obtained from Mode-I and Mode-II fracture compare the proposed approach with the cohesive element approach and Virtual Crack Closure Techniques (VCCT) available within the ABAQUS (ABAQUS, Inc.) finite element software. Furthermore, an eccentrically loaded 3-point bend test has been simulated with the displacement discontinuity model, and the resulting crack path prediction has been compared with a prediction based on the extended finite element model (XFEM) approach.
Simulation of Smart Home Activity Datasets
Synnott, Jonathan; Nugent, Chris; Jeffers, Paul
2015-01-01
A globally ageing population is resulting in an increased prevalence of chronic conditions which affect older adults. Such conditions require long-term care and management to maximize quality of life, placing an increasing strain on healthcare resources. Intelligent environments such as smart homes facilitate long-term monitoring of activities in the home through the use of sensor technology. Access to sensor datasets is necessary for the development of novel activity monitoring and recognition approaches. Access to such datasets is limited due to issues such as sensor cost, availability and deployment time. The use of simulated environments and sensors may address these issues and facilitate the generation of comprehensive datasets. This paper provides a review of existing approaches for the generation of simulated smart home activity datasets, including model-based approaches and interactive approaches which implement virtual sensors, environments and avatars. The paper also provides recommendations for future work in intelligent environment simulation. PMID:26087371
NASA Astrophysics Data System (ADS)
Andrade, Xavier; Strubbe, David; De Giovannini, Umberto; Larsen, Ask Hjorth; Oliveira, Micael J. T.; Alberdi-Rodriguez, Joseba; Varas, Alejandro; Theophilou, Iris; Helbig, Nicole; Verstraete, Matthieu J.; Stella, Lorenzo; Nogueira, Fernando; Aspuru-Guzik, Alán; Castro, Alberto; Marques, Miguel A. L.; Rubio, Angel
Real-space grids are a powerful alternative for the simulation of electronic systems. One of the main advantages of the approach is the flexibility and simplicity of working directly in real space, where the different fields are discretized on a grid, combined with competitive numerical performance and great potential for parallelization. These properties constitute a great advantage when implementing and testing new physical models. Based on our experience with the Octopus code, in this article we discuss how the real-space approach has allowed for the recent development of new ideas for the simulation of electronic systems. Among these applications are approaches to calculate response properties, modeling of photoemission, optimal control of quantum systems, simulation of plasmonic systems, and the exact solution of the Schrödinger equation for low-dimensionality systems.
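The real-space grid approach described here discretizes fields on a uniform mesh, so the kinetic operator becomes a simple finite-difference matrix and eigenstates follow from standard dense diagonalization. A minimal sketch, a 1-D particle in a box rather than anything from the Octopus code, with an arbitrary grid size:

```python
import numpy as np

# Infinite square well of width L = 1 (hbar = m = 1), discretized on a
# uniform real-space grid; the kinetic operator -(1/2) d^2/dx^2 becomes
# a tridiagonal finite-difference matrix.
N, L = 500, 1.0
h = L / (N + 1)                       # grid spacing
main = np.full(N, 1.0 / h**2)         # diagonal of the kinetic matrix
off = np.full(N - 1, -0.5 / h**2)     # off-diagonals
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E = np.linalg.eigvalsh(H)             # all eigen-energies on the grid
E0 = E[0]                             # ground-state energy
exact = np.pi**2 / 2                  # analytic value, about 4.9348
```

The finite-difference error shrinks with the grid spacing squared, which is why refining the grid is the real-space analogue of enlarging a basis set.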
A Model-Based Joint Identification of Differentially Expressed Genes and Phenotype-Associated Genes
Seo, Minseok; Shin, Su-kyung; Kwon, Eun-Young; Kim, Sung-Eun; Bae, Yun-Jung; Lee, Seungyeoun; Sung, Mi-Kyung; Choi, Myung-Sook; Park, Taesung
2016-01-01
Over the last decade, many analytical methods and tools have been developed for microarray data. The detection of differentially expressed genes (DEGs) among different treatment groups is often a primary purpose of microarray data analysis. In addition, association studies investigating the relationship between genes and a phenotype of interest such as survival time are also popular in microarray data analysis. Phenotype association analysis provides a list of phenotype-associated genes (PAGs). However, it is sometimes necessary to identify genes that are both DEGs and PAGs. We consider the joint identification of DEGs and PAGs in microarray data analyses. The first approach we used was a naïve approach that detects DEGs and PAGs separately and then identifies the genes in the intersection of the lists of PAGs and DEGs. The second approach we considered was a hierarchical approach that detects DEGs first and then chooses PAGs from among the DEGs, or vice versa. In this study, we propose a new model-based approach for the joint identification of DEGs and PAGs. Unlike the previous two-step approaches, the proposed method identifies genes that are simultaneously DEGs and PAGs. This method uses standard regression models but adopts a different null hypothesis from ordinary regression models, which allows us to perform joint identification in one step. The proposed model-based methods were evaluated using experimental data and simulation studies. The proposed methods were used to analyze a microarray experiment in which the main interest lies in detecting genes that are both DEGs and PAGs, where DEGs are identified between two diet groups and PAGs are associated with four phenotypes reflecting the expression of leptin, adiponectin, insulin-like growth factor 1, and insulin. Model-based approaches provided a larger number of genes that are both DEGs and PAGs than other methods. Simulation studies showed that they have more power than other methods.
Through analysis of data from experimental microarrays and simulation studies, the proposed model-based approach was shown to provide a more powerful result than the naïve approach and the hierarchical approach. Since our approach is model-based, it is very flexible and can easily handle different types of covariates. PMID:26964035
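The one-step idea above can be illustrated with a standard joint F-test on a single gene: regress expression on both the group indicator and the phenotype and test both coefficients at once, rather than in two separate steps. The following is a generic sketch, an ordinary joint F-test rather than the authors' exact model, with simulated data and invented effect sizes:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 100
group = np.repeat([0.0, 1.0], n // 2)        # two diet groups
pheno = rng.standard_normal(n)               # a continuous phenotype
# A gene that is both a DEG (group effect) and a PAG (phenotype effect):
expr = 1.0 + 0.8 * group + 0.5 * pheno + rng.standard_normal(n)

# Full model: expression ~ intercept + group + phenotype.
X = np.column_stack([np.ones(n), group, pheno])
beta, resid, *_ = np.linalg.lstsq(X, expr, rcond=None)
rss = float(resid[0])
# Null model with BOTH effects removed (intercept only):
rss0 = float(np.sum((expr - expr.mean()) ** 2))

# Joint F-test of H0: no group effect AND no phenotype effect.
q, df = 2, n - 3
F = ((rss0 - rss) / q) / (rss / df)
p = stats.f.sf(F, q, df)                     # small p -> both DEG and PAG
```

Testing both coefficients jointly is what distinguishes this from the naïve intersect-two-lists approach, which runs two marginal tests and loses power.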
Physics-Based Stimulation for Night Vision Goggle Simulation
2006-11-01
A CRT display system can produce a darker black level than displays based on digital light processing (DLP) or liquid crystal technologies. ... The general form of the bucket equation for any gun (color) is given as Equation 3 (the equation itself is not recoverable from this extraction). ... To simulate the rendering approach, we began by testing the bucket rendering approach already utilized by SensorHost (Equation 10, likewise garbled).
Space construction base control system
NASA Technical Reports Server (NTRS)
Kaczynski, R. F.
1979-01-01
Several approaches for an attitude control system are studied and developed for a large space construction base that is structurally flexible. Digital simulations were obtained using the following techniques: (1) the multivariable Nyquist array method combined with closed-loop pole allocation, and (2) the linear quadratic regulator method. Equations for the three-axis simulation using the multilevel control method were generated and are presented. Several alternate control approaches are also described. A technique is demonstrated for obtaining the dynamic structural properties of a vehicle which is constructed of two or more submodules of known dynamic characteristics.
A Cloud-Based Simulation Architecture for Pandemic Influenza Simulation
Eriksson, Henrik; Raciti, Massimiliano; Basile, Maurizio; Cunsolo, Alessandro; Fröberg, Anders; Leifler, Ola; Ekberg, Joakim; Timpka, Toomas
2011-01-01
High-fidelity simulations of pandemic outbreaks are resource consuming. Cluster-based solutions have been suggested for executing such complex computations. We present a cloud-based simulation architecture that utilizes computing resources both locally available and dynamically rented online. The approach uses the Condor framework for job distribution and management of the Amazon Elastic Computing Cloud (EC2) as well as local resources. The architecture has a web-based user interface that allows users to monitor and control simulation execution. In a benchmark test, the best cost-adjusted performance was recorded for the EC2 H-CPU Medium instance, while a field trial showed that the job configuration had significant influence on the execution time and that the network capacity of the master node could become a bottleneck. We conclude that it is possible to develop a scalable simulation environment that uses cloud-based solutions, while providing an easy-to-use graphical user interface. PMID:22195089
Hotspot detection using image pattern recognition based on higher-order local auto-correlation
NASA Astrophysics Data System (ADS)
Maeda, Shimon; Matsunawa, Tetsuaki; Ogawa, Ryuji; Ichikawa, Hirotaka; Takahata, Kazuhiro; Miyairi, Masahiro; Kotani, Toshiya; Nojima, Shigeki; Tanaka, Satoshi; Nakagawa, Kei; Saito, Tamaki; Mimotogi, Shoji; Inoue, Soichi; Nosato, Hirokazu; Sakanashi, Hidenori; Kobayashi, Takumi; Murakawa, Masahiro; Higuchi, Tetsuya; Takahashi, Eiichi; Otsu, Nobuyuki
2011-04-01
Below the 40nm design node, systematic variation due to lithography must be taken into consideration during the early stage of design. So far, litho-aware design using lithography simulation models has been widely applied to assure that designs are printed on silicon without any error. However, the lithography simulation approach is very time consuming, and under time-to-market pressure, repetitive redesign using this approach may result in missing the market window. This paper proposes a fast hotspot-detection support method using flexible and intelligent image pattern recognition based on Higher-Order Local Autocorrelation. Our method learns the geometrical properties of defect-free design data as normal patterns, and automatically detects design patterns with hotspots in the test data as abnormal patterns. The Higher-Order Local Autocorrelation method can extract features from the graphic image of a design pattern, and the computational cost of the extraction is constant regardless of the number of design pattern polygons. This approach can reduce turnaround time (TAT) dramatically on a single CPU compared with the conventional simulation-based approach, and distributed processing has proven to deliver linear scalability with each additional CPU.
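Higher-Order Local Autocorrelation features are sums of products of shifted copies of an image, so their cost depends on the image size, not on the number of polygons, which is the property the abstract exploits. A simplified sketch of order-0 and order-1 HLAC features, using a reduced displacement set for illustration rather than the full standard mask family:

```python
import numpy as np

def hlac_features(img):
    """Simplified Higher-order Local Auto-Correlation features (orders 0-1)
    of a binary image: x(a) = sum over r of f(r) * f(r + a) for small
    displacements a. Runtime depends only on the image size."""
    img = img.astype(float)
    feats = [img.sum()]                      # order 0: displacement (0, 0)
    # Order 1: four unique displacements (the mirrored four are redundant
    # because the sum runs over all pixel positions).
    for dy, dx in [(0, 1), (1, 0), (1, 1), (1, -1)]:
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        feats.append(float((img * shifted).sum()))
    return np.array(feats)

# A 2x2 block of "on" pixels in an 8x8 image: 4 pixels, 2 horizontal
# pairs, 2 vertical pairs, and 1 pair along each diagonal.
img = np.zeros((8, 8), dtype=int)
img[2:4, 2:4] = 1
x = hlac_features(img)
```

Because the features are translation-invariant sums, a classifier trained on them sees the same signature wherever a layout motif occurs on the mask.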
NASA Technical Reports Server (NTRS)
Phatak, A. V.
1980-01-01
A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that pilot acceptance level or opinion rating of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach to landing task incorporating appropriate models for UH-1H aircraft, the environmental disturbances and the human pilot was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representation of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well known performance-work-load relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development and validation.
NASA Astrophysics Data System (ADS)
Kerschbaum, M.; Hopmann, C.
2016-06-01
The computationally efficient simulation of the progressive damage behaviour of continuous fibre reinforced plastics is still a challenging task with currently available computer aided engineering methods. This paper presents an original approach for an energy based continuum damage model which accounts for stress-/strain nonlinearities, transverse and shear stress interaction phenomena, quasi-plastic shear strain components, strain rate effects, regularised damage evolution and consideration of load reversal effects. The physically based modelling approach enables experimental determination of all parameters on ply level to avoid expensive inverse analysis procedures. The modelling strategy, implementation and verification of this model using commercially available explicit finite element software are detailed. The model is then applied to simulate the impact and penetration of carbon fibre reinforced cross-ply specimens with variation of the impact speed. The simulation results show that the presented approach enables a good representation of the force-/displacement curves and particularly good agreement with the experimentally observed fracture patterns. In addition, the mesh dependency of the results was assessed for one impact case, showing only very little change in the simulation results, which emphasises the general applicability of the presented method.
Docherty, Charles; Hoy, Derek; Topp, Helena; Trinder, Kathryn
2004-01-01
This paper details the results of the first phase of a project that used eLearning to support students' learning within a simulated environment. The locus was a purpose built Clinical Simulation Laboratory (CSL) where the School's newly adopted philosophy of Problem Based Learning (PBL) was challenged through lecturers reverting to traditional teaching methods. The solution, a student-centred, problem-based approach to the acquisition of clinical skills, was developed using learning objects embedded within web pages that substituted for lecturers providing instruction and demonstration. This allowed lecturers to retain their facilitator role, and encouraged students to explore, analyse and make decisions within the safety of a clinical simulation. Learning was enhanced through network communications and reflection on video performances of self and others. Evaluations were positive, students demonstrating increased satisfaction with PBL, improved performance in exams, and increased self-efficacy in the performance of nursing activities. These results indicate that an eLearning approach can support PBL in delivering a student centred learning experience.
Bayramzadeh, Sara; Joseph, Anjali; Allison, David; Shultz, Jonas; Abernathy, James
2018-07-01
This paper describes the process and tools developed as part of a multidisciplinary collaborative simulation-based approach for iterative design and evaluation of operating room (OR) prototypes. Full-scale physical mock-ups of healthcare spaces offer an opportunity to actively communicate with and to engage multidisciplinary stakeholders in the design process. While mock-ups are increasingly being used in healthcare facility design projects, they are rarely evaluated in a manner to support active user feedback and engagement. Researchers and architecture students worked closely with clinicians and architects to develop OR design prototypes and engaged clinical end-users in simulated scenarios. An evaluation toolkit was developed to compare design prototypes. The mock-up evaluation helped the team make key decisions about room size, location of OR table, intra-room zoning, and doors location. Structured simulation based mock-up evaluations conducted in the design process can help stakeholders visualize their future workspace and provide active feedback. Copyright © 2018 Elsevier Ltd. All rights reserved.
Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.
Housh, Mashor; Ohar, Ziv
2017-03-01
The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches can be distinguished for FD: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS which utilizes physically based water quality and hydraulic simulation models can outperform the signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
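The model-based EDS idea, flagging an event when sensor readings diverge from what the simulation predicts, can be sketched as a residual threshold rule. The following is a hypothetical single-station illustration, not the paper's MSEDS algorithm; the threshold, baseline window, and chlorine values are invented:

```python
import numpy as np

def detect_event(measured, simulated, k=3.0):
    """Model-based fault detection sketch: flag an event when the residual
    between a sensor reading and the value expected from the simulation
    model exceeds k standard deviations of the residuals in an assumed
    event-free baseline window."""
    residual = measured - simulated
    baseline = residual[:50]                  # assumed event-free window
    mu, sigma = baseline.mean(), baseline.std() + 1e-12
    return np.abs(residual - mu) / sigma > k

rng = np.random.default_rng(7)
simulated = np.full(100, 2.0)                 # expected chlorine, mg/L
measured = simulated + rng.normal(0, 0.05, 100)
measured[80:] -= 0.5                          # injected contamination event
alarms = detect_event(measured, simulated)
```

A signal-processing EDS would have to learn the "normal" pattern from the sensor stream alone; comparing against a physically based simulation lets normal operational changes (which the model also predicts) pass without false alarms.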
Process simulation and dynamic control for marine oily wastewater treatment using UV irradiation.
Jing, Liang; Chen, Bing; Zhang, Baiyu; Li, Pu
2015-09-15
UV irradiation and advanced oxidation processes have been recently regarded as promising solutions in removing polycyclic aromatic hydrocarbons (PAHs) from marine oily wastewater. However, such treatment methods are generally not sufficiently understood in terms of reaction mechanisms, process simulation and process control. These deficiencies can drastically hinder their application in shipping and offshore petroleum industries, which produce bilge/ballast water and produced water as the main streams of marine oily wastewater. In this study, a factorial design of experiment was carried out to investigate the degradation mechanism of a typical PAH, namely naphthalene, under UV irradiation in seawater. Based on the experimental results, a three-layer feed-forward artificial neural network simulation model was developed to simulate the treatment process and to forecast the removal performance. A simulation-based dynamic mixed integer nonlinear programming (SDMINP) approach was then proposed to intelligently control the treatment process by integrating the developed simulation model, genetic algorithm and multi-stage programming. The applicability and effectiveness of the developed approach were further tested through a case study. The experimental results showed that the influences of fluence rate and temperature on the removal of naphthalene were greater than those of salinity and initial concentration. The developed simulation model could well predict the UV-induced removal process under varying conditions. The case study suggested that the SDMINP approach, with the aid of the multi-stage control strategy, was able to significantly reduce treatment cost compared with traditional single-stage process optimization. The developed approach and its concept/framework have high potential of applicability in other environmental fields where a treatment process is involved and experimentation and modeling are used for process simulation and control.
Copyright © 2015 Elsevier Ltd. All rights reserved.
The Osseus platform: a prototype for advanced web-based distributed simulation
NASA Astrophysics Data System (ADS)
Franceschini, Derrick; Riecken, Mark
2016-05-01
Recent technological advances in web-based distributed computing and database technology have made possible a deeper and more transparent integration of some modeling and simulation applications. Despite these advances towards true integration of capabilities, disparate systems, architectures, and protocols will remain in the inventory for some time to come. These disparities present interoperability challenges for distributed modeling and simulation whether the application is training, experimentation, or analysis. Traditional approaches call for building gateways to bridge between disparate protocols and retaining interoperability specialists. Challenges in reconciling data models also persist. These challenges and their traditional mitigation approaches directly contribute to higher costs, schedule delays, and frustration for the end users. Osseus is a prototype software platform originally funded as a research project by the Defense Modeling & Simulation Coordination Office (DMSCO) to examine interoperability alternatives using modern, web-based technology and taking inspiration from the commercial sector. Osseus provides tools and services for nonexpert users to connect simulations, targeting the time and skillset needed to successfully connect disparate systems. The Osseus platform presents a web services interface to allow simulation applications to exchange data using modern techniques efficiently over Local or Wide Area Networks. Further, it provides Service Oriented Architecture capabilities such that finer granularity components such as individual models can contribute to simulation with minimal effort.
The View of Scientific Inquiry Conveyed by Simulation-Based Virtual Laboratories
ERIC Educational Resources Information Center
Chen, Sufen
2010-01-01
With an increasing number of studies evincing the effectiveness of simulation-based virtual laboratories (VLs), researchers have discussed replacing traditional laboratories. However, the approach of doing science endorsed by VLs has not been carefully examined. A survey of 233 online VLs revealed that hypothetico-deductive (HD) logic prevails in…
The Validity of Computer Audits of Simulated Cases Records.
ERIC Educational Resources Information Center
Rippey, Robert M.; And Others
This paper describes the implementation of a computer-based approach to scoring open-ended problem lists constructed to evaluate student and practitioner clinical judgment from real or simulated records. Based on 62 previously administered and scored problem lists, the program was written in BASIC for a Heathkit H11A computer (equivalent to DEC…
Examining Reactions to Employer Information Using a Simulated Web-Based Job Fair
ERIC Educational Resources Information Center
Highhouse, Scott; Stanton, Jeffrey M.; Reeve, Charlie L.
2004-01-01
The approach taken in the present investigation was to examine reactions to positive and negative employer information by eliciting online (i.e., moment-to-moment) reactions in a simulated computer-based job fair. Reactions to positive and negative information commonly reveal a negatively biased asymmetry. Positively biased asymmetries have been…
Designing Online Problem Representation Engine for Conceptual Change
ERIC Educational Resources Information Center
Lee, Chwee Beng; Ling, Keck Voon
2012-01-01
Purpose: This paper aims to describe the web-based scaffold dynamic simulation system (PRES-on) designed for pre-service teachers. Design/methodology/approach: The paper describes the initial design of a web-based scaffold dynamic simulation system (PRES-on) as a cognitive tool for learners to represent problems. For the widespread use of the…
Specifying and Refining a Measurement Model for a Simulation-Based Assessment. CSE Report 619.
ERIC Educational Resources Information Center
Levy, Roy; Mislevy, Robert J.
2004-01-01
The challenges of modeling students' performance in simulation-based assessments include accounting for multiple aspects of knowledge and skill that arise in different situations and the conditional dependencies among multiple aspects of performance in a complex assessment. This paper describes a Bayesian approach to modeling and estimating…
NASA Astrophysics Data System (ADS)
Rossman, Nathan R.; Zlotnik, Vitaly A.; Rowe, Clinton M.
2018-05-01
The feasibility of a hydrogeological modeling approach to simulate several thousand shallow groundwater-fed lakes and wetlands without explicitly considering their connection with groundwater is investigated at the regional scale (~40,000 km2) through an application in the semi-arid Nebraska Sand Hills (NSH), USA. Hydraulic heads are compared to local land-surface elevations from a digital elevation model (DEM) within a geographic information system to assess locations of lakes and wetlands. The water bodies are inferred where hydraulic heads exceed, or are above a certain depth below, the land surface. Numbers of lakes and/or wetlands are determined via image cluster analysis applied to the same 30-m grid as the DEM after interpolating both simulated and estimated heads. The regional water-table map was used for groundwater model calibration, considering MODIS-based net groundwater recharge data. Resulting values of simulated total baseflow to interior streams are within 1% of observed values. Locations, areas, and numbers of simulated lakes and wetlands are compared with Landsat 2005 survey data and with areas of lakes from a 1979-1980 Landsat survey and the National Hydrography Dataset. This simplified process-based modeling approach avoids the need for field-based morphology or water-budget data from individual lakes or wetlands, or determination of lake-groundwater exchanges, yet it reproduces observed lake-wetland characteristics at regional groundwater management scales. A better understanding of the NSH hydrogeology is attained, and the approach shows promise for use in simulations of groundwater-fed lake and wetland characteristics in other large groundwater systems.
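The mapping step described above, inferring a lake or wetland wherever the simulated head rises above (or within a tolerance below) the DEM, reduces to a raster comparison plus cluster labelling. A toy sketch follows; the grids and the 0.5 m tolerance are illustrative, not the study's values:

```python
import numpy as np
from scipy import ndimage

# Hypothetical raster grids, in metres: land-surface elevation (DEM) and
# interpolated simulated hydraulic head.
dem = np.full((10, 10), 100.0)
head = np.full((10, 10), 95.0)
head[2:4, 2:4] = 100.2        # head above land surface -> open-water lake
head[6:8, 6:9] = 99.8         # water table just below surface -> wetland

# A cell is mapped as a lake/wetland where head is above the land surface
# or within 0.5 m below it (an illustrative tolerance).
water = head > dem - 0.5

# Image cluster analysis: label connected cells and count water bodies.
labels, n_bodies = ndimage.label(water)
```

On the real 30-m grid the same two lines of raster logic run over the whole ~40,000 km2 domain, which is what makes the approach cheap compared with modeling each lake's water budget explicitly.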
Ebel, B.A.; Mirus, B.B.; Heppner, C.S.; VanderKwaak, J.E.; Loague, K.
2009-01-01
Distributed hydrologic models capable of simulating fully-coupled surface water and groundwater flow are increasingly used to examine problems in the hydrologic sciences. Several techniques are currently available to couple the surface and subsurface; the two most frequently employed approaches are first-order exchange coefficients (a.k.a., the surface conductance method) and enforced continuity of pressure and flux at the surface-subsurface boundary condition. The effort reported here examines the parameter sensitivity of simulated hydrologic response for the first-order exchange coefficients at a well-characterized field site using the fully coupled Integrated Hydrology Model (InHM). This investigation demonstrates that the first-order exchange coefficients can be selected such that the simulated hydrologic response is insensitive to the parameter choice, while simulation time is considerably reduced. Alternatively, the ability to choose a first-order exchange coefficient that intentionally decouples the surface and subsurface facilitates concept-development simulations to examine real-world situations where the surface-subsurface exchange is impaired. While the parameters comprising the first-order exchange coefficient cannot be directly estimated or measured, the insensitivity of the simulated flow system to these parameters (when chosen appropriately) combined with the ability to mimic actual physical processes suggests that the first-order exchange coefficient approach can be consistent with a physics-based framework. Copyright © 2009 John Wiley & Sons, Ltd.
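The first-order exchange coefficient can be sketched as a flux proportional to the head difference between surface and subsurface stores: a large coefficient approaches enforced head continuity, while a zero coefficient decouples the domains, the two regimes the study contrasts. A conceptual two-reservoir sketch, not the InHM formulation, with all parameters invented:

```python
def simulate_exchange(k_ex, n_steps=1000, dt=0.01):
    """Toy two-store model of surface/subsurface coupling with a
    first-order exchange coefficient: q = k_ex * (h_surface - h_sub).
    Large k_ex drives the two heads together (approaching enforced
    continuity); k_ex = 0 decouples the surface from the subsurface."""
    h_s, h_g = 1.0, 0.0                  # surface and subsurface heads, m
    for _ in range(n_steps):
        q = k_ex * (h_s - h_g)           # exchange flux, first-order law
        h_s -= q * dt                    # surface loses what...
        h_g += q * dt                    # ...the subsurface gains
    return h_s, h_g

h_s_tight, h_g_tight = simulate_exchange(k_ex=10.0)   # tightly coupled
h_s_off, h_g_off = simulate_exchange(k_ex=0.0)        # decoupled
```

Stability of this explicit update requires k_ex * dt to stay small; in a full model the analogous constraint is one reason exchange-coefficient choices affect simulation time.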
Procedural training and assessment of competency utilizing simulation.
Sawyer, Taylor; Gray, Megan M
2016-11-01
This review examines the current environment of neonatal procedural learning, describes an updated model of skills training, defines the role of simulation in assessing competency, and discusses potential future directions for simulation-based competency assessment. In order to maximize impact, simulation-based procedural training programs should follow a standardized and evidence-based approach to designing and evaluating educational activities. Simulation can be used to facilitate the evaluation of competency, but must incorporate validated assessment tools to ensure quality and consistency. True competency evaluation cannot be accomplished with simulation alone: competency assessment must also include evaluations of procedural skill during actual clinical care. Future work in this area is needed to measure and track clinically meaningful patient outcomes resulting from simulation-based training, examine the use of simulation to assist physicians undergoing re-entry to practice, and to examine the use of procedural skills simulation as part of a maintenance of competency and life-long learning. Copyright © 2016 Elsevier Inc. All rights reserved.
Greenberg, David A.
2011-01-01
Computer simulation methods are under-used tools in genetic analysis because simulation approaches have been portrayed as inferior to analytic methods. Even when simulation is used, its advantages are not fully exploited. Here, I present SHIMSHON, our package of genetic simulation programs that have been developed, tested, used for research, and used to generate data for Genetic Analysis Workshops (GAW). These simulation programs, now web-accessible, can be used by anyone to answer questions about designing and analyzing genetic disease studies for locus identification. This work has three foci: (1) the historical context of SHIMSHON's development, suggesting why simulation has not been more widely used so far. (2) Advantages of simulation: computer simulation helps us to understand how genetic analysis methods work. It has advantages for understanding disease inheritance and methods for gene searches. Furthermore, simulation methods can be used to answer fundamental questions that either cannot be answered by analytical approaches or cannot even be defined until the problems are identified and studied using simulation. (3) I argue that, because simulation was not accepted, there was a failure to grasp the meaning of some simulation-based studies of linkage. This may have contributed to perceived weaknesses in linkage analysis; weaknesses that did not, in fact, exist. PMID:22189467
NASA Astrophysics Data System (ADS)
Unni, Vineet; Sankara Narayanan, E. M.
2017-04-01
This is the first report on the numerical analysis of the performance of nanoscale vertical superjunction structures based on impurity doping and an innovative approach that utilizes the polarisation properties inherent in III-V nitride semiconductors. Such nanoscale vertical polarisation superjunction structures can be realized by employing a combination of epitaxial growth along the non-polar crystallographic axes of Wurtzite GaN and nanolithography-based processing techniques. Detailed numerical simulations clearly highlight the limitations of a doping-based approach and the advantages of the proposed solution for breaking the unipolar one-dimensional material limits of GaN by orders of magnitude.
Reinforce Networking Theory with OPNET Simulation
ERIC Educational Resources Information Center
Guo, Jinhua; Xiang, Weidong; Wang, Shengquan
2007-01-01
As networking systems have become more complex and expensive, hands-on experiments based on networking simulation have become essential for teaching the key computer networking topics to students. The simulation approach is the most cost effective and highly useful because it provides a virtual environment for an assortment of desirable features…
The National Shipbuilding Research Program, Computer Aided Process Planning for Shipyards
1986-08-01
Factory Simulation: Approach to Integration of Computer-Based Factory Simulation with Conventional Factory Planning Techniques; Financial Justification of State-of-the-Art Investment: A Study Using CAPP.
NutrientNet: An Internet-Based Approach to Teaching Market-Based Policy for Environmental Management
ERIC Educational Resources Information Center
Nguyen, To N.; Woodward, Richard T.
2009-01-01
NutrientNet is an Internet-based environment in which a class can simulate a market-based approach for improving water quality. In NutrientNet, each student receives a role as either a point source or a nonpoint source polluter, and then the participants are allowed to trade water quality credits to cost-effectively reduce pollution in a…
Wong, Ambrose H; Auerbach, Marc A; Ruppel, Halley; Crispino, Lauren J; Rosenberg, Alana; Iennaco, Joanne D; Vaca, Federico E
2018-06-01
Emergency departments (EDs) have seen harm rise for both patients and health workers from an increasing rate of agitation events. Team effectiveness during care of this population is particularly challenging because fear of physical harm leads to competing interests. Simulation is frequently employed to improve teamwork in medical resuscitations but has not yet been reported to address team-based behavioral emergency care. As part of a larger investigation of agitated patient care, we designed this secondary study to examine the impact of an interprofessional standardized patient simulation for ED agitation management. We used a mixed-methods approach with emergency medicine resident and attending physicians, Physician Assistants (PAs) and Advanced Practice Registered Nurses (APRNs), ED nurses, technicians, and security officers at two hospital sites. After a simulated agitated patient encounter, we conducted uniprofessional and interprofessional focus groups. We undertook structured thematic analysis using a grounded theory approach. Quantitative data consisted of responses to the KidSIM Questionnaire addressing teamwork and simulation-based learning attitudes before and after each session. We reached data saturation with 57 participants. KidSIM scores revealed significant improvements in attitudes toward relevance of simulation, opportunities for interprofessional education, and situation awareness, as well as four of six questions for roles/responsibilities. Two broad themes emerged from the focus groups: (1) a team-based agitated patient simulation addressed dual safety of staff and patients simultaneously and (2) the experience fostered interprofessional discovery and cooperation in agitation management. A team-based simulated agitated patient encounter highlighted the need to consider the dual safety of staff and patients while facilitating interprofessional dialog and learning. 
Our findings suggest that simulation may be effective in enhancing teamwork in behavioral emergency care.
Flight Simulator Visual-Display Delay Compensation
NASA Technical Reports Server (NTRS)
Crane, D. Francis
1981-01-01
A piloted aircraft can be viewed as a closed-loop man-machine control system. When a simulator pilot is performing a precision maneuver, a delay in the visual display of aircraft response to pilot-control input decreases the stability of the pilot-aircraft system. The less stable system is more difficult to control precisely. Pilot dynamic response and performance change as the pilot attempts to compensate for the decrease in system stability. The changes in pilot dynamic response and performance bias the simulation results by influencing the pilot's rating of the handling qualities of the simulated aircraft. The study reported here evaluated an approach to visual-display delay compensation. The objective of the compensation was to minimize delay-induced change in pilot performance and workload. The compensation was effective. Because the compensation design approach is based on well-established control-system design principles, prospects are favorable for successful application of the approach in other simulations.
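The compensation idea above, nudging the displayed state ahead to offset transport delay in the visual display, can be sketched with a first-difference lead predictor. This is an illustrative stand-in under simple assumptions (uniform sampling, backward differencing), not the report's actual compensator design:

```python
def compensate(signal, dt, delay):
    """Lead-predicts each sample 'delay' seconds ahead using a backward
    first difference: x_comp = x + delay * dx/dt. Exact for a ramp,
    approximate (and noise-amplifying) for general signals."""
    out = [signal[0]]
    for k in range(1, len(signal)):
        deriv = (signal[k] - signal[k - 1]) / dt
        out.append(signal[k] + delay * deriv)
    return out

# A ramp sampled every 0.1 s: compensating for a 0.2 s delay shifts each
# sample to the value the ramp will reach two samples later.
ramp = [0.1 * k for k in range(5)]
led = compensate(ramp, dt=0.1, delay=0.2)
```

In a real simulator the derivative would come from the vehicle state equations rather than a noisy finite difference; the sketch only shows why lead prediction cancels a pure delay for slowly varying signals.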
Virtual reality based surgery simulation for endoscopic gynaecology.
Székely, G; Bajka, M; Brechbühler, C; Dual, J; Enzler, R; Haller, U; Hug, J; Hutter, R; Ironmonger, N; Kauer, M; Meier, V; Niederer, P; Rhomberg, A; Schmid, P; Schweitzer, G; Thaler, M; Vuskovic, V; Tröster, G
1999-01-01
Virtual reality (VR) based surgical simulator systems offer very elegant possibilities to both enrich and enhance traditional education in endoscopic surgery. However, while a wide range of VR simulator systems have been proposed and realized in the past few years, most of these systems are far from able to provide a reasonably realistic surgical environment. We explore the basic approaches to the current limits of realism and ultimately seek to extend these based on our description and analysis of the most important components of a VR-based endoscopic simulator. The feasibility of the proposed techniques is demonstrated on a first modular prototype system implementing the basic algorithms for VR-training in gynaecologic laparoscopy.
ERIC Educational Resources Information Center
Sweet, Chelsea; Akinfenwa, Oyewumi; Foley, Jonathan J., IV
2018-01-01
We present an interactive discovery-based approach to studying the properties of real gases using simple, yet realistic, molecular dynamics software. Use of this approach opens up a variety of opportunities for students to interact with the behaviors and underlying theories of real gases. Students can visualize gas behavior under a variety of…
ERIC Educational Resources Information Center
Douglas-Lenders, Rachel Claire; Holland, Peter Jeffrey; Allen, Belinda
2017-01-01
Purpose: The purpose of this paper is to examine the impact of experiential simulation-based learning of employee self-efficacy. Design/Methodology/Approach: The research approach is an exploratory case study of a group of trainees from the same organisation. Using a quasi-experiment, one group, pre-test-post-test design (Tharenou et al., 2007), a…
Modeling marine oily wastewater treatment by a probabilistic agent-based approach.
Jing, Liang; Chen, Bing; Zhang, Baiyu; Ye, Xudong
2018-02-01
This study developed a novel probabilistic agent-based approach for modeling marine oily wastewater treatment processes. It begins by constructing a probability-based agent simulation model, followed by a global sensitivity analysis and a genetic algorithm-based calibration. The proposed modeling approach was tested through a case study of the removal of naphthalene from marine oily wastewater using UV irradiation. The removal of naphthalene was described by an agent-based simulation model using 8 types of agents and 11 reactions. Each reaction was governed by a probability parameter to determine its occurrence. The modeling results showed that the root mean square errors between modeled and observed removal rates were 8.73 and 11.03% for calibration and validation runs, respectively. Reaction competition was analyzed by comparing agent-based reaction probabilities, while agents' heterogeneity was visualized by plotting their real-time spatial distribution, showing a strong potential for reactor design and process optimization. Copyright © 2017 Elsevier Ltd. All rights reserved.
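The probability-governed reaction mechanism described above can be illustrated with a minimal sketch in which each contaminant agent reacts (and is removed) in a time step with a fixed probability. The agent count, reaction probability, and step count here are arbitrary stand-ins, not the study's calibrated values:

```python
import random

def simulate_removal(n_agents=1000, p_react=0.05, steps=60, seed=42):
    """Each contaminant agent survives a time step unless its reaction
    'fires', which happens with probability p_react, mimicking a
    probability-governed UV degradation reaction. Returns the final
    removal percentage and the population history."""
    rng = random.Random(seed)
    remaining = n_agents
    history = [remaining]
    for _ in range(steps):
        # each surviving agent independently tests its reaction probability
        remaining = sum(1 for _ in range(remaining) if rng.random() > p_react)
        history.append(remaining)
    removal_pct = 100.0 * (n_agents - remaining) / n_agents
    return removal_pct, history

pct, hist = simulate_removal()
```

Calibrating `p_react` against observed removal rates (the study uses a genetic algorithm for this) is what ties such a stochastic model to laboratory data.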
Quasi-static earthquake cycle simulation based on nonlinear viscoelastic finite element analyses
NASA Astrophysics Data System (ADS)
Agata, R.; Ichimura, T.; Hyodo, M.; Barbot, S.; Hori, T.
2017-12-01
To explain earthquake generation processes, simulation methods of earthquake cycles have been studied. For such simulations, the combination of the rate- and state-dependent friction law at the fault plane and the boundary integral method based on Green's function in an elastic half space is widely used (e.g. Hori 2009; Barbot et al. 2012). In this approach, stress change around the fault plane due to crustal deformation can be computed analytically, while the effects of complex physics such as mantle rheology and gravity are generally not taken into account. To consider such effects, we seek to develop an earthquake cycle simulation combining crustal deformation computation based on the finite element (FE) method with the rate- and state-dependent friction law. Since the drawback of this approach is the computational cost associated with obtaining numerical solutions, we adopt a recently developed fast and scalable FE solver (Ichimura et al. 2016), which assumes use of supercomputers, to solve the problem in a realistic time. As in the previous approach, we solve the governing equations consisting of the rate- and state-dependent friction law. In solving the equations, we compute stress changes along the fault plane due to crustal deformation using FE simulation, instead of computing them by superimposing slip response function as in the previous approach. In stress change computation, we take into account nonlinear viscoelastic deformation in the asthenosphere. In the presentation, we will show simulation results in a normative three-dimensional problem, where a circular-shaped velocity-weakening area is set in a square-shaped fault plane. The results with and without nonlinear viscosity in the asthenosphere will be compared. We also plan to apply the developed code to simulate the post-earthquake deformation of a megathrust earthquake, such as the 2011 Tohoku earthquake. 
Acknowledgment: The results were obtained using the K computer at the RIKEN (Proposal number hp160221).
NASA Technical Reports Server (NTRS)
Connor, S. A.; Wierwille, W. W.
1983-01-01
A comparison of the sensitivity and intrusion of twenty pilot workload assessment techniques was conducted using a psychomotor loading task in a three-degree-of-freedom moving-base aircraft simulator. The twenty techniques included opinion measures, spare mental capacity measures, physiological measures, eye behavior measures, and primary task performance measures. The primary task was an instrument landing system (ILS) approach and landing. All measures were recorded between the outer marker and the middle marker on the approach. Three levels (low, medium, and high) of psychomotor load were obtained by the combined manipulation of wind-gust disturbance level and simulated aircraft pitch stability. Six instrument-rated pilots participated in four sessions lasting approximately three hours each.
Simulation of decelerating landing approaches on an externally blown flap STOL transport airplane
NASA Technical Reports Server (NTRS)
Grantham, W. D.; Nguyen, L. T.; Deal, P. L.
1974-01-01
A fixed-base simulator program was conducted to define the problems, and methods for their solution, associated with performing decelerating landing approaches on a representative STOL transport having a high wing and equipped with an external-flow jet flap in combination with four high-bypass-ratio fan-jet engines. Real-time digital simulation techniques were used. The computer was programmed with equations of motion for six degrees of freedom, and the aerodynamic inputs were based on measured wind-tunnel data. The pilot's task was to capture the localizer and the glide slope and to maintain them as closely as possible while decelerating from an initial airspeed of 140 knots to a final airspeed of 75 knots under IFR conditions.
A simulation-based approach for solving assembly line balancing problem
NASA Astrophysics Data System (ADS)
Wu, Xiaoyu
2017-09-01
Assembly line balancing is directly related to production efficiency; the problem has been discussed since the last century and is still widely studied. In this paper, the assembly line balancing problem is studied by establishing a mathematical model and simulation. First, the model for determining the smallest production beat (cycle time) under a given number of work stations is analyzed. Based on this model, the exponential smoothing approach is applied to improve the algorithm's efficiency. After this groundwork, the balancing problem of a gas Stirling engine assembly line is discussed as a case study. Both algorithms are implemented in the Lingo programming environment, and the simulation results demonstrate the validity of the new methods.
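The station-assignment side of line balancing can be illustrated with a greedy longest-task-first heuristic; this is a generic textbook heuristic with invented task times, not the paper's exponential-smoothing algorithm, and it ignores precedence constraints for simplicity:

```python
def balance_line(task_times, n_stations):
    """Greedy longest-task-first assignment: each task goes to the
    currently least-loaded station. The resulting makespan is an upper
    bound on the smallest feasible production beat (cycle time)."""
    loads = [0.0] * n_stations
    assignment = [[] for _ in range(n_stations)]
    for task, t in sorted(task_times.items(), key=lambda kv: -kv[1]):
        i = min(range(n_stations), key=lambda j: loads[j])  # least-loaded station
        loads[i] += t
        assignment[i].append(task)
    return max(loads), assignment

# five tasks, two stations: total work is 12, so the beat cannot go below 6
beat, plan = balance_line({"a": 4, "b": 3, "c": 2, "d": 2, "e": 1}, 2)
```

Here the heuristic happens to reach the lower bound of 6; in general it only approximates the optimum that an exact model (such as one formulated in Lingo) would find.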
Yousefi, M; Ferreira, R P M
2017-03-30
This study presents an agent-based simulation model of an emergency department. In a traditional approach, a supervisor (or a manager) allocates the resources (receptionist, nurses, doctors, etc.) to different sections based on personal experience or by using decision-support tools. In this study, each staff agent took part in the process of allocating resources based on their observation in their respective sections, which gave the system the advantage of utilizing all the available human resources during the workday by being allocated to a different section. In this simulation, unlike previous studies, all staff agents took part in the decision-making process to re-allocate the resources in the emergency department. The simulation modeled the behavior of patients, receptionists, triage nurses, emergency room nurses and doctors. Patients were able to decide whether to stay in the system or leave the department at any stage of treatment. In order to evaluate the performance of this approach, 6 different scenarios were introduced. In each scenario, various key performance indicators were investigated before and after applying the group decision-making. The outputs of each simulation were the number of deaths, the number of patients who left the emergency department without being attended, length of stay, waiting time and the total number of discharged patients. Applying the self-organizing approach in the simulation showed average decreases of 12.7% and 14.4% in total waiting time and in the number of patients who left without being seen, respectively. The results showed an average increase of 11.5% in the total number of patients discharged from the emergency department.
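The self-organizing re-allocation step can be sketched as a single round of group decision-making in which sections are compared by observed congestion; the section names, queue lengths, and staffing numbers below are invented for illustration and are much simpler than the study's full agent model:

```python
def reallocate(queues, staff):
    """One round of group decision-making: sections are compared by the
    congestion each staff agent observes (queue length per staff member),
    and the least congested section donates one staff member to the most
    congested one."""
    load = {s: queues[s] / max(staff[s], 1) for s in queues}
    busiest = max(load, key=load.get)
    idlest = min(load, key=load.get)
    if busiest != idlest and staff[idlest] > 1:
        staff[idlest] -= 1   # donor keeps at least one staff member
        staff[busiest] += 1
    return staff

# triage is congested (10 patients / 2 staff), treatment is idle (2 / 4)
staff = reallocate({"triage": 10, "treatment": 2}, {"triage": 2, "treatment": 4})
```

Repeating such rounds during a simulated workday is what lets staffing track demand without a central supervisor.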
Can we approach the gas-liquid critical point using slab simulations of two coexisting phases?
Goujon, Florent; Ghoufi, Aziz; Malfreyt, Patrice; Tildesley, Dominic J
2016-09-28
In this paper, we demonstrate that it is possible to approach the gas-liquid critical point of the Lennard-Jones fluid by performing simulations in a slab geometry using a cut-off potential. In the slab simulation geometry, it is essential to apply an accurate tail correction to the potential energy, applied during the course of the simulation, to study the properties of states close to the critical point. Using the Janeček slab-based method developed for two-phase Monte Carlo simulations [J. Janeček, J. Chem. Phys. 131, 6264 (2006)], the coexisting densities and surface tension in the critical region are reported as a function of the cutoff distance in the intermolecular potential. The results obtained using slab simulations are compared with those obtained using grand canonical Monte Carlo simulations of isotropic systems and the finite-size scaling techniques. There is a good agreement between these two approaches. The two-phase simulations can be used in approaching the critical point for temperatures up to 0.97T_c* (T* = 1.26). The critical-point exponents describing the dependence of the density, surface tension, and interfacial thickness on the temperature are calculated near the critical point.
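The size of the truncation error being corrected can be seen in the homogeneous limit. The sketch below evaluates the standard isotropic Lennard-Jones tail correction per particle in reduced units; the slab-based Janeček correction discussed above generalizes this to a z-dependent density profile, so this uniform-density form only illustrates why the correction matters:

```python
import math

def lj_tail_energy(rho, rc, sigma=1.0, epsilon=1.0):
    """Per-particle long-range (tail) correction to the Lennard-Jones
    energy for a potential truncated at rc, assuming a uniform density:
    u_tail = (8/3)*pi*rho*eps*sigma^3 * [(1/3)*(sigma/rc)^9 - (sigma/rc)^3]."""
    sr3 = (sigma / rc) ** 3
    return (8.0 / 3.0) * math.pi * rho * epsilon * sigma ** 3 * (sr3 ** 3 / 3.0 - sr3)

# at liquid-like density the correction is sizeable for a short cutoff
u_short = lj_tail_energy(0.8, 2.5)
u_long = lj_tail_energy(0.8, 5.0)
```

The attractive tail dominates, so the correction is negative and shrinks as the cutoff grows; near the critical point, where properties are exquisitely sensitive to the effective potential, neglecting it shifts the apparent coexistence curve.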
NASA Technical Reports Server (NTRS)
Middleton, D. B.; Hurt, G. J., Jr.; Bergeron, H. P.; Patton, J. M., Jr.; Deal, P. L.; Champine, R. A.
1975-01-01
A moving-base simulator investigation of the problems of recovery and landing of a STOL aircraft after failure of an outboard engine during final approach was made. The approaches were made at 75 knots along a 6 deg glide slope. The engine was failed at low altitude and the option to go around was not allowed. The aircraft was simulated with each of three control systems, and it had four high-bypass-ratio fan-jet engines exhausting against large triple-slotted wing flaps to produce additional lift. A virtual-image out-the-window television display of a simulated STOL airport was operating during part of the investigation. Also, a simple heads-up flight director display superimposed on the airport landing scene was used by the pilots to make some of the recoveries following an engine failure. The results of the study indicated that the variation in visual cues and/or motion cues had little effect on the outcome of a recovery, but they did have some effect on the pilot's response and control patterns.
Design by Dragging: An Interface for Creative Forward and Inverse Design with Simulation Ensembles
Coffey, Dane; Lin, Chi-Lun; Erdman, Arthur G.; Keefe, Daniel F.
2014-01-01
We present an interface for exploring large design spaces as encountered in simulation-based engineering, design of visual effects, and other tasks that require tuning parameters of computationally-intensive simulations and visually evaluating results. The goal is to enable a style of design with simulations that feels as-direct-as-possible so users can concentrate on creative design tasks. The approach integrates forward design via direct manipulation of simulation inputs (e.g., geometric properties, applied forces) in the same visual space with inverse design via “tugging” and reshaping simulation outputs (e.g., scalar fields from finite element analysis (FEA) or computational fluid dynamics (CFD)). The interface includes algorithms for interpreting the intent of users’ drag operations relative to parameterized models, morphing arbitrary scalar fields output from FEA and CFD simulations, and in-place interactive ensemble visualization. The inverse design strategy can be extended to use multi-touch input in combination with an as-rigid-as-possible shape manipulation to support rich visual queries. The potential of this new design approach is confirmed via two applications: medical device engineering of a vacuum-assisted biopsy device and visual effects design using a physically based flame simulation. PMID:24051845
Aguirre-Urreta, Miguel I; Ellis, Michael E; Sun, Wenying
2012-03-01
This research investigates the performance of a proportion-based approach to meta-analytic moderator estimation through a series of Monte Carlo simulations. This approach is most useful when the moderating potential of a categorical variable has not been recognized in primary research and thus heterogeneous groups have been pooled together as a single sample. Alternative scenarios representing different distributions of group proportions are examined along with varying numbers of studies, subjects per study, and correlation combinations. Our results suggest that the approach is largely unbiased in its estimation of the magnitude of between-group differences and performs well with regard to statistical power and type I error. In particular, the average percentage bias of the estimated correlation for the reference group is positive and largely negligible, in the 0.5–1.8% range; the average percentage bias of the difference between correlations is also minimal, in the −0.1 to 1.2% range. Further analysis also suggests both biases decrease as the magnitude of the underlying difference increases, as the number of subjects in each simulated primary study increases, and as the number of simulated studies in each meta-analysis increases. The bias was most evident when the number of subjects and the number of studies were the smallest (80 and 36, respectively). A sensitivity analysis that examines its performance in scenarios down to 12 studies and 40 primary subjects is also included. This research is the first that thoroughly examines the adequacy of the proportion-based approach. Copyright © 2012 John Wiley & Sons, Ltd.
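The core idea can be sketched in miniature: if each study's pooled correlation mixes two subgroups in proportion p, regressing observed correlations on p recovers both the reference-group correlation (intercept) and the between-group difference (slope). The linear mixture rule and the noise level below are simplifying assumptions of this sketch, not the paper's full Monte Carlo design:

```python
import random

def proportion_moderator(n_studies=200, rho_a=0.5, rho_b=0.2, seed=1):
    """Simulates studies whose pooled correlation is a proportion-weighted
    mixture r = p*rho_a + (1-p)*rho_b plus sampling noise, then recovers
    (rho_a - rho_b) as the OLS slope on p and rho_b as the intercept."""
    rng = random.Random(seed)
    ps = [rng.uniform(0.1, 0.9) for _ in range(n_studies)]
    rs = [p * rho_a + (1 - p) * rho_b + rng.gauss(0.0, 0.02) for p in ps]
    # ordinary least squares by hand
    mp = sum(ps) / n_studies
    mr = sum(rs) / n_studies
    slope = (sum((p - mp) * (r - mr) for p, r in zip(ps, rs))
             / sum((p - mp) ** 2 for p in ps))
    intercept = mr - slope * mp
    return slope, intercept

diff, ref = proportion_moderator()   # should approach 0.3 and 0.2
```

Running many such replications and tabulating the bias of `diff` and `ref` is the essence of the Monte Carlo evaluation described above.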
NASA Astrophysics Data System (ADS)
Caviedes-Voullième, Daniel; García-Navarro, Pilar; Murillo, Javier
2012-07-01
Hydrological simulation of rain-runoff processes is often performed with lumped models which rely on calibration to generate storm hydrographs and study catchment response to rain. In this paper, a distributed, physically-based numerical model is used for runoff simulation in a mountain catchment. This approach offers two advantages. The first is that by using shallow-water equations for runoff flow, there is less freedom to calibrate routing parameters (as compared to, for example, synthetic hydrograph methods). The second is that spatial distributions of water depth and velocity can be obtained. Furthermore, interactions among the various hydrological processes can be modeled in a physically-based approach which may depend on transient and spatially distributed factors. On the other hand, the undertaken numerical approach relies on accurate terrain representation and mesh selection, which also significantly affects the computational cost of the simulations. Hence, we investigate the response of a gauged catchment with this distributed approach. The methodology consists of analyzing the effects that the mesh has on the simulations by using a range of meshes. Next, friction is applied to the model and the response to variations and interaction with the mesh is studied. Finally, a first approach with the well-known SCS Curve Number method is studied to evaluate its behavior when coupled with a shallow-water model for runoff flow. The results show that mesh selection is of great importance, since it may affect the results in a magnitude as large as physical factors, such as friction. Furthermore, results proved to be less sensitive to roughness spatial distribution than to mesh properties. Finally, the results indicate that SCS-CN may not be suitable for simulating hydrological processes together with a shallow-water model.
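The SCS Curve Number method evaluated above converts a rainfall depth and a tabulated CN into direct runoff. A minimal metric-units implementation of the standard SCS formulas (this is the textbook method, not the paper's coupled shallow-water model):

```python
def scs_runoff(p_mm, cn):
    """SCS Curve Number direct runoff, all depths in mm. S is the
    potential maximum retention and Ia = 0.2*S the initial abstraction;
    rainfall at or below Ia produces no runoff."""
    s = 25400.0 / cn - 254.0      # metric form of S = 1000/CN - 10 (inches)
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

For CN = 100 (fully impervious) all rainfall runs off, while a pervious CN of 75 turns a 50 mm storm into under 10 mm of runoff; coupling this lumped loss model to a distributed shallow-water solver is exactly the combination whose consistency the paper questions.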
NASA Astrophysics Data System (ADS)
Žukovič, Milan; Hristopulos, Dionissios T.
2009-02-01
A current problem of practical significance is how to analyze large, spatially distributed, environmental data sets. The problem is more challenging for variables that follow non-Gaussian distributions. We show by means of numerical simulations that the spatial correlations between variables can be captured by interactions between 'spins'. The spins represent multilevel discretizations of environmental variables with respect to a number of pre-defined thresholds. The spatial dependence between the 'spins' is imposed by means of short-range interactions. We present two approaches, inspired by the Ising and Potts models, that generate conditional simulations of spatially distributed variables from samples with missing data. Currently, the sampling and simulation points are assumed to be at the nodes of a regular grid. The conditional simulations of the 'spin system' are forced to respect locally the sample values and the system statistics globally. The second constraint is enforced by minimizing a cost function representing the deviation between normalized correlation energies of the simulated and the sample distributions. In the approach based on the Nc-state Potts model, each point is assigned to one of Nc classes. The interactions involve all the points simultaneously. In the Ising model approach, a sequential simulation scheme is used: the discretization at each simulation level is binomial (i.e., ± 1). Information propagates from lower to higher levels as the simulation proceeds. We compare the two approaches in terms of their ability to reproduce the target statistics (e.g., the histogram and the variogram of the sample distribution), to predict data at unsampled locations, as well as in terms of their computational complexity. The comparison is based on a non-Gaussian data set (derived from a digital elevation model of the Walker Lake area, Nevada, USA). 
We discuss the impact of relevant simulation parameters, such as the domain size, the number of discretization levels, and the initial conditions.
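The conditional character of the spin-based simulations, sample values held fixed while missing cells relax under short-range interactions, can be sketched with a tiny Metropolis update of an Ising model on a grid. The grid size, coupling, and temperature below are arbitrary choices for illustration, and this omits the cost-function constraint on the sample statistics described above:

```python
import math
import random

def ising_fill(grid, beta=2.0, sweeps=50, seed=0):
    """Fills missing cells (None) of a square +/-1 grid by Metropolis
    updates of a nearest-neighbour ferromagnetic Ising model. Observed
    cells stay fixed, so the simulation is conditional on the data."""
    rng = random.Random(seed)
    n = len(grid)
    free = [(i, j) for i in range(n) for j in range(n) if grid[i][j] is None]
    for i, j in free:
        grid[i][j] = rng.choice((-1, 1))          # random initialization
    for _ in range(sweeps):
        for i, j in free:
            # local field from up to 4 nearest neighbours (open boundaries)
            h = sum(grid[x][y]
                    for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= x < n and 0 <= y < n)
            dE = 2.0 * grid[i][j] * h             # energy cost of flipping
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                grid[i][j] = -grid[i][j]
    return grid

# a missing cell surrounded by +1 observations relaxes to +1 at low temperature
filled = ising_fill([[1, 1, 1], [1, None, 1], [1, 1, 1]])
```

The multilevel (Potts) variant replaces the binary spin with one of Nc classes per threshold level, but the conditional Metropolis mechanics are the same.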
Translating building information modeling to building energy modeling using model view definition.
Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei
2014-01-01
This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.
Finite-element approach to Brownian dynamics of polymers.
Cyron, Christian J; Wall, Wolfgang A
2009-12-01
In recent decades, simulation tools for Brownian dynamics of polymers have attracted more and more interest. Such simulation tools have been applied to a large variety of problems and have accelerated scientific progress significantly. However, the currently most frequently used explicit bead models exhibit severe limitations, especially with respect to time-step size, the necessity of artificial constraints and the lack of a sound mathematical foundation. Here we present a framework for simulations of Brownian polymer dynamics based on the finite-element method. This approach allows simulating a wide range of physical phenomena at a highly attractive computational cost, on the basis of a well-developed mathematical background.
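For context, the explicit bead model that the finite-element framework improves upon advances each bead by an overdamped Langevin (Euler-Maruyama) step; it is the small-time-step requirement of exactly this kind of scheme that motivates the FE approach. A minimal 1D bead-spring sketch, with all parameters arbitrary:

```python
import math
import random

def bd_step(x, dt, k=1.0, kT=1.0, gamma=1.0, rng=random):
    """One explicit Euler-Maruyama step of overdamped Brownian dynamics
    for a 1D bead-spring chain: Hookean forces between neighbouring beads
    plus a thermal kick of variance 2*kT*dt/gamma per bead."""
    n = len(x)
    f = [0.0] * n
    for i in range(n - 1):
        ext = x[i + 1] - x[i]       # spring extension between beads i and i+1
        f[i] += k * ext
        f[i + 1] -= k * ext
    sd = math.sqrt(2.0 * kT * dt / gamma)
    return [xi + dt * fi / gamma + rng.gauss(0.0, sd) for xi, fi in zip(x, f)]
```

Setting kT = 0 isolates the deterministic drift: a stretched two-bead chain relaxes toward its rest length, while the stochastic term (restored for kT > 0) supplies the Brownian forcing.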
Pilling, Valerie K; Brannon, Laura A
2007-01-01
Health communication appeals were utilized through a Web site simulation to evaluate the potential effectiveness of 3 intervention approaches to promote responsible drinking among college students. Within the Web site simulation, participants were exposed to a persuasive message designed to represent either the generalized social norms advertising approach (based on others' behavior), the personalized behavioral feedback approach (tailored to the individual's behavior), or the schema-based approach (tailored to the individual's self-schema, or personality). A control group was exposed to a message that was designed to be neutral (it was designed to discourage heavy drinking, but it did not represent any of the previously mentioned approaches). It was hypothesized that the more personalized the message was to the individual, the more favorable college students' attitudes would be toward the responsible drinking message. Participants receiving the more personalized messages did report more favorable attitudes toward the responsible drinking message.
NASA Astrophysics Data System (ADS)
Mohammed, K.; Islam, A. S.; Khan, M. J. U.; Das, M. K.
2017-12-01
With the large number of hydrologic models now available, along with global weather and geographic datasets, the streamflow of almost any river in the world can be modeled. If a reasonable amount of observed data from that river is available, highly accurate simulations can sometimes be achieved by calibrating the model parameters against those observations through inverse modeling. Although such calibrated models can simulate the general trend or mean of the observed flows very well, more often than not they fail to adequately simulate the extreme flows. This causes difficulty in tasks such as generating reliable projections of future changes in extreme flows due to climate change, an important task given how closely floods and droughts are connected to people's lives and livelihoods. We propose an approach in which the outputs of a physically based hydrologic model are used as input to a machine learning model to better simulate the extreme flows. To demonstrate this offline-coupling approach, the Soil and Water Assessment Tool (SWAT) was selected as the physically based hydrologic model, an Artificial Neural Network (ANN) as the machine learning model, and the Ganges-Brahmaputra-Meghna (GBM) river system as the study area. The GBM river system, located in South Asia, is the third largest in the world in terms of freshwater generated and forms the largest delta in the world. The flows of the GBM rivers were simulated separately in order to test the performance of the proposed approach in accurately simulating the extreme flows generated by basins that vary in size, climate, hydrology, and anthropogenic intervention on stream networks. Results show that by post-processing the simulated flows of the SWAT models with ANN models, simulations of extreme flows can be significantly improved.
The mean absolute errors in simulating annual maximum/minimum daily flows were reduced from 4967 cusecs to 1294 cusecs for the Ganges, from 5695 cusecs to 2115 cusecs for the Brahmaputra, and from 689 cusecs to 321 cusecs for the Meghna. Using this approach, simulations of hydrologic variables other than streamflow can also be improved, provided that a reasonable amount of observed data for that variable is available.
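The offline-coupling idea can be illustrated with a toy stand-in: a small neural network post-processes "simulated" flows toward "observed" ones. Synthetic data replace the SWAT output, and the tiny hand-rolled network replaces the paper's ANN; nothing here reproduces the GBM setup.

```python
import numpy as np

# Toy offline coupling: a one-hidden-layer network learns a correction from
# hydrologic-model output to observations. Data and network are illustrative.
rng = np.random.default_rng(1)
sim = rng.uniform(1000.0, 6000.0, size=(200, 1))   # stand-in "SWAT" flows (cusecs)
obs = 1.3 * sim - 50.0                             # hypothetical observed flows

# scale both series to [0, 1] for stable training
lo, hi = sim.min(), sim.max()
z = (sim - lo) / (hi - lo)
olo, ohi = obs.min(), obs.max()
y = (obs - olo) / (ohi - olo)

W1 = rng.normal(scale=0.5, size=(1, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.2
for _ in range(5000):                              # full-batch gradient descent on MSE
    h = np.tanh(z @ W1 + b1)
    err = (h @ W2 + b2) - y
    gW2 = h.T @ err / len(z); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h ** 2)
    gW1 = z.T @ dh / len(z); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

corrected = (np.tanh(z @ W1 + b1) @ W2 + b2) * (ohi - olo) + olo
raw_mae = np.abs(obs - sim).mean()
ann_mae = np.abs(obs - corrected).mean()
```

On this easy synthetic relationship the corrected flows track the observations far more closely than the raw simulation does.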
Assessment methodology for computer-based instructional simulations.
Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J
2013-10-01
Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
Probabilistic load simulation: Code development status
NASA Astrophysics Data System (ADS)
Newell, J. F.; Ho, H.
1991-05-01
The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra experienced by space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system, which also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge-based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
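A probabilistic load simulation reduces, at its core, to Monte Carlo sampling of component load distributions and combining them into a composite spectrum. The sketch below uses invented distributions and weights, not the CLS load models.

```python
import numpy as np

# Monte Carlo sketch of composite load simulation: individual engine load
# components are sampled from assumed distributions, combined with assumed
# weights, and a design percentile of the composite load is read off.
rng = np.random.default_rng(42)
n = 100_000
thrust_osc = rng.normal(1.00, 0.05, n)     # normalized steady + oscillatory thrust
pressure   = rng.lognormal(0.0, 0.10, n)   # chamber-pressure load factor
thermal    = rng.normal(0.20, 0.02, n)     # thermal load contribution

composite = 0.6 * thrust_osc + 0.3 * pressure + 0.1 * thermal
p99 = np.quantile(composite, 0.99)         # 99th-percentile design load
```

In the CLS framework the knowledge base would supply the distributions and correlations; here they are hard-coded for illustration.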
NASA Technical Reports Server (NTRS)
Cardullo, Frank M.; Lewis, Harold W., III; Panfilov, Peter B.
2007-01-01
An extremely innovative approach is presented: the surgeon operates through a simulator running in real time, enhanced with an intelligent controller component to improve the safety and efficiency of a remotely conducted operation. The use of a simulator enables the surgeon to operate in a virtual environment free from the impediments of telecommunication delay. The simulator functions as a predictor, and periodically the simulator state is corrected with truth data. Three major research areas must be explored in order to achieve the objectives: simulator as predictor, image processing, and intelligent control. Each is equally necessary for the success of the project, and each involves a significant intelligent component. These are diverse, interdisciplinary areas of investigation, requiring a highly coordinated effort by all members of our team to ensure an integrated system. The following is a brief discussion of those areas. Simulator as predictor: The delays encountered in remote robotic surgery will be greater than any encountered in human-machine systems analysis, with the possible exception of remote operations in space. Therefore, novel compensation techniques will be developed, including the development of the real-time simulator that is at the heart of our approach. The simulator will present real-time stereoscopic images and artificial haptic stimuli to the surgeon. Image processing: Because of the delay and the possibility of insufficient bandwidth, a high level of novel image processing is necessary. This image processing will include several innovative aspects, including image interpretation, video-to-graphical conversion, texture extraction, geometric processing, image compression, and image generation at the surgeon station.
Intelligent control: Since the approach we propose is in a sense predictor-based, albeit with a very sophisticated predictor, a controller that not only optimizes end-effector trajectory but also avoids error is essential. We propose to investigate two different approaches to the controller design: one employs an optimal controller based on modern control theory; the other involves soft-computing techniques, i.e., fuzzy logic, neural networks, genetic algorithms, and hybrids of these.
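The simulator-as-predictor idea can be demonstrated in miniature: instead of showing the operator the delayed remote state, show a locally extrapolated estimate that is periodically corrected by telemetry. A first-order extrapolator stands in for the full surgical simulator; the delay and signal are invented.

```python
import numpy as np

# Delay compensation sketch: compare (a) displaying the raw delayed telemetry
# with (b) extrapolating the newest telemetry across the known delay.
dt, delay_steps = 0.01, 20                 # assumed 200 ms one-way delay
t = np.arange(0.0, 10.0, dt)
true_state = np.sin(t)                     # remote "tool position"

pred = np.zeros_like(true_state)
for i in range(len(t)):
    j = max(i - delay_steps, 0)            # newest telemetry available locally
    rate = (true_state[j] - true_state[max(j - 1, 0)]) / dt
    pred[i] = true_state[j] + rate * delay_steps * dt   # extrapolate across delay

delayed_err = np.abs(true_state[delay_steps:] - true_state[:-delay_steps]).mean()
pred_err = np.abs(true_state[delay_steps:] - pred[delay_steps:]).mean()
```

Even this crude predictor cuts the apparent tracking error by roughly an order of magnitude on a smooth trajectory, which is the effect the real-time simulator exploits at much higher fidelity.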
A Survey of Model Evaluation Approaches with a Tutorial on Hierarchical Bayesian Methods
ERIC Educational Resources Information Center
Shiffrin, Richard M.; Lee, Michael D.; Kim, Woojae; Wagenmakers, Eric-Jan
2008-01-01
This article reviews current methods for evaluating models in the cognitive sciences, including theoretically based approaches, such as Bayes factors and minimum description length measures; simulation approaches, including model mimicry evaluations; and practical approaches, such as validation and generalization measures. This article argues…
Mixed-field GCR Simulations for Radiobiological Research Using Ground Based Accelerators
NASA Technical Reports Server (NTRS)
Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis A.
2014-01-01
Space radiation is comprised of a large number of particle types and energies, which have differential ionization power from high energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and dosimetry, electronics parts, and shielding testing using mono-energetic beams for single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements the rapid switching mode and the higher energy beam extraction to 1.5 GeV/u, can integrate multiple ions into a single simulation to create GCR Z-spectrum in major energy bins. After considering the GCR environment and energy limitations of NSRL, a GCR reference field is proposed after extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding within 20% accuracy compared to simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to consider chronic GCR exposure of up to 3 years in relation to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposure of up to a few weeks and fractionation approaches at a GCR simulator.
Some Psychometric and Design Implications of Game-Based Learning Analytics
ERIC Educational Resources Information Center
Gibson, David; Clarke-Midura, Jody
2013-01-01
The rise of digital game and simulation-based learning applications has led to new approaches in educational measurement that take account of patterns in time, high resolution paths of action, and clusters of virtual performance artifacts. The new approaches, which depart from traditional statistical analyses, include data mining, machine…
Quasi-classical approaches to vibronic spectra revisited
NASA Astrophysics Data System (ADS)
Karsten, Sven; Ivanov, Sergei D.; Bokarev, Sergey I.; Kühn, Oliver
2018-03-01
The framework to approach quasi-classical dynamics in the electronic ground state is well established and is based on the Kubo-transformed time correlation function (TCF), being the most classical-like quantum TCF. Here we discuss whether the choice of the Kubo-transformed TCF as a starting point for simulating vibronic spectra is as unambiguous as it is for vibrational ones. Employing imaginary-time path integral techniques in combination with the interaction representation allowed us to formulate a method for simulating vibronic spectra in the adiabatic regime that takes nuclear quantum effects and dynamics on multiple potential energy surfaces into account. Further, a generalized quantum TCF is proposed that contains many well-established TCFs, including the Kubo one, as particular cases. Importantly, it also provides a framework to construct new quantum TCFs. Applying the developed methodology to the generalized TCF leads to a plethora of simulation protocols, which are based on the well-known TCFs as well as on new ones. Their performance is investigated on 1D anharmonic model systems at finite temperatures. It is shown that the protocols based on the new TCFs may lead to superior results with respect to those based on the common ones. The strategies to find the optimal approach are discussed.
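The starting point the authors question is conventionally written as follows (standard textbook form of the Kubo-transformed TCF, not quoted verbatim from the paper):

```latex
% Kubo-transformed quantum time correlation function (standard form)
C^{\mathrm{Kubo}}_{AB}(t) \;=\; \frac{1}{\beta Z}\int_0^{\beta}\! d\lambda\;
  \mathrm{Tr}\!\left[ e^{-(\beta-\lambda)\hat H}\,\hat A\,
  e^{-\lambda \hat H}\, e^{i\hat H t/\hbar}\,\hat B\, e^{-i\hat H t/\hbar} \right],
\qquad Z = \mathrm{Tr}\, e^{-\beta \hat H}.
```

The generalized TCF proposed in the paper contains this form as one particular case among many.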
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Yankov, Artem; Collins, Benjamin; Klein, Markus; ...
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the "two-step" method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
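The stochastic sampling idea behind XSUSA can be sketched generically: perturb an uncertain input many times, rerun the simulator, and read the output uncertainty from the sample. The "core model" below is a one-line placeholder, and all numbers are invented.

```python
import numpy as np

# Stochastic-sampling uncertainty propagation in miniature: sample a single
# uncertain "cross section" and collect the spread of the resulting k_eff.
rng = np.random.default_rng(7)
sigma_a_nom, rel_unc = 0.012, 0.03          # nominal value, 3% relative std (assumed)

def toy_keff(sigma_a):
    """Placeholder core simulator: k_eff falls as absorption rises."""
    return 1.25 * sigma_a_nom / sigma_a

samples = rng.normal(sigma_a_nom, rel_unc * sigma_a_nom, size=5000)
keff = toy_keff(samples)
k_mean, k_std = keff.mean(), keff.std()     # sampled output uncertainty
```

In the real benchmark the perturbed inputs are full covariance-consistent cross section libraries and the model is a core simulator, but the sampling logic is the same.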
Krüger, Dennis M; Ahmed, Aqeel; Gohlke, Holger
2012-07-01
The NMSim web server implements a three-step approach for multiscale modeling of protein conformational changes. First, the protein structure is coarse-grained using the FIRST software. Second, a rigid cluster normal-mode analysis provides low-frequency normal modes. Third, these modes are used to extend the recently introduced idea of constrained geometric simulations by biasing backbone motions of the protein, whereas side chain motions are biased toward favorable rotamer states (NMSim). The generated structures are iteratively corrected regarding steric clashes and stereochemical constraint violations. The approach allows performing three simulation types: unbiased exploration of conformational space; pathway generation by a targeted simulation; and radius of gyration-guided simulation. On a data set of proteins with experimentally observed conformational changes, the NMSim approach has been shown to be a computationally efficient alternative to molecular dynamics simulations for conformational sampling of proteins. The generated conformations and pathways of conformational transitions can serve as input to docking approaches or more sophisticated sampling techniques. The web server output is a trajectory of generated conformations, Jmol representations of the coarse-graining and a subset of the trajectory, and data plots of structural analyses. The NMSim web server, accessible at http://www.nmsim.de, is free and open to all users with no login requirement.
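Step two, normal-mode analysis, can be shown on a toy system: build the Hessian of a one-dimensional harmonic bead chain, diagonalize it, and displace the structure along the softest non-trivial mode. A real rigid-cluster NMA on a protein is far more involved; parameters here are arbitrary.

```python
import numpy as np

# Minimal normal-mode analysis: Hessian of a 1-D harmonic bead chain,
# eigendecomposition, and a displacement biased along the lowest mode.
n, k = 8, 1.0
H = np.zeros((n, n))                       # Hessian of the chain potential
for i in range(n - 1):                     # bond between beads i and i+1
    H[i, i] += k; H[i + 1, i + 1] += k
    H[i, i + 1] -= k; H[i + 1, i] -= k

evals, evecs = np.linalg.eigh(H)           # ascending eigenvalues (frequencies^2)
# index 0 is the zero-frequency rigid translation; index 1 is the softest mode
soft_mode = evecs[:, 1]
displaced = np.arange(n, dtype=float) + 0.3 * soft_mode   # bias along the mode
```

Low-frequency modes like `soft_mode` describe the largest-amplitude collective motions, which is why NMSim uses them to steer backbone displacements.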
New approaches in agent-based modeling of complex financial systems
NASA Astrophysics Data System (ADS)
Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei
2017-12-01
Agent-based modeling is a powerful simulation technique for understanding the collective behavior and microscopic interactions in complex financial systems. Recently, the concept of determining the key parameters of agent-based models from empirical data, instead of setting them artificially, was suggested. We first review several agent-based models and the new approaches to determining the key model parameters from historical market data. Based on the agents' behaviors with heterogeneous personal preferences and interactions, these models are successful in explaining the microscopic origin of the temporal and spatial correlations of financial markets. We then present a novel paradigm combining big-data analysis with agent-based modeling. Specifically, from internet query and stock market data, we extract the information driving forces and develop an agent-based model to simulate the dynamic behaviors of complex financial systems.
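A minimal agent-based market illustrates the ingredients the review describes: heterogeneous agent preferences and a price driven by aggregate excess demand. The thresholds and coefficients are arbitrary, not calibrated to market data as in the paper.

```python
import numpy as np

# Toy agent-based market: threshold agents buy or sell when the recent return
# crosses their personal threshold; price moves with excess demand plus noise.
rng = np.random.default_rng(3)
n_agents, n_steps = 500, 1000
thresholds = rng.uniform(0.001, 0.05, n_agents)   # heterogeneous preferences

price = np.empty(n_steps); price[0] = 100.0
ret = 0.0
for tstep in range(1, n_steps):
    buyers = int((ret > thresholds).sum())         # trend-following buys
    sellers = int((ret < -thresholds).sum())       # trend-following sells
    demand = (buyers - sellers) / n_agents + rng.normal(0.0, 0.1)
    ret = 0.02 * demand                            # price impact of excess demand
    price[tstep] = price[tstep - 1] * (1.0 + ret)

returns = np.diff(np.log(price))
```

Calibrating `thresholds` and the impact coefficient from empirical data, rather than fixing them as above, is precisely the recent development the abstract highlights.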
Comparative study on gene set and pathway topology-based enrichment methods.
Bayerlová, Michaela; Jung, Klaus; Kramer, Frank; Klemm, Florian; Bleckmann, Annalen; Beißbarth, Tim
2015-10-22
Enrichment analysis is a popular approach to identify pathways or sets of genes that are significantly enriched in the context of differentially expressed genes. The traditional gene set enrichment approach considers a pathway as a simple gene list, disregarding any knowledge of gene or protein interactions. In contrast, the newer group of so-called pathway topology-based methods integrates the topological structure of a pathway into the analysis. We comparatively investigated gene set and pathway topology-based enrichment approaches, considering three gene set and four topological methods. These methods were compared in two extensive simulation studies and on a benchmark of 36 real datasets, providing the same pathway input data for all methods. In the benchmark data analysis, both types of methods showed a comparable ability to detect enriched pathways. The first simulation study was conducted with KEGG pathways, which showed considerable gene overlaps with each other. In this study with original KEGG pathways, none of the topology-based methods outperformed the gene set approach. Therefore, a second simulation study was performed on non-overlapping pathways created from unique gene IDs. Here, methods accounting for pathway topology reached higher accuracy than the gene set methods; however, their sensitivity was lower. We conducted one of the first comprehensive comparative studies evaluating gene set against pathway topology-based enrichment methods. The topological methods showed better performance in the simulation scenarios with non-overlapping pathways; however, they were not conclusively better in the other scenarios. This suggests that the simple gene set approach might be sufficient to detect an enriched pathway under realistic circumstances. Nevertheless, more extensive studies and further benchmark data are needed to systematically evaluate these methods and to assess what gains and costs pathway topology information introduces into enrichment analysis.
Both types of methods for enrichment analysis require further improvements in order to deal with the problem of pathway overlaps.
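The "simple gene list" approach the study compares against is, in its most common form, a hypergeometric over-representation test, which fits in a few lines (the numbers in the worked example are invented):

```python
from math import comb

# Classic gene-set over-representation test: with N genes in the universe,
# K of them in the pathway, n differentially expressed, and k of those n in
# the pathway, compute the hypergeometric tail probability P(X >= k).
def enrichment_pvalue(N, K, n, k):
    denom = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / denom

p = enrichment_pvalue(N=20, K=5, n=10, k=4)   # tiny worked example
```

Topology-based methods replace this single count with statistics that weight genes by their position and connectivity in the pathway graph.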
NASA Astrophysics Data System (ADS)
Diehl, Martin; Groeber, Michael; Haase, Christian; Molodov, Dmitri A.; Roters, Franz; Raabe, Dierk
2017-05-01
Predicting, understanding, and controlling the mechanical behavior is the most important task when designing structural materials. Modern alloy systems, in which multiple deformation mechanisms, phases, and defects are introduced to overcome the inverse strength-ductility relationship, give rise to multiple possibilities for modifying the deformation behavior, rendering traditional, exclusively experiment-based alloy development workflows inadequate. For fast and efficient alloy design, it is therefore desirable to predict the mechanical performance of candidate alloys through simulation studies, replacing time- and resource-consuming mechanical tests. Simulation tools suitable for this task need to correctly predict the mechanical behavior as a function of alloy composition, microstructure, texture, phase fractions, and processing history. Here, an integrated computational materials engineering approach based on the open source software packages DREAM.3D and DAMASK (Düsseldorf Advanced Materials Simulation Kit) that enables such virtual material development is presented. More specifically, our approach consists of the following three steps: (1) acquire statistical quantities that describe a microstructure, (2) build a representative volume element based on these quantities employing DREAM.3D, and (3) evaluate the representative volume element using a predictive crystal plasticity material model provided by DAMASK. As an example, these steps are conducted for a high-manganese steel.
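Step (2) of the workflow can be shown in miniature: build a voxelized representative volume element by nearest-seed (Voronoi) growth from random grain centers. DREAM.3D does this far more elaborately, matching real microstructure statistics; everything below is illustrative.

```python
import numpy as np

# Toy representative volume element: Voronoi grain assignment on a voxel grid.
rng = np.random.default_rng(5)
n_grains, res = 12, 24
seeds = rng.uniform(0, 1, size=(n_grains, 3))                 # grain centres
orientations = rng.uniform(0, 2 * np.pi, size=(n_grains, 3))  # mock Euler angles

grid = np.stack(np.meshgrid(*[np.linspace(0, 1, res)] * 3, indexing="ij"), axis=-1)
# assign every voxel to its nearest seed -> grain ID field
dist = np.linalg.norm(grid[..., None, :] - seeds, axis=-1)
grain_id = dist.argmin(axis=-1)

fractions = np.bincount(grain_id.ravel(), minlength=n_grains) / grain_id.size
```

In the real pipeline, `grain_id` plus per-grain `orientations` would be exported to DAMASK as the geometry and material configuration for the crystal plasticity solve.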
ERIC Educational Resources Information Center
Keskitalo, Tuulikki
2011-01-01
This research article focuses on virtual reality (VR) and simulation-based training, with a special focus on the pedagogical use of the Virtual Centre of Wellness Campus known as ENVI (Rovaniemi, Finland). In order to clearly understand how teachers perceive teaching and learning in such environments, this research examines the concepts of…
An Evaluation of Gender Differences in Computer-Based Case Simulations.
ERIC Educational Resources Information Center
Scheuneman, Janice Dowd; And Others
As part of the research leading to the implementation of computer-based case simulations (CCS) for the licensing examinations of the National Board of Medical Examiners, gender differences in performance were studied for one form consisting of 18 cases. A secondary purpose of the study was to note differences in style or approach that might…
Evaluating crown fire rate of spread predictions from physics-based models
C. M. Hoffman; J. Ziegler; J. Canfield; R. R. Linn; W. Mell; C. H. Sieg; F. Pimont
2015-01-01
Modeling the behavior of crown fires is challenging due to the complex set of coupled processes that drive the characteristics of a spreading wildfire and the large range of spatial and temporal scales over which these processes occur. Detailed physics-based modeling approaches such as FIRETEC and the Wildland Urban Interface Fire Dynamics Simulator (WFDS) simulate...
Research-Based Design and Development of a Simulation of Liquid-Vapor Equilibrium
ERIC Educational Resources Information Center
Akaygun, Sevil; Jones, Loretta L.
2013-01-01
Helping learners to visualize the structures and dynamics of particles through the use of technology is challenging. Animations and simulations can be difficult for learners to interpret and can even lead to new misconceptions. A systematic approach to development based on the findings of cognitive science was used to design, develop, and evaluate…
Eco-Logic: Logic-Based Approaches to Ecological Modelling
Daniel L. Schmoldt
1991-01-01
This paper summarizes the simulation research carried out during 1984-1989 at the University of Edinburgh. Two primary objectives of their research are 1) to provide tools for manipulating simulation models (i.e., implementation tools) and 2) to provide advice on conceptualizing real-world phenomena into an idealized representation for simulation (i.e., model design...
ERIC Educational Resources Information Center
Kapralos, Bill; Hogan, Michelle; Pribetic, Antonin I.; Dubrowski, Adam
2011-01-01
Purpose: Gaming and interactive virtual simulation environments support a learner-centered educational model allowing learners to work through problems acquiring knowledge through an active, experiential learning approach. To develop effective virtual simulations and serious games, the views and perceptions of learners and educators must be…
ERIC Educational Resources Information Center
Romeu, Jorge Luis
2008-01-01
This article discusses our teaching approach in graduate level Engineering Statistics. It is based on the use of modern technology, learning groups, contextual projects, simulation models, and statistical and simulation software to entice student motivation. The use of technology to facilitate group projects and presentations, and to generate,…
Gordon, James A
2012-01-01
Technology-enhanced patient simulation has emerged as an important new modality for teaching and learning in medicine. In particular, immersive simulation platforms that replicate the clinical environment promise to revolutionize medical education by enabling an enhanced level of safety, standardization, and efficiency across health-care training. Such an experiential approach seems unique in reliably catalyzing a level of emotional engagement that fosters immediate and indelible learning and allows for increasingly reliable levels of performance evaluation-all in a completely risk-free environment. As such, medical simulation is poised to emerge as a critical component of training and certification throughout health care, promising to fundamentally enhance quality and safety across disciplines. To encourage routine simulation-based practice as part of its core quality and safety mission, Massachusetts General Hospital now incorporates simulation resources within its historic medical library (est. 1847), located at the center of the campus. In this new model, learners go to the library not only to read about a patient's illness, but also to take care of their "patient." Such an approach redefines and advances the central role of the library on the campus and ensures that simulation-based practice is centrally available as part of everyday hospital operations. This article describes the reasons for identifying simulation as an institutional priority leading up to the Massachusetts General Hospital Bicentennial Celebration (1811-2011) and for creating a simulation-based learning laboratory within a hospital library.
Translating the simulation of procedural drilling techniques for interactive neurosurgical training.
Stredney, Don; Rezai, Ali R; Prevedello, Daniel M; Elder, J Bradley; Kerwin, Thomas; Hittle, Bradley; Wiet, Gregory J
2013-10-01
Through previous efforts we have developed a fully virtual environment to provide procedural training in otologic surgical technique. The virtual environment is based on high-resolution volumetric data of the regional anatomy. These volumetric data drive an interactive multisensory, i.e., visual (stereo), aural (stereo), and tactile, simulation environment. Subsequently, we have extended our efforts to support the training of neurosurgical procedural technique as part of the Congress of Neurological Surgeons simulation initiative. Our objective was to deliberately study the integration of simulation technologies into the neurosurgical curriculum and to determine their efficacy in teaching minimally invasive cranial and skull base approaches. We discuss issues of biofidelity and our methods for providing objective, quantitative, and automated assessment for residents. We conclude with a discussion of our experiences, reporting preliminary formative pilot studies and proposed approaches to take the simulation to the next level through additional validation studies. We have presented our efforts to translate an otologic simulation environment for use in the neurosurgical curriculum, demonstrated the initial proof of principle, and defined the steps to integrate and validate the system as an adjunct to the neurosurgical curriculum.
Baumketner, Andrij
2009-01-01
The performance of reaction-field methods for treating electrostatic interactions is tested in simulations of ions solvated in water. The potentials of mean force between a sodium chloride ion pair and between the side chains of lysine and aspartate are computed using umbrella sampling and molecular dynamics simulations. It is found that, in comparison with lattice sum calculations, the charge-group-based approaches to reaction-field treatments produce a large error in the association energy of the ions that exhibits a strong systematic dependence on the size of the simulation box. The atom-based implementation of the reaction field is seen to (i) improve the overall quality of the potential of mean force and (ii) remove the dependence on the size of the simulation box. It is suggested that atom-based truncation be used in reaction-field simulations of mixed media. PMID:19292522
Wavelet-based surrogate time series for multiscale simulation of heterogeneous catalysis
Savara, Aditya Ashi; Daw, C. Stuart; Xiong, Qingang; ...
2016-01-28
We propose a wavelet-based scheme that encodes the essential dynamics of discrete microscale surface reactions in a form that can be coupled with continuum macroscale flow simulations with high computational efficiency. This makes it possible to simulate the dynamic behavior of reactor-scale heterogeneous catalysis without requiring detailed concurrent simulations at both the surface and continuum scales using different models. Our scheme is based on the application of wavelet-based surrogate time series that encode the essential temporal and/or spatial fine-scale dynamics at the catalyst surface. The encoded dynamics are then used to generate statistically equivalent, randomized surrogate time series, which can be linked to the continuum scale simulation. We illustrate an application of this approach using two different kinetic Monte Carlo simulations with different characteristic behaviors typical for heterogeneous chemical reactions.
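The surrogate idea can be sketched with a hand-rolled Haar transform: decompose the signal, shuffle the detail coefficients within each level, and invert. The surrogate preserves the mean and the level-wise (hence total) energy while randomizing fine-scale ordering. The paper's scheme is richer; this only illustrates the principle.

```python
import numpy as np

# Wavelet-based surrogate series via an orthonormal Haar transform.
rng = np.random.default_rng(11)

def haar_forward(x):
    """Full Haar decomposition: returns coarsest approximation + detail levels."""
    coeffs, approx = [], np.asarray(x, float)
    while len(approx) > 1:
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)
        coeffs.append(d); approx = a
    return approx, coeffs

def haar_inverse(approx, coeffs):
    a = approx
    for d in reversed(coeffs):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

signal = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * rng.normal(size=256)
approx, details = haar_forward(signal)
# shuffle detail coefficients within each level -> statistically similar surrogate
surrogate = haar_inverse(approx, [rng.permutation(d) for d in details])
```

Because the Haar transform is orthonormal and the shuffle only permutes coefficients within a level, the surrogate's total energy matches the original exactly (up to floating-point error).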
Kroshl, William M; Sarkani, Shahram; Mazzuchi, Thomas A
2015-09-01
This article presents ongoing research that focuses on the efficient allocation of defense resources to minimize the damage inflicted on a spatially distributed physical network, such as a pipeline, water system, or power distribution system, by an active adversary. It recognizes the fundamental difference between preparing for natural disasters, such as hurricanes, earthquakes, or accidental system failures, and allocating resources to defend against an opponent who is aware of, and anticipating, the defender's efforts to mitigate the threat. Our approach is to utilize a combination of integer programming and agent-based modeling to allocate the defensive resources. We conceptualize the problem as a Stackelberg "leader-follower" game in which the defender first places his assets to defend key areas of the network, and the attacker then seeks to inflict the maximum damage possible within the constraints of resources and network structure. The criticality of arcs in the network is estimated by a deterministic network interdiction formulation, which then informs an evolutionary agent-based simulation. The evolutionary agent-based simulation is used to determine the allocation of resources for attackers and defenders that results in evolutionarily stable strategies, where actions by either side alone cannot increase its share of victories. We demonstrate these techniques on an example network, comparing the evolutionary agent-based results to a more traditional probabilistic risk analysis (PRA) approach. Our results show that the agent-based approach results in a greater percentage of defender victories than does the PRA-based approach. © 2015 Society for Risk Analysis.
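The leader-follower structure can be demonstrated with a deterministic toy: the defender hardens the highest-value arcs first, and the attacker then destroys the most damaging undefended arcs within a budget. Arc values and budgets are invented; the paper's interdiction model and evolutionary simulation are far richer.

```python
# Tiny Stackelberg sketch on a network of valued arcs.
arcs = {"pump->valve": 9, "valve->tank": 7, "tank->city": 10,
        "pump->bypass": 4, "bypass->city": 5}
defender_budget, attacker_budget = 2, 2

# leader: defend the arcs whose loss would hurt most
defended = set(sorted(arcs, key=arcs.get, reverse=True)[:defender_budget])
# follower: knowing the defense, attack the best remaining arcs
targets = sorted((a for a in arcs if a not in defended),
                 key=arcs.get, reverse=True)[:attacker_budget]
damage = sum(arcs[a] for a in targets)
```

Note that greedy value-based defense is generally not optimal against a strategic follower; finding allocations that are stable under the attacker's best response is exactly what the evolutionary agent-based simulation searches for.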
Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, J.; Polly, B.; Collis, J.
2013-09-01
This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960s-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.
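Of the four methods, the output ratio approach is the simplest to sketch: scale the simulated billing series by one factor so its total matches the measured total. This is our minimal reading of "output ratio calibration", not the study's exact implementation; the data values are invented.

```python
def output_ratio_calibrate(simulated_kwh, measured_kwh):
    """Single multiplicative factor so the simulated total matches
    the measured total (a minimal output ratio calibration)."""
    ratio = sum(measured_kwh) / sum(simulated_kwh)
    return [s * ratio for s in simulated_kwh], ratio

# hypothetical monthly electricity use, kWh
simulated = [100.0, 120.0, 80.0]
measured = [110.0, 132.0, 88.0]
calibrated, ratio = output_ratio_calibrate(simulated, measured)
```

The appeal of this method is its near-zero computational cost; its weakness, as the study's framework can expose, is that one scalar cannot correct input errors that shift savings predictions for specific retrofit measures.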
The impact of peer-led simulations on student nurses.
Valler-Jones, Tracey
Simulation within nurse education has been widely accepted as an educational approach. However, it is mainly led by the facilitator, with the student maintaining a passive role in the learning. This paper describes a study that was undertaken to analyse the effectiveness of peer-led simulations in the undergraduate nursing programme. A mixed-method approach was used for the study design. The study took place in a simulation suite within a university in the Midlands. Twenty-four second-year child branch students were purposively selected to take part. Students designed and facilitated a simulation based on the care of a critically ill child. Formal assessment of the learning was collected via a structured clinical examination. Students completed an evaluation of their perceived confidence and competence levels. There was a 100% pass rate in the assessment of students' clinical competence following the simulation. Thematic analysis of the evaluation highlighted the learning achieved by the students, not only in their clinical skills but also in their personal development. The use of peer-led simulation promotes new learning and is a valuable educational approach.
A shell approach for fibrous reinforcement forming simulations
NASA Astrophysics Data System (ADS)
Liang, B.; Colmars, J.; Boisse, P.
2018-05-01
Because of the slippage between fibers, the basic assumptions of classical plate and shell theories do not hold for fiber reinforcements during forming. However, simulations of reinforcement forming use shell finite elements when wrinkle development is important. A shell formulation is proposed for the forming simulations of continuous fiber reinforcements. The large tensile stiffness leads to quasi-inextensibility in the fiber directions. The fiber bending stiffness determines the curvature of the reinforcement. The calculations of tensile and bending virtual work are based on the precise geometry of the single fiber. Simulations and experiments are compared for different reinforcements. It is shown that the proposed fibrous shell approach correctly simulates not only the deflections but also the rotations of the through-thickness material normals.
NASA Astrophysics Data System (ADS)
Dong, Feifei; Liu, Yong; Wu, Zhen; Chen, Yihui; Guo, Huaicheng
2018-07-01
Targeting nonpoint source (NPS) pollution hot spots is of vital importance for placement of best management practices (BMPs). Although physically-based watershed models have been widely used to estimate nutrient emissions, connections between nutrient abatement and compliance with water quality standards have rarely been considered in NPS hotspot ranking, which may lead to ineffective decision-making. It is critical to develop a strategy to identify priority management areas (PMAs) based on water quality response to nutrient load mitigation. A water quality constrained PMA identification framework was thereby proposed in this study, based on the simulation-optimization approach with ideal load reduction (ILR-SO). It integrates the physically-based Soil and Water Assessment Tool (SWAT) model and an optimization model under constraints of site-specific water quality standards. To our knowledge, this is the first effort to identify PMAs with simulation-based optimization. The SWAT model was established to simulate temporal and spatial nutrient loading and evaluate the effectiveness of pollution mitigation. A metamodel was trained to establish a quantitative relationship between sources and water quality. Ranking of priority areas is based on the nutrient load reduction required in each sub-watershed to satisfy water quality standards in waterbodies, calculated with a genetic algorithm (GA). The proposed approach was used to identify PMAs for diffuse total phosphorus (TP) in Lake Dianchi Watershed, one of the three most eutrophic large lakes in China. The modeling results demonstrated that 85% of diffuse TP came from 30% of the watershed area. Compared with the two conventional targeting strategies based on overland nutrient loss and instream nutrient loading, the ILR-SO model identified distinct PMAs and narrowed down the coverage of management areas.
This study addressed the urgent need to incorporate water quality response into PMA identification and showed that the ILR-SO approach is effective in guiding watershed management for aquatic ecosystem restoration.
Predictive Models for Semiconductor Device Design and Processing
NASA Technical Reports Server (NTRS)
Meyyappan, Meyya; Arnold, James O. (Technical Monitor)
1998-01-01
The device feature size continues to be on a downward trend, with a simultaneous upward trend in wafer size to 300 mm. For this reason, predictive models are needed more than ever before. At NASA Ames, a Device and Process Modeling effort has recently been initiated to address these issues. Our activities cover sub-micron device physics, process and equipment modeling, computational chemistry and materials science. This talk outlines these efforts and emphasizes the interaction among the various components. The device physics component is largely based on integrating quantum effects into device simulators. We have two parallel efforts: one based on a quantum mechanics approach, and a second based on a semiclassical hydrodynamics approach with quantum correction terms. Under the first approach, three different quantum simulators are being developed and compared: a nonequilibrium Green's function (NEGF) approach, a Wigner function approach, and a density matrix approach. In this talk, results using the various codes will be presented. Our process modeling work focuses primarily on epitaxy and etching, using first-principles models coupling reactor-level and wafer-level features. For the latter, we are using a novel approach based on Level Set theory. Sample results from this effort will also be presented.
James E. Smith; Coeli M. Hoover
2017-01-01
The carbon reports in the Fire and Fuels Extension (FFE) to the Forest Vegetation Simulator (FVS) provide two alternate approaches to carbon estimates for live trees (Rebain 2010). These are (1) the FFE biomass algorithms, which are volume-based biomass equations, and (2) the Jenkins allometric equations (Jenkins and others 2003), which are diameter based. Here, we...
Dynamic analysis of flexible mechanical systems using LATDYN
NASA Technical Reports Server (NTRS)
Wu, Shih-Chin; Chang, Che-Wei; Housner, Jerrold M.
1989-01-01
A 3-D, finite element based simulation tool for flexible multibody systems is presented. Hinge degrees of freedom are built into the equations of motion to reduce geometric constraints. The approach avoids the difficulty of selecting deformation modes for flexible components that arises when using the assumed mode method. The tool is applied to simulate a practical space structure deployment problem. Results of the examples demonstrate the capability of the code and the approach.
NASA Technical Reports Server (NTRS)
Pavel, M.
1993-01-01
This presentation outlines, in viewgraph format, a general approach to the evaluation of display system quality for aviation applications. This approach is based on the assumption that it is possible to develop a model of the display which captures most of its significant properties. The display characteristics should include spatial and temporal resolution, intensity quantizing effects, spatial sampling, delays, etc. The model must be sufficiently well specified to permit generation of stimuli that simulate the output of the display system. The first step in the evaluation of display quality is an analysis of the tasks to be performed using the display. For example, if a display is used by a pilot during a final approach, the aesthetic aspects of the display may be less relevant than its dynamic characteristics; the opposite task requirements may apply to imaging systems used for displaying navigation charts. Thus, display quality is defined with regard to one or more tasks. Given a set of relevant tasks, there are many ways to approach display evaluation, ranging from visual inspection and rapid evaluation to part-task simulation and full mission simulation. The work described is focused on two complementary approaches to rapid evaluation. The first approach is based on a model of the human visual system, which is used to predict performance on the selected tasks. This model-based evaluation permits very rapid and inexpensive evaluation of various design decisions. The second rapid evaluation approach employs specifically designed critical tests that embody many important characteristics of actual tasks; these are used in situations where a validated model is not available. These rapid evaluation tests are being implemented in a workstation environment.
NASA Astrophysics Data System (ADS)
Krishnanathan, Kirubhakaran; Anderson, Sean R.; Billings, Stephen A.; Kadirkamanathan, Visakan
2016-11-01
In this paper, we derive a system identification framework for continuous-time nonlinear systems, for the first time using a simulation-focused computational Bayesian approach. Simulation approaches to nonlinear system identification have been shown to outperform regression methods under certain conditions, such as non-persistently exciting inputs and fast-sampling. We use the approximate Bayesian computation (ABC) algorithm to perform simulation-based inference of model parameters. The framework has the following main advantages: (1) parameter distributions are intrinsically generated, giving the user a clear description of uncertainty, (2) the simulation approach avoids the difficult problem of estimating signal derivatives as is common with other continuous-time methods, and (3) as noted above, the simulation approach improves identification under conditions of non-persistently exciting inputs and fast-sampling. Term selection is performed by judging parameter significance using parameter distributions that are intrinsically generated as part of the ABC procedure. The results from a numerical example demonstrate that the method performs well in noisy scenarios, especially in comparison to competing techniques that rely on signal derivative estimation.
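The ABC machinery referenced above can be illustrated with the simplest member of the family, rejection ABC, on a toy continuous-time system; the paper's actual algorithm and model class are richer. Everything below (the decay model, prior, tolerance) is invented for illustration.

```python
import numpy as np

def abc_rejection(simulate, observed, prior_sample, n_draws, eps, rng):
    """ABC rejection: keep prior draws whose simulated output falls
    within distance eps of the observed data."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if np.linalg.norm(simulate(theta) - observed) < eps:
            accepted.append(theta)
    return np.array(accepted)

# toy continuous-time model: dx/dt = -theta * x, x(0) = 1
t = np.linspace(0.0, 2.0, 20)
simulate = lambda theta: np.exp(-theta * t)

rng = np.random.default_rng(1)
true_theta = 0.7
observed = simulate(true_theta) + 0.01 * rng.standard_normal(t.size)
posterior = abc_rejection(simulate, observed,
                          lambda r: r.uniform(0.0, 2.0), 2000, 0.1, rng)
```

Note how the accepted draws form an intrinsic description of parameter uncertainty, which is the first advantage listed in the abstract, and that no signal derivatives are ever estimated.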
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kucharik, M.; Scovazzi, Guglielmo; Shashkov, Mikhail Jurievich
Hourglassing is a well-known pathological numerical artifact affecting the robustness and accuracy of Lagrangian methods. There exist a large number of hourglass control/suppression strategies. In the community of staggered compatible Lagrangian methods, the approach of sub-zonal pressure forces is among the most widely used. However, this approach is known to add numerical strength to the solution, which can cause potential problems in certain types of simulations, for instance in simulations of various instabilities. To avoid this complication, we have adapted the multi-scale residual-based stabilization typically used in the finite element approach to the staggered compatible framework. In this study, we describe two discretizations of the new approach, demonstrate their properties, and compare them with the method of sub-zonal pressure forces on selected numerical problems.
The effects of hillslope-scale variability in burn severity on post-fire sediment delivery
NASA Astrophysics Data System (ADS)
Quinn, Dylan; Brooks, Erin; Dobre, Mariana; Lew, Roger; Robichaud, Peter; Elliot, William
2017-04-01
With the increasing frequency of wildfire and the costs associated with managing burned landscapes, there is an increasing need for decision support tools that can be used to assess the effectiveness of targeted post-fire management strategies. The susceptibility of landscapes to post-fire soil erosion and runoff has been closely linked with the severity of the wildfire. Wildfire severity maps are often spatially complex and largely dependent upon total vegetative biomass, fuel moisture patterns, direction of burn, wind patterns, and other factors. The decision to apply targeted treatment to a specific landscape, and the amount of resources dedicated to treating it, should ideally be based on the potential for excessive sediment delivery from a particular hillslope. Recent work has suggested that the delivery of sediment from a hillslope to a downstream water body is highly influenced by the distribution of wildfire severity across the hillslope, and that models that do not capture this hillslope-scale variability will not provide reliable sediment and runoff predictions. In this project we compare detailed (10 m) grid-based model predictions to lumped and semi-lumped hillslope approaches where hydrologic parameters are fixed based on hillslope-scale averaging techniques. We use the watershed-scale version of the process-based Water Erosion Prediction Project (WEPP) model and its GIS interface, GeoWEPP, to simulate fire impacts on runoff and sediment delivery using burn severity maps at a watershed scale. The flowpath option in WEPP allows the most detailed representation of wildfire severity patterns (10 m) but, depending upon the size of the watershed, the simulations are time-consuming and computationally demanding. The hillslope version is a simpler approach which assigns wildfire severity based on the severity level assigned to the majority of the hillslope area.
In the third approach, we divided hillslopes into overland flow elements (OFEs) and assigned representative input values on a finer scale within single hillslopes. Each of these approaches was compared for several large wildfires in the mountainous ranges of central Idaho, USA. Simulations indicated that predictions based on lumped hillslope modeling over-predict sediment transport by as much as 4.8x in areas of high to moderate burn severity. Annual sediment yield within the simulated watersheds ranged from 1.7 tonnes/ha to 6.8 tonnes/ha. The disparity between the sediment yields simulated with these approaches was attributed to the hydrologic connectivity of the burn patterns within the hillslope. High infiltration rates between high-severity sites can greatly reduce the delivery of sediment. This research underlines the importance of accurately representing soil burn severity along individual hillslopes in hydrologic models and the need for modeling approaches to capture this variability in order to reliably simulate soil erosion.
Studying distributed cognition of simulation-based team training with DiCoT.
Rybing, Jonas; Nilsson, Heléne; Jonson, Carl-Oscar; Bang, Magnus
2016-03-01
Health care organizations employ simulation-based team training (SBTT) to improve skill, communication and coordination in a broad range of critical care contexts. Quantitative approaches, such as team performance measurements, are predominantly used to measure the effectiveness of SBTT. However, a practical evaluation method that examines how this approach supports cognition and teamwork is missing. We have applied Distributed Cognition for Teamwork (DiCoT), a method for analysing the cognition and collaboration aspects of work settings, with the purpose of assessing the methodology's usefulness for evaluating SBTTs. In a case study, we observed and analysed four Emergo Train System® simulation exercises where medical professionals trained emergency response routines. The study suggests that DiCoT is an applicable and learnable tool for determining key distributed cognition attributes of SBTTs that are of importance for the simulation validity of training environments. Moreover, we discuss and exemplify how DiCoT supports the design of SBTTs with a focus on transfer and validity characteristics. Practitioner Summary: In this study, we have evaluated a method to assess simulation-based team training environments from a cognitive ergonomics perspective. Using a case study, we analysed Distributed Cognition for Teamwork (DiCoT) by applying it to the Emergo Train System®. We conclude that DiCoT is useful for SBTT evaluation and simulator (re)design.
Intercomparison of 3D pore-scale flow and solute transport simulation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.
2016-09-01
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include methods that 1) explicitly model the three-dimensional geometry of pore spaces and 2) those that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of class 1, based on direct numerical simulation using computational fluid dynamics (CFD) codes, against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of class 1 based on the immersed-boundary method (IMB), lattice Boltzmann method (LBM), and smoothed particle hydrodynamics (SPH), as well as a model of class 2 (a pore-network model or PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and nonreactive solute transport, and intercompare the model results with previously reported experimental observations. Experimental observations are limited to measured pore-scale velocities, so solute transport comparisons are made only among the various models. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations).
ERIC Educational Resources Information Center
Figari, Francesco; Iacovou, Maria; Skew, Alexandra J.; Sutherland, Holly
2012-01-01
In this paper, we evaluate income distributions in four European countries (Austria, Italy, Spain and Hungary) using two complementary approaches: a standard approach based on reported incomes in survey data, and a microsimulation approach, where taxes and benefits are simulated. These two approaches may be expected to generate slightly different…
Yakimov, Eugene B
2016-06-01
An approach for predicting the output parameters of a (63)Ni-based betavoltaic battery is described. It consists of a multilayer Monte Carlo simulation to obtain the depth dependence of the excess carrier generation rate inside the semiconductor converter, a determination of the collection probability based on electron beam induced current measurements, a calculation of the current induced in the semiconductor converter by beta radiation, and SEM measurements of the output parameters using the calculated induced current value. This approach makes it possible to predict the betavoltaic battery parameters and optimize the converter design for any real semiconductor structure and any thickness and specific activity of the beta radiation source. Copyright © 2016 Elsevier Ltd. All rights reserved.
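The third step, computing the induced current from the depth-resolved generation rate weighted by the collection probability, reduces to a one-dimensional integral, i = q A ∫ G(z) CP(z) dz. The sketch below uses invented exponential profiles, not measured (63)Ni or converter data.

```python
import numpy as np

Q_E = 1.602e-19  # elementary charge, C

def induced_current(z_cm, gen_rate, collection_prob, area_cm2):
    """Trapezoid rule over depth: current = q * A * integral of
    generation rate (pairs/cm^3/s) times collection probability."""
    y = gen_rate * collection_prob
    integral = 0.5 * np.sum((y[1:] + y[:-1]) * np.diff(z_cm))
    return Q_E * area_cm2 * integral

# illustrative profiles: generation decays with depth; collection
# probability falls off away from the junction
z = np.linspace(0.0, 2e-4, 400)      # depth in cm (0 to 2 um)
gen = 1e18 * np.exp(-z / 5e-5)       # e-h pairs / cm^3 / s
cp = np.exp(-z / 1e-4)               # dimensionless collection probability
i_sc = induced_current(z, gen, cp, area_cm2=1.0)
```

In the paper's workflow, gen would come from the multilayer Monte Carlo simulation and cp from the EBIC measurements; here both are placeholders.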
Stochastic Earthquake Rupture Modeling Using Nonparametric Co-Regionalization
NASA Astrophysics Data System (ADS)
Lee, Kyungbook; Song, Seok Goo
2017-09-01
Accurate predictions of the intensity and variability of ground motions are essential in simulation-based seismic hazard assessment. Advanced simulation-based ground motion prediction methods have been proposed to complement the empirical approach, which suffers from the lack of observed ground motion data, especially in the near-source region for large events. It is important to quantify the variability of the earthquake rupture process for future events and to produce a number of rupture scenario models to capture the variability in simulation-based ground motion predictions. In this study, we improved the previously developed stochastic earthquake rupture modeling method by applying the nonparametric co-regionalization, which was proposed in geostatistics, to the correlation models estimated from dynamically derived earthquake rupture models. The nonparametric approach adopted in this study is computationally efficient and, therefore, enables us to simulate numerous rupture scenarios, including large events (M > 7.0). It also gives us an opportunity to check the shape of true input correlation models in stochastic modeling after being deformed for permissibility. We expect that this type of modeling will improve our ability to simulate a wide range of rupture scenario models and thereby predict ground motions and perform seismic hazard assessment more accurately.
Multi-dimensional simulation package for ultrashort pulse laser-matter interactions
NASA Astrophysics Data System (ADS)
Suslova, Anastassiya; Hassanein, Ahmed
2017-10-01
Advanced simulation models have recently become a popular tool for the investigation of ultrashort pulse lasers (USPLs), enhancing understanding of the physics and minimizing the experimental costs of optimizing laser and target parameters for various applications. Our research interest is focused on developing the multi-dimensional simulation package FEMTO-2D to investigate USPL-matter interactions and laser-induced effects. The package is based on the solution of two heat conduction equations for the electron and lattice sub-systems, an enhanced two temperature model (TTM). We have implemented a theoretical approach based on collision theory to define the thermal dependence of the target material's optical properties and thermodynamic parameters. Our approach allowed the elimination of fitted parameters commonly used in TTM-based simulations. FEMTO-2D is used to simulate the light absorption and interactions for several metallic targets as a function of wavelength and pulse duration over a wide range of laser intensities. The package has the capability to consider different angles of incidence and polarization. It has also been used to investigate the damage threshold of gold-coated optical components, with a focus on the role of film thickness and the substrate heat sink effect. This work was supported by the NSF, PIRE project.
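The core of any TTM solver is a pair of coupled heat equations for the electron and lattice temperatures. A spatially lumped (0-D) explicit-Euler sketch is below; the heat capacities, coupling constant, and pulse parameters are illustrative placeholders, not the collision-theory values FEMTO-2D computes.

```python
import numpy as np

def ttm_lumped(Ce, Cl, G, source, dt, nsteps, T0=300.0):
    """Explicit-Euler two temperature model, spatially lumped:
    Ce dTe/dt = -G (Te - Tl) + S(t);  Cl dTl/dt = G (Te - Tl)."""
    Te = Tl = T0
    history = []
    for n in range(nsteps):
        t = n * dt
        coupling = G * (Te - Tl)          # electron-lattice energy exchange
        Te += dt * (-coupling + source(t)) / Ce
        Tl += dt * coupling / Cl
        history.append((Te, Tl))
    return history

# Gaussian femtosecond pulse source; constants are illustrative only
pulse = lambda t: 1e21 * np.exp(-((t - 100e-15) / 30e-15) ** 2)  # W/m^3
out = ttm_lumped(Ce=2e4, Cl=2.5e6, G=2e16, source=pulse, dt=1e-16, nsteps=5000)
```

The sketch reproduces the qualitative TTM behavior: the electron bath heats sharply during the pulse and then relaxes toward the much more slowly heated lattice.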
Hallock, Michael J.; Stone, John E.; Roberts, Elijah; Fry, Corey; Luthey-Schulten, Zaida
2014-01-01
Simulation of in vivo cellular processes with the reaction-diffusion master equation (RDME) is a computationally expensive task. Our previous software enabled simulation of inhomogeneous biochemical systems for small bacteria over long time scales using the MPD-RDME method on a single GPU. Simulations of larger eukaryotic systems exceed the on-board memory capacity of individual GPUs, and long time simulations of modest-sized cells such as yeast are impractical on a single GPU. We present a new multi-GPU parallel implementation of the MPD-RDME method based on a spatial decomposition approach that supports dynamic load balancing for workstations containing GPUs of varying performance and memory capacity. We take advantage of high-performance features of CUDA for peer-to-peer GPU memory transfers and evaluate the performance of our algorithms on state-of-the-art GPU devices. We present parallel efficiency and performance results for simulations using multiple GPUs as system size, particle counts, and number of reactions grow. We also demonstrate multi-GPU performance in simulations of the Min protein system in E. coli. Moreover, our multi-GPU decomposition and load balancing approach can be generalized to other lattice-based problems. PMID:24882911
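The load balancing described (GPUs of varying performance and memory capacity) can be sketched as a weighted one-dimensional slab decomposition of the simulation lattice; the function and the device weights below are our own illustration, not the authors' scheme.

```python
def partition_lattice(nz, gpus):
    """Split nz lattice planes across GPUs in proportion to a
    per-device performance weight (illustrative load balancing)."""
    total = sum(w for _, w in gpus)
    slabs, start = [], 0
    for i, (name, w) in enumerate(gpus):
        # last device absorbs rounding so every plane is assigned
        size = round(nz * w / total) if i < len(gpus) - 1 else nz - start
        slabs.append((name, start, start + size))
        start += size
    return slabs

# a fast and a slow GPU sharing a 96-plane lattice
slabs = partition_lattice(96, [("gpu0", 2.0), ("gpu1", 1.0)])
```

In the real implementation the slab boundaries would be revisited at run time (dynamic load balancing) and halo planes exchanged via CUDA peer-to-peer transfers.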
MCNPX simulation of proton dose distribution in homogeneous and CT phantoms
NASA Astrophysics Data System (ADS)
Lee, C. C.; Lee, Y. J.; Tung, C. J.; Cheng, H. W.; Chao, T. C.
2014-02-01
A dose simulation system was constructed based on the MCNPX Monte Carlo package to simulate proton dose distributions in homogeneous and CT phantoms. Conversion from the Hounsfield units of a patient CT image set to the material information necessary for Monte Carlo simulation is based on Schneider's approach. To validate this simulation system, an inter-comparison of depth dose distributions obtained from the MCNPX, GEANT4 and FLUKA codes was performed for a 160 MeV monoenergetic proton beam incident normally on the surface of a homogeneous water phantom. For dose validation within the CT phantom, direct comparison with measurement is infeasible. Instead, this study took the approach of indirectly comparing the 50% ranges (R50%) along the central axis computed by our system with the NIST CSDA ranges for beams with 160 and 115 MeV energies. The comparison within the homogeneous phantom shows good agreement: differences in the simulated R50% among the three codes are less than 1 mm. Within the CT phantom, the MCNPX-simulated water equivalent Req,50% are compatible with the CSDA water equivalent ranges from the NIST database, with differences of 0.7 and 4.1 mm for the 160 and 115 MeV beams, respectively.
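A Hounsfield-unit-to-material conversion of the kind referenced is in essence a piecewise lookup from HU to a material label and mass density. The bin edges and density ramps below are illustrative placeholders, not Schneider's published fit.

```python
def hu_to_material(hu):
    """Map a CT Hounsfield unit to (material, mass density in g/cm^3).
    Bin edges and density ramps are illustrative, not Schneider's fit."""
    if hu < -950:
        return "air", 0.001
    if hu < -100:
        return "lung", 0.30
    if hu < 100:
        # soft tissue: density rises roughly linearly with HU near water
        return "soft tissue", 1.00 + hu * 1e-3
    return "bone", 1.10 + (hu - 100) * 5e-4
```

A per-voxel pass with a mapping like this is what turns a CT image set into the material/density grid a Monte Carlo transport code can consume.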
Intercomparison of 3D pore-scale flow and solute transport simulation methods
Mehmani, Yashar; Schoenherr, Martin; Pasquali, Andrea; ...
2015-09-28
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include 1) methods that explicitly model the three-dimensional geometry of pore spaces and 2) methods that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of the first type, using computational fluid dynamics (CFD) codes employing a standard finite volume method (FVM), against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of the first type based on the lattice Boltzmann method (LBM) and smoothed particle hydrodynamics (SPH), as well as a model of the second type, a pore-network model (PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (FVM-based CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and (for capable codes) nonreactive solute transport, and intercompare the model results. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations). Generally good agreement was achieved among the various approaches, but some differences were observed depending on the model context. The intercomparison work was challenging because of variable capabilities of the codes, and inspired some code enhancements to allow consistent comparison of flow and transport simulations across the full suite of methods.
This paper provides support for confidence in a variety of pore-scale modeling methods and motivates further development and application of pore-scale simulation methods.« less
Intercomparison of 3D pore-scale flow and solute transport simulation methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xiaofan; Mehmani, Yashar; Perkins, William A.
2016-09-01
Multiple numerical approaches have been developed to simulate porous media fluid flow and solute transport at the pore scale. These include 1) methods that explicitly model the three-dimensional geometry of pore spaces and 2) methods that conceptualize the pore space as a topologically consistent set of stylized pore bodies and pore throats. In previous work we validated a model of the first type, using computational fluid dynamics (CFD) codes employing a standard finite volume method (FVM), against magnetic resonance velocimetry (MRV) measurements of pore-scale velocities. Here we expand that validation to include additional models of the first type based on the lattice Boltzmann method (LBM) and smoothed particle hydrodynamics (SPH), as well as a model of the second type, a pore-network model (PNM). The PNM approach used in the current study was recently improved and demonstrated to accurately simulate solute transport in a two-dimensional experiment. While the PNM approach is computationally much less demanding than direct numerical simulation methods, the effect of conceptualizing complex three-dimensional pore geometries on solute transport in the manner of PNMs has not been fully determined. We apply all four approaches (FVM-based CFD, LBM, SPH and PNM) to simulate pore-scale velocity distributions and (for capable codes) nonreactive solute transport, and intercompare the model results. Comparisons are drawn both in terms of macroscopic variables (e.g., permeability, solute breakthrough curves) and microscopic variables (e.g., local velocities and concentrations). Generally good agreement was achieved among the various approaches, but some differences were observed depending on the model context. The intercomparison work was challenging because of variable capabilities of the codes, and inspired some code enhancements to allow consistent comparison of flow and transport simulations across the full suite of methods.
This study provides support for confidence in a variety of pore-scale modeling methods and motivates further development and application of pore-scale simulation methods.
NASA Technical Reports Server (NTRS)
Erickson, J. D.; Eckelkamp, R. E.; Barta, D. J.; Dragg, J.; Henninger, D. L. (Principal Investigator)
1996-01-01
This paper examines mission simulation as an approach to develop requirements for automation and robotics for Advanced Life Support Systems (ALSS). The focus is on requirements and applications for command and control, control and monitoring, situation assessment and response, diagnosis and recovery, adaptive planning and scheduling, and other automation applications in addition to mechanized equipment and robotics applications to reduce the excessive human labor requirements to operate and maintain an ALSS. Based on principles of systems engineering, an approach is proposed to assess requirements for automation and robotics using mission simulation tools. First, the story of a simulated mission is defined in terms of processes with attendant types of resources needed, including options for use of automation and robotic systems. Next, systems dynamics models are used in simulation to reveal the implications for selected resource allocation schemes in terms of resources required to complete operational tasks. The simulations not only help establish ALSS design criteria, but also may offer guidance to ALSS research efforts by identifying gaps in knowledge about procedures and/or biophysical processes. Simulations of a planned one-year mission with 4 crewmembers in a Human Rated Test Facility are presented as an approach to evaluation of mission feasibility and definition of automation and robotics requirements.
Comparison of an Agent-based Model of Disease Propagation with the Generalised SIR Epidemic Model
2009-08-01
has become a practical method for conducting Epidemiological Modelling. In the agent-based approach the whole township can be modelled as a system of...SIR system was initially developed based on a very simplified model of social interaction. For instance an assumption of uniform population mixing was...simulating the progress of a disease within a host and of transmission between hosts is based upon Transportation Analysis and Simulation System
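The generalised SIR model against which the agent-based simulation above is compared rests on the uniform-mixing differential equations. A minimal forward-Euler sketch of the classic SIR system (population size, rates, and step size here are illustrative assumptions, not values from the report):

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One forward-Euler step of the classic SIR equations,
    using the uniform-mixing assumption noted in the abstract."""
    n = s + i + r
    new_inf = beta * s * i / n * dt   # S -> I transitions
    new_rec = gamma * i * dt          # I -> R transitions
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def simulate(s0, i0, r0, beta, gamma, days, dt=0.1):
    s, i, r = float(s0), float(i0), float(r0)
    peak = i
    for _ in range(int(days / dt)):
        s, i, r = sir_step(s, i, r, beta, gamma, dt)
        peak = max(peak, i)
    return s, i, r, peak

# Assumed example: 10 initial infections in a township of 10,000, R0 = 3
s, i, r, peak = simulate(9990, 10, 0, beta=0.3, gamma=0.1, days=200)
```

Because each step only moves individuals between compartments, the total population is conserved exactly, which is a useful sanity check when comparing against an agent-based run.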
Large perturbation flow field analysis and simulation for supersonic inlets
NASA Technical Reports Server (NTRS)
Varner, M. O.; Martindale, W. R.; Phares, W. J.; Kneile, K. R.; Adams, J. C., Jr.
1984-01-01
An analysis technique for simulation of supersonic mixed compression inlets with large flow field perturbations is presented. The approach is based upon a quasi-one-dimensional inviscid unsteady formulation which includes engineering models of unstart/restart, bleed, bypass, and geometry effects. Numerical solution of the governing time dependent equations of motion is accomplished through a shock capturing finite difference algorithm, of which five separate approaches are evaluated. Comparison with experimental supersonic wind tunnel data is presented to verify the present approach for a wide range of transient inlet flow conditions.
Vibronic coupling simulations for linear and nonlinear optical processes: Theory
NASA Astrophysics Data System (ADS)
Silverstein, Daniel W.; Jensen, Lasse
2012-02-01
A comprehensive vibronic coupling model based on the time-dependent wavepacket approach is derived to simulate linear optical processes, such as one-photon absorbance and resonance Raman scattering, and nonlinear optical processes, such as two-photon absorbance and resonance hyper-Raman scattering. This approach is particularly well suited for combination with first-principles calculations. Expressions for the Franck-Condon terms, and non-Condon effects via the Herzberg-Teller coupling approach in the independent-mode displaced harmonic oscillator model are presented. The significance of each contribution to the different spectral types is discussed briefly.
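In the independent-mode displaced harmonic oscillator model named in the abstract, the zero-temperature Franck-Condon progression of a single mode reduces to a Poisson distribution in the Huang-Rhys factor S. A small sketch (the function name and the value S = 1.0 are assumed examples, not the authors' implementation):

```python
from math import exp, factorial

def franck_condon_factor(S, n):
    """|<0|n>|^2 for a single displaced harmonic mode at T = 0:
    a Poisson distribution governed by the Huang-Rhys factor S."""
    return exp(-S) * S ** n / factorial(n)

# Vibronic progression for a moderately displaced mode (assumed S = 1.0)
progression = [franck_condon_factor(1.0, n) for n in range(10)]
```

For S = 1 the 0-0 and 0-1 lines have equal intensity, and the factors sum to unity over a complete vibrational basis.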
Drag and drop simulation: from pictures to full three-dimensional simulations
NASA Astrophysics Data System (ADS)
Bergmann, Michel; Iollo, Angelo
2014-11-01
We present a suite of methods to achieve "drag and drop" simulation, i.e., to fully automate the process of performing three-dimensional flow simulations around bodies defined by actual images of moving objects. The overall approach requires skeleton graph generation to obtain a level set function from pictures, optimal transportation to obtain the body velocity on the surface, and then flow simulation using a Cartesian method based on penalization. We illustrate this paradigm by simulating the swimming of a mackerel fish.
Computational design and multiscale modeling of a nanoactuator using DNA actuation.
Hamdi, Mustapha
2009-12-02
Developments in the field of nanobiodevices coupling nanostructures and biological components are of great interest in medical nanorobotics. As the fundamentals of bio/non-bio interaction processes are still poorly understood in the design of these devices, design tools and multiscale dynamics modeling approaches are necessary at the fabrication pre-project stage. This paper proposes a new concept of optimized carbon nanotube based servomotor design for drug delivery and biomolecular transport applications. The design of an encapsulated DNA-multi-walled carbon nanotube actuator is prototyped using multiscale modeling. The system is parametrized by using a quantum level approach and characterized by using a molecular dynamics simulation. Based on the analysis of the simulation results, a servo nanoactuator using ionic current feedback is simulated and analyzed for application as a drug delivery carrier.
NASA Astrophysics Data System (ADS)
Eivazy, Hesameddin; Esmaieli, Kamran; Jean, Raynald
2017-12-01
An accurate characterization and modelling of rock mass geomechanical heterogeneity can lead to more efficient mine planning and design. Using deterministic approaches and random field methods for modelling rock mass heterogeneity is known to be limited in simulating the spatial variation and spatial pattern of the geomechanical properties. Although the applications of geostatistical techniques have demonstrated improvements in modelling the heterogeneity of geomechanical properties, geostatistical estimation methods such as Kriging result in estimates of geomechanical variables that are not fully representative of field observations. This paper reports on the development of 3D models for spatial variability of rock mass geomechanical properties using geostatistical conditional simulation method based on sequential Gaussian simulation. A methodology to simulate the heterogeneity of rock mass quality based on the rock mass rating is proposed and applied to a large open-pit mine in Canada. Using geomechanical core logging data collected from the mine site, a direct and an indirect approach were used to model the spatial variability of rock mass quality. The results of the two modelling approaches were validated against collected field data. The study aims to quantify the risks of pit slope failure and provides a measure of uncertainties in spatial variability of rock mass properties in different areas of the pit.
Ghany, Ahmad; Vassanji, Karim; Kuziemsky, Craig; Keshavjee, Karim
2013-01-01
Electronic prescribing (e-prescribing) is expected to bring many benefits to Canadian healthcare, such as a reduction in errors and adverse drug reactions. As there currently is no functioning e-prescribing system in Canada that is completely electronic, we are unable to evaluate the performance of a live system. An alternative approach is to use simulation modeling for evaluation. We developed two discrete-event simulation models, one of the current handwritten prescribing system and one of a proposed e-prescribing system, to compare the performance of these two systems. We were able to compare the number of processes in each model, workflow efficiency, and the distribution of patients or prescriptions. Although we were able to compare these models to each other, using discrete-event simulation software was challenging. We were limited in the number of variables we could measure. We discovered non-linear processes and feedback loops in both models that could not be adequately represented using discrete-event simulation software. Finally, interactions between entities in both models could not be modeled using this type of software. We have come to the conclusion that a more appropriate approach to modeling both the handwritten and electronic prescribing systems would be to use a complex adaptive systems approach using agent-based modeling or systems-based modeling.
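A minimal single-server discrete-event sketch of the kind of prescribing-workflow model described above (the exponential distributions, rates, and patient count are illustrative assumptions, not the authors' parameters):

```python
import random

def run_des(n_patients, mean_interarrival, mean_service, seed=1):
    """Toy discrete-event model: patients arrive at a single prescriber
    and queue first-come-first-served; returns the mean waiting time."""
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    server_free = 0.0
    waits = []
    for a in arrivals:
        start = max(a, server_free)   # wait if the prescriber is busy
        waits.append(start - a)
        server_free = start + rng.expovariate(1.0 / mean_service)
    return sum(waits) / len(waits)

# Assumed workload: arrivals every 5 min on average, 4 min of service
avg_wait = run_des(1000, mean_interarrival=5.0, mean_service=4.0)
```

The abstract's point is that exactly this kind of linear queue cannot express feedback loops or entity interactions, which is what motivates the move to agent-based modeling.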
1D kinetic simulations of a short glow discharge in helium
NASA Astrophysics Data System (ADS)
Yuan, Chengxun; Bogdanov, E. A.; Eliseev, S. I.; Kudryavtsev, A. A.
2017-07-01
This paper presents a 1D model of a direct current glow discharge based on the solution of the kinetic Boltzmann equation in the two-term approximation. The model takes into account electron-electron Coulomb collisions; the corresponding collision integral is written in both detailed and simplified forms. The Boltzmann equation for electrons is coupled with continuity equations for ions and metastable atoms and the Poisson equation for electric potential. Simulations are carried out self-consistently for the whole length of the discharge in helium (from cathode to anode) for cases p = 1 Torr, L = 3.6 cm and p = 20 Torr, L = 1.8 mm, so that pL = 3.6 cm·Torr in both cases. It is shown that simulations based on the kinetic approach give lower values of electron temperature in plasma than fluid simulations. Peaks in the spatial differential flux corresponding to the electrons originating from superelastic collisions and Penning ionization were observed in simulations. Different approaches to taking Coulomb collisions into account give significantly different values of electron density and electron temperature in plasma. Analysis showed that using the simplified approach gives a non-zero contribution to the electron energy balance, which is comparable to energy losses in elastic and inelastic collisions and leads to significant errors, and thus is not recommended.
Towards a 3d Spatial Urban Energy Modelling Approach
NASA Astrophysics Data System (ADS)
Bahu, J.-M.; Koch, A.; Kremers, E.; Murshed, S. M.
2013-09-01
Today's need to reduce the environmental impact of energy use imposes dramatic changes on energy infrastructure and existing demand patterns (e.g. buildings) corresponding to their specific context. In addition, future energy systems are expected to integrate a considerable share of fluctuating power sources and equally a high share of distributed generation of electricity. Energy system models capable of describing such future systems and allowing the simulation of the impact of these developments thus require a spatial representation in order to reflect the local context and the boundary conditions. This paper describes two recent research approaches developed at EIFER in the fields of (a) geo-localised simulation of heat energy demand in cities based on 3D morphological data and (b) spatially explicit Agent-Based Models (ABM) for the simulation of smart grids. 3D city models were used to assess solar potential and heat energy demand of residential buildings, which enables cities to target the building refurbishment potentials. Distributed energy systems require innovative modelling techniques where individual components are represented and can interact. With this approach, several smart grid demonstrators were simulated, where heterogeneous models are spatially represented. Coupling 3D geodata with energy system ABMs holds different advantages for both approaches. On one hand, energy system models can be enhanced with high resolution data from 3D city models and their semantic relations. Furthermore, they allow for spatial analysis and visualisation of the results, with emphasis on spatial and structural correlations among the different layers (e.g. infrastructure, buildings, administrative zones) to provide an integrated approach. On the other hand, 3D models can benefit from more detailed system description of energy infrastructure, representing dynamic phenomena and high resolution models for energy use at component level.
The proposed modelling strategies conceptually and practically integrate urban spatial and energy planning approaches. The combined modelling approach that will be developed based on the described sectorial models holds the potential to represent hybrid energy systems coupling distributed generation of electricity with thermal conversion systems.
Taylor, Charles J.; Williamson, Tanja N.; Newson, Jeremy K.; Ulery, Randy L.; Nelson, Hugh L.; Cinotto, Peter J.
2012-01-01
This report describes Phase II modifications made to the Water Availability Tool for Environmental Resources (WATER), which applies the process-based TOPMODEL approach to simulate or predict stream discharge in surface basins in the Commonwealth of Kentucky. The previous (Phase I) version of WATER did not provide a means of identifying sinkhole catchments or accounting for the effects of karst (internal) drainage in a TOPMODEL-simulated basin. In the Phase II version of WATER, sinkhole catchments are automatically identified and delineated as internally drained subbasins, and a modified TOPMODEL approach (called the sinkhole drainage process, or SDP-TOPMODEL) is applied that calculates mean daily discharges for the basin based on summed area-weighted contributions from sinkhole drainage (SD) areas and non-karstic topographically drained (TD) areas. Results obtained using the SDP-TOPMODEL approach were evaluated for 12 karst test basins located in each of the major karst terrains in Kentucky. Visual comparison of simulated hydrographs and flow-duration curves, along with statistical measures applied to the simulated discharge data (bias, correlation, root mean square error, and Nash-Sutcliffe efficiency coefficients), indicates that the SDP-TOPMODEL approach provides acceptably accurate estimates of discharge for most flow conditions and typically provides more accurate simulation of stream discharge in karstic basins compared to the standard TOPMODEL approach. Additional programming modifications made to the Phase II version of WATER included implementation of a point-and-click graphical user interface (GUI), which fully automates the delineation of simulation-basin boundaries and improves the speed of input-data processing. The Phase II version of WATER enables the user to select a pour point anywhere on a stream reach of interest, and the program will automatically delineate all upstream areas that contribute drainage to that point.
This capability enables automatic delineation of a simulation basin of any size (area) and any level of stream-network complexity. WATER then automatically identifies the presence of sinkhole catchments within the simulation basin boundaries; extracts and compiles the necessary climatic, topographic, and basin-characteristics datasets; and runs the SDP-TOPMODEL approach to estimate daily mean discharges (streamflow).
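The SD/TD combination described above is a simple area-weighted mixing rule. A sketch of that weighting step only (the function name, units, and the 30/70 area split are assumptions for illustration):

```python
def basin_discharge(q_sd, area_sd, q_td, area_td):
    """Area-weighted mean daily discharge combining sinkhole-drained (SD)
    and topographically drained (TD) subareas, in the spirit of
    SDP-TOPMODEL's summed contributions (illustrative sketch only)."""
    total_area = area_sd + area_td
    return (q_sd * area_sd + q_td * area_td) / total_area

# Assumed example: 30% of the basin drains internally through sinkholes
q = basin_discharge(q_sd=0.8, area_sd=30.0, q_td=1.2, area_td=70.0)
```

With these assumed values the basin discharge lands between the two subarea values, weighted toward the larger TD area (1.08 in the units of the inputs).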
P-8A Poseidon strategy for modeling & simulation verification validation & accreditation (VV&A)
NASA Astrophysics Data System (ADS)
Kropp, Derek L.
2009-05-01
One of the first challenges in addressing the need for Modeling & Simulation (M&S) Verification, Validation, & Accreditation (VV&A) is to develop an approach for applying structured and formalized VV&A processes. The P-8A Poseidon Multi-Mission Maritime Aircraft (MMA) Program Modeling and Simulation Accreditation Strategy documents the P-8A program's approach to VV&A. The P-8A strategy tailors a risk-based approach and leverages existing bodies of knowledge, such as the Defense Modeling and Simulation Office Recommended Practice Guide (DMSO RPG), to make the process practical and efficient. As the program progresses, the M&S team must continue to look for ways to streamline the process, add supplemental steps to enhance the process, and identify and overcome procedural, organizational, and cultural challenges. This paper includes some of the basics of the overall strategy, examples of specific approaches that have worked well, and examples of challenges that the M&S team has faced.
A taxonomy-based approach to shed light on the babel of mathematical analogies for rice simulation
USDA-ARS?s Scientific Manuscript database
For most biophysical domains, different models are available and the extent to which their structures differ with respect to differences in outputs was never quantified. We use a taxonomy-based approach to address the question with thirteen rice models. Classification keys and binary attributes for ...
Palmer, T. N.
2014-01-01
This paper sets out a new methodological approach to solving the equations for simulating and predicting weather and climate. In this approach, the conventionally hard boundary between the dynamical core and the sub-grid parametrizations is blurred. This approach is motivated by the relatively shallow power-law spectrum for atmospheric energy on scales of hundreds of kilometres and less. It is first argued that, because of this, the closure schemes for weather and climate simulators should be based on stochastic–dynamic systems rather than deterministic formulae. Second, as high-wavenumber elements of the dynamical core will necessarily inherit this stochasticity during time integration, it is argued that the dynamical core will be significantly over-engineered if all computations, regardless of scale, are performed completely deterministically and if all variables are represented with maximum numerical precision (in practice using double-precision floating-point numbers). As the era of exascale computing is approached, an energy- and computationally efficient approach to cloud-resolved weather and climate simulation is described where determinism and numerical precision are focused on the largest scales only. PMID:24842038
Palmer, T N
2014-06-28
This paper sets out a new methodological approach to solving the equations for simulating and predicting weather and climate. In this approach, the conventionally hard boundary between the dynamical core and the sub-grid parametrizations is blurred. This approach is motivated by the relatively shallow power-law spectrum for atmospheric energy on scales of hundreds of kilometres and less. It is first argued that, because of this, the closure schemes for weather and climate simulators should be based on stochastic-dynamic systems rather than deterministic formulae. Second, as high-wavenumber elements of the dynamical core will necessarily inherit this stochasticity during time integration, it is argued that the dynamical core will be significantly over-engineered if all computations, regardless of scale, are performed completely deterministically and if all variables are represented with maximum numerical precision (in practice using double-precision floating-point numbers). As the era of exascale computing is approached, an energy- and computationally efficient approach to cloud-resolved weather and climate simulation is described where determinism and numerical precision are focused on the largest scales only.
A Dynamic Health Assessment Approach for Shearer Based on Artificial Immune Algorithm
Wang, Zhongbin; Xu, Xihua; Si, Lei; Ji, Rui; Liu, Xinhua; Tan, Chao
2016-01-01
In order to accurately identify the dynamic health of a shearer, reduce operating faults and production accidents, and further improve coal production efficiency, a dynamic health assessment approach for shearers based on an artificial immune algorithm was proposed. The key technologies, such as the system framework, the selection of indicators for shearer dynamic health assessment, and the health assessment model, were provided, and the flowchart of the proposed approach was designed. A simulation example, with an accuracy of 96%, based on data collected from an industrial production scene was provided. Furthermore, the comparison demonstrated that the proposed method exhibited higher classification accuracy than classifiers based on back propagation neural network (BP-NN) and support vector machine (SVM) methods. Finally, the proposed approach was applied in an engineering problem of shearer dynamic health assessment. The industrial application results showed that the research achievements could be used in combination with the shearer automation control system in a fully mechanized coal face. The simulation and application results indicated that the proposed method was feasible and outperformed the others. PMID:27123002
2014-01-01
This paper analyses how different coordination modes and different multiobjective decision making approaches interfere with each other in hierarchical organizations. The investigation is based on an agent-based simulation. We apply a modified NK-model in which we map multiobjective decision making as adaptive walk on multiple performance landscapes, whereby each landscape represents one objective. We find that the impact of the coordination mode on the performance and the speed of performance improvement is critically affected by the selected multiobjective decision making approach. In certain setups, the performances achieved with the more complex multiobjective decision making approaches turn out to be less sensitive to the coordination mode than the performances achieved with the less complex multiobjective decision making approaches. Furthermore, we present results on the impact of the nature of interactions among decisions on the achieved performance in multiobjective setups. Our results give guidance on how to control the performance contribution of objectives to overall performance and answer the question how effective certain multiobjective decision making approaches perform under certain circumstances (coordination mode and interdependencies among decisions). PMID:25152926
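The modified NK-model described above maps decision making onto adaptive walks over rugged performance landscapes. A minimal single-objective NK adaptive walk (landscape construction details, sizes, and the seed are illustrative assumptions, not the authors' modified model):

```python
import random

def adaptive_walk(n=10, k=2, steps=200, seed=42):
    """Hill-climbing walk on an NK landscape: each of the n binary
    decisions contributes a fitness value that depends on itself and
    its k circular neighbours; only improving one-bit flips are kept."""
    rng = random.Random(seed)
    contrib = [{} for _ in range(n)]   # lazily filled contribution tables

    def fitness(g):
        total = 0.0
        for i in range(n):
            key = tuple(g[(i + j) % n] for j in range(k + 1))
            if key not in contrib[i]:
                contrib[i][key] = rng.random()
            total += contrib[i][key]
        return total / n

    g = [rng.randint(0, 1) for _ in range(n)]
    best = fitness(g)
    for _ in range(steps):
        cand = g[:]
        cand[rng.randrange(n)] ^= 1   # flip one decision
        f = fitness(cand)
        if f > best:                  # accept only improvements
            g, best = cand, f
    return best

final = adaptive_walk()
```

In the multiobjective setting of the paper, each objective gets its own such landscape and the acceptance rule becomes the chosen multiobjective decision making approach.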
Levett-Jones, Tracy; Andersen, Patrea; Reid-Searl, Kerry; Guinea, Stephen; McAllister, Margaret; Lapkin, Samuel; Palmer, Lorinda; Niddrie, Marian
2015-09-01
Active participation in immersive simulation experiences can result in technical and non-technical skill enhancement. However, when simulations are conducted in large groups, maintaining the interest of observers so that they do not disengage from the learning experience can be challenging. We implemented Tag Team Simulation with the aim of ensuring that both participants and observers had active and integral roles in the simulation. In this paper we outline the features of this innovative approach and provide an example of its application to a pain simulation. Evaluation was conducted using the Satisfaction with Simulation Experience Scale. A total of 444 nursing students participated from a population of 536 (response rate 83%). Cronbach's alpha for the Scale was .94, indicating high internal consistency. The mean satisfaction score for participants was 4.63 compared to 4.56 for observers. An independent sample t test revealed no significant difference between these scores (t(300) = -1.414, p = 0.16). Tag team simulation is an effective approach for ensuring observers' and participants' active involvement during group-based simulations and one that is highly regarded by students. It has the potential for broad applicability across a range of learning domains both within and beyond nursing.
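The reported comparison is an independent-samples t test on mean satisfaction scores. A pooled-variance sketch of that test (the two samples below are hypothetical stand-ins on a 1-5 scale, not the study's data):

```python
from math import sqrt
from statistics import mean, stdev

def independent_t(a, b):
    """Pooled-variance independent-samples t statistic."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical satisfaction ratings for two small groups
participants = [5, 4, 5, 5, 4, 5, 4, 5]
observers = [4, 5, 5, 4, 4, 5, 5, 4]
t = independent_t(participants, observers)
```

With two similar samples like these, |t| stays well below the usual critical value of about 2, mirroring the study's non-significant difference between participant and observer satisfaction.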
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kong, Bo; Fox, Rodney O.; Feng, Heng
An Euler–Euler anisotropic Gaussian approach (EE-AG) for simulating gas–particle flows, in which particle velocities are assumed to follow a multivariate anisotropic Gaussian distribution, is used to perform mesoscale simulations of homogeneous cluster-induced turbulence (CIT). A three-dimensional Gauss–Hermite quadrature formulation is used to calculate the kinetic flux for 10 velocity moments in a finite-volume framework. The particle-phase volume-fraction and momentum equations are coupled with the Eulerian solver for the gas phase. This approach is implemented in an open-source CFD package, OpenFOAM, and detailed simulation results are compared with previous Euler–Lagrange simulations in a domain size study of CIT. Here, these results demonstrate that the proposed EE-AG methodology is able to produce comparable results to EL simulations, and this moment-based methodology can be used to perform accurate mesoscale simulations of dilute gas–particle flows.
Kong, Bo; Fox, Rodney O.; Feng, Heng; ...
2017-02-16
An Euler–Euler anisotropic Gaussian approach (EE-AG) for simulating gas–particle flows, in which particle velocities are assumed to follow a multivariate anisotropic Gaussian distribution, is used to perform mesoscale simulations of homogeneous cluster-induced turbulence (CIT). A three-dimensional Gauss–Hermite quadrature formulation is used to calculate the kinetic flux for 10 velocity moments in a finite-volume framework. The particle-phase volume-fraction and momentum equations are coupled with the Eulerian solver for the gas phase. This approach is implemented in an open-source CFD package, OpenFOAM, and detailed simulation results are compared with previous Euler–Lagrange simulations in a domain size study of CIT. Here, these results demonstrate that the proposed EE-AG methodology is able to produce comparable results to EL simulations, and this moment-based methodology can be used to perform accurate mesoscale simulations of dilute gas–particle flows.
Landsgesell, Jonas; Holm, Christian; Smiatek, Jens
2017-02-14
We present a novel method for the study of weak polyelectrolytes and general acid-base reactions in molecular dynamics and Monte Carlo simulations. The approach combines the advantages of the reaction ensemble and the Wang-Landau sampling method. Deprotonation and protonation reactions are simulated explicitly with the help of the reaction ensemble method, while the accurate sampling of the corresponding phase space is achieved by the Wang-Landau approach. The combination of both techniques provides a sufficient statistical accuracy such that meaningful estimates for the density of states and the partition sum can be obtained. With regard to these estimates, several thermodynamic observables like the heat capacity or reaction free energies can be calculated. We demonstrate that the computation times for the calculation of titration curves with a high statistical accuracy can be significantly decreased when compared to the original reaction ensemble method. The applicability of our approach is validated by the study of weak polyelectrolytes and their thermodynamic properties.
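The Wang-Landau half of the combined scheme estimates a density of states by penalising already-visited energies until the energy histogram is flat. A toy sketch on n non-interacting spins, whose exact g(E) is binomial (the polyelectrolyte chemistry and reaction ensemble moves are not reproduced; all parameters are illustrative):

```python
import random
from math import exp, log

def wang_landau(n=8, lnf_final=1e-4, seed=3):
    """Wang-Landau estimate of g(E), E = number of up-spins among n
    independent spins. Exact answer: binomial coefficients C(n, E)."""
    rng = random.Random(seed)
    lng = [0.0] * (n + 1)               # running estimate of ln g(E)
    hist = [0] * (n + 1)
    state = [rng.randint(0, 1) for _ in range(n)]
    e = sum(state)
    lnf = 1.0                            # modification factor
    while lnf > lnf_final:
        for _ in range(20000):
            i = rng.randrange(n)
            e_new = e + 1 - 2 * state[i]
            # Accept spin flip with probability min(1, g(E)/g(E_new))
            if log(rng.random() + 1e-300) < lng[e] - lng[e_new]:
                state[i] ^= 1
                e = e_new
            lng[e] += lnf
            hist[e] += 1
        if min(hist) > 0.8 * sum(hist) / len(hist):   # histogram flat?
            lnf /= 2.0
            hist = [0] * (n + 1)
    shift = max(lng)
    z = sum(exp(v - shift) for v in lng)
    return [exp(v - shift) * 2 ** n / z for v in lng]   # normalise to 2^n

g = wang_landau()
```

Once ln g(E) is known, partition sums and observables such as heat capacities follow by direct summation, which is the advantage the abstract highlights.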
An approach for drag correction based on the local heterogeneity for gas-solid flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Tingwen; Wang, Limin; Rogers, William
2016-09-22
The drag models typically used for gas-solids interaction are mainly developed based on homogeneous systems of flow passing a fixed particle assembly. It has been shown that the heterogeneous structures, i.e., clusters and bubbles in fluidized beds, need to be resolved to account for their effect in the numerical simulations. Since the heterogeneity is essentially captured through the local concentration gradient in the computational cells, this study proposes a simple approach to account for the non-uniformity of solids spatial distribution inside a computational cell and its effect on the interaction between gas and solid phases. Finally, to validate this approach, the predicted drag coefficient has been compared to the results from direct numerical simulations. In addition, the need to account for this type of heterogeneity is discussed for a periodic riser flow simulation with highly resolved numerical grids and the impact of the proposed correction for drag is demonstrated.
Voidage correction algorithm for unresolved Euler-Lagrange simulations
NASA Astrophysics Data System (ADS)
Askarishahi, Maryam; Salehi, Mohammad-Sadegh; Radl, Stefan
2018-04-01
The effect of grid coarsening on the predicted total drag force and heat exchange rate in dense gas-particle flows is investigated using the Euler-Lagrange (EL) approach. We demonstrate that grid coarsening may reduce the predicted total drag force and exchange rate. Surprisingly, exchange coefficients predicted by the EL approach deviate more significantly from the exact value compared to results of Euler-Euler (EE)-based calculations. The voidage gradient is identified as the root cause of this peculiar behavior. Consequently, we propose a correction algorithm based on a sigmoidal function to predict the voidage experienced by individual particles. Our correction algorithm can significantly improve the prediction of exchange coefficients in EL models, which is tested for simulations involving Euler grid cell sizes between 2d_p and 12d_p. It is most relevant in simulations of dense polydisperse particle suspensions featuring steep voidage profiles. For these suspensions, classical approaches may result in an error of the total exchange rate of up to 30%.
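The published correction algorithm is not reproduced here; the sketch below only illustrates the general idea of a sigmoidal, gradient-aware voidage seen by a particle (the functional form, constants, and function name are all assumptions):

```python
from math import exp

def particle_voidage(eps_cell, grad_eps, dx, steepness=5.0):
    """Hypothetical sigmoidal correction: blend the cell-average voidage
    with a gradient-based extrapolation whose weight saturates smoothly,
    so particles in steep voidage profiles see a corrected value."""
    raw = grad_eps * dx                            # linear extrapolation term
    weight = 2.0 / (1.0 + exp(-steepness * abs(raw))) - 1.0
    eps = eps_cell + weight * raw
    return min(max(eps, 0.0), 1.0)                 # keep voidage physical

flat = particle_voidage(0.5, 0.0, 1.0)    # no gradient: cell value unchanged
steep = particle_voidage(0.5, 0.2, 1.0)   # steep profile: voidage shifts up
```

The clamping step reflects the physical constraint that voidage is a volume fraction; the saturating weight keeps the correction bounded even for very steep profiles.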
ERIC Educational Resources Information Center
Pagnotti, John; Russell, William B., III
2015-01-01
The purpose of this article is to empower those interested in teaching students powerful and engaging social studies. Through the lens of Supreme Court simulations, this article provides educators with a viable, classroom-tested lesson plan to bring Problem-Based Learning into their classrooms. The specific aim of the lesson is to provide students…
Performance Evaluation of the Approaches and Algorithms for Hamburg Airport Operations
NASA Technical Reports Server (NTRS)
Zhu, Zhifan; Jung, Yoon; Lee, Hanbong; Schier, Sebastian; Okuniek, Nikolai; Gerdes, Ingrid
2016-01-01
In this work, NASA conducted fast-time simulations of Hamburg airport operations using the SARDA tools, and DLR conducted real-time simulations using CADEO and TRACC with the NLR ATM Research Simulator (NARSIM). The outputs are analyzed using a set of common metrics developed jointly by DLR and NASA. The proposed metrics are derived from the International Civil Aviation Organization's (ICAO) Key Performance Areas (KPAs) of capacity, efficiency, predictability, and environment, and adapted to simulation studies. The results are examined to explore and compare the merits and shortcomings of the two approaches using the common performance metrics. Particular attention is paid to the concept of closed-loop, trajectory-based taxiing, as well as the application of the US concept to a European airport. Both teams consider the trajectory-based surface operation concept a critical technology advance, not only for addressing current surface traffic management problems but also for its potential application to unmanned vehicle maneuvering on the airport surface, such as autonomous towing or TaxiBot [6][7], and even Remotely Piloted Aircraft (RPA). Based on this work, a future integration of TRACC and SOSS is described, aiming at bringing the conflict-free, trajectory-based operation concept to US airports.
Combined PEST and Trial-Error approach to improve APEX calibration
USDA-ARS?s Scientific Manuscript database
The Agricultural Policy Environmental eXtender (APEX), a physically-based hydrologic model that simulates management impacts on the environment for small watersheds, requires a better understanding of its input parameters for improved simulations. However, most previously published studies used the ...
Numerically Simulating Collisions of Plastic and Foam Laser-Driven Foils
NASA Astrophysics Data System (ADS)
Zalesak, S. T.; Velikovich, A. L.; Schmitt, A. J.; Aglitskiy, Y.; Metzler, N.
2007-11-01
Interest in experiments on colliding planar foils has recently been stimulated by (a) the Impact Fast Ignition approach to laser fusion [1], and (b) the approach to a high-repetition rate ignition facility based on direct drive with the KrF laser [2]. Simulating the evolution of perturbations to such foils can be a numerical challenge, especially if the initial perturbation amplitudes are small. We discuss the numerical issues involved in such simulations, describe their benchmarking against recently-developed analytic results, and present simulations of such experiments on NRL's Nike laser. [1] M. Murakami et al., Nucl. Fusion 46, 99 (2006) [2] S. P. Obenschain et al., Phys. Plasmas 13, 056320 (2006).
Risk Reduction and Training using Simulation Based Tools - 12180
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Irin P.
2012-07-01
Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains, as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M and S applications include expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques.
For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition. Integrating these tools into a larger virtual system provides a tool for making larger strategic decisions. The key to integrating and creating these virtual environments is the software and the process used to build them. Although they are important steps toward the use of simulation based tools in the nuclear domain, the applications described here represent only a small cross section of the possible benefits. The next generation of applications will likely focus on situational awareness and adaptive planning. Situational awareness refers to the ability to visualize in real time the state of operations. Some useful tools in this area are Geographic Information Systems (GIS), which help monitor and analyze geographically referenced information. Combined with such situational awareness capability, simulation tools can serve as the platform for adaptive planning tools. These are the tools that allow the decision maker to react to the changing environment in real time by synthesizing massive amounts of data into easily understood information. For the nuclear domains, this may mean creation of Virtual Nuclear Systems, from Virtual Waste Processing Plants to Virtual Nuclear Reactors. (authors)
Vision Research for Flight Simulation. Final Report.
ERIC Educational Resources Information Center
Richards, Whitman, Ed.; Dismukes, Key, Ed.
Based on a workshop on vision research issues in flight-training simulators held in June 1980, this report focuses on approaches for the conduct of research on what visual information is needed for simulation and how it can best be presented. An introduction gives an overview of the workshop and describes the contents of the report. Section 1…
Becky K. Kerns; Miles A. Hemstrom; David Conklin; Gabriel I. Yospin; Bart Johnson; Dominique Bachelet; Scott Bridgham
2012-01-01
Understanding landscape vegetation dynamics often involves the use of scientifically-based modeling tools that are capable of testing alternative management scenarios given complex ecological, management, and social conditions. State-and-transition simulation model (STSM) frameworks and software such as PATH and VDDT are commonly used tools that simulate how landscapes...
NASA Technical Reports Server (NTRS)
Wanke, Craig; Kuchar, James; Hahn, Edward; Pritchett, A.; Hansman, R. John
1994-01-01
Advances in avionics and display technology are significantly changing the cockpit environment in current transport aircraft. The MIT Aeronautical Systems Lab (ASL) developed a part-task flight simulator specifically to study the effects of these new technologies on flight crew situational awareness and performance. The simulator is based on a commercially-available graphics workstation, and can be rapidly reconfigured to meet the varying demands of experimental studies. The simulator was successfully used to evaluate graphical microbursts alerting displays, electronic instrument approach plates, terrain awareness and alerting displays, and ATC routing amendment delivery through digital datalinks.
Petsev, Nikolai Dimitrov; Leal, L. Gary; Shell, M. Scott
2017-12-21
Hybrid molecular-continuum simulation techniques afford a number of advantages for problems in the rapidly burgeoning area of nanoscale engineering and technology, though they are typically quite complex to implement and limited to single-component fluid systems. We describe an approach for modeling multicomponent hydrodynamic problems spanning multiple length scales when using particle-based descriptions for both the finely-resolved (e.g. molecular dynamics) and coarse-grained (e.g. continuum) subregions within an overall simulation domain. This technique is based on the multiscale methodology previously developed for mesoscale binary fluids [N. D. Petsev, L. G. Leal, and M. S. Shell, J. Chem. Phys. 144, 84115 (2016)], simulated using a particle-based continuum method known as smoothed dissipative particle dynamics (SDPD). An important application of this approach is the ability to perform coupled molecular dynamics (MD) and continuum modeling of molecularly miscible binary mixtures. In order to validate this technique, we investigate multicomponent hybrid MD-continuum simulations at equilibrium, as well as non-equilibrium cases featuring concentration gradients.
Study of hypervelocity projectile impact on thick metal plates
Roy, Shawoon K.; Trabia, Mohamed; O’Toole, Brendan; ...
2016-01-01
Hypervelocity impacts generate extreme pressure and shock waves in impacted targets that undergo severe localized deformation within a few microseconds. These impact experiments pose unique challenges in terms of obtaining accurate measurements. Similarly, simulating these experiments is not straightforward. This paper proposed an approach to experimentally measure the velocity of the back surface of an A36 steel plate impacted by a projectile. All experiments used a combination of a two-stage light-gas gun and the photonic Doppler velocimetry (PDV) technique. The experimental data were used to benchmark and verify computational studies. Two different finite-element methods were used to simulate the experiments: Lagrangian-based smooth particle hydrodynamics (SPH) and an Eulerian-based hydrocode. Both codes used the Johnson-Cook material model and the Mie-Grüneisen equation of state. Experiments and simulations were compared based on the physical damage area and the back surface velocity. Finally, the results of this study showed that the proposed simulation approaches could be used to reduce the need for expensive experiments.
Onan, Arif; Simsek, Nurettin; Elcin, Melih; Turan, Sevgi; Erbil, Bülent; Deniz, Kaan Zülfikar
2017-11-01
Cardiopulmonary resuscitation training is an essential element of clinical skill development for healthcare providers. The International Liaison Committee on Resuscitation has described issues related to cardiopulmonary resuscitation and emergency cardiovascular care education. Educational interventions have been initiated to try to address these issues using a team-based approach and simulation technologies that offer a controlled, safe learning environment. The aim of the study is to review and synthesize published studies that address the primary question "What are the features and effectiveness of educational interventions related to simulation-enhanced, team-based cardiopulmonary resuscitation training?" We conducted a systematic review focused on educational interventions pertaining to cardiac arrest and emergencies that addressed this main question. The findings are presented together with a discussion of the effectiveness of various educational interventions. In conclusion, student attitudes toward interprofessional learning and simulation experiences were more positive. Research reports emphasized the importance of adherence to established guidelines, adopting a holistic approach to training, and that preliminary training, briefing, deliberate practices, and debriefing should help to overcome deficiencies in cardiopulmonary resuscitation training. Copyright © 2017 Elsevier Ltd. All rights reserved.
Integrated simulation of continuous-scale and discrete-scale radiative transfer in metal foams
NASA Astrophysics Data System (ADS)
Xia, Xin-Lin; Li, Yang; Sun, Chuang; Ai, Qing; Tan, He-Ping
2018-06-01
A novel integrated simulation of radiative transfer in metal foams is presented. It integrates the continuous-scale simulation with the direct discrete-scale simulation in a single computational domain. It relies on the coupling of the real discrete-scale foam geometry with the equivalent continuous-scale medium through a specially defined scale-coupled zone. This zone holds continuous but nonhomogeneous volumetric radiative properties. The scale-coupled approach is compared to the traditional continuous-scale approach using volumetric radiative properties in the equivalent participating medium and to the direct discrete-scale approach employing the real 3D foam geometry obtained by computed tomography. All the analyses are based on geometrical optics. The Monte Carlo ray-tracing procedure is used for computations of the absorbed radiative fluxes and the apparent radiative behaviors of metal foams. The results obtained by the three approaches are in tenable agreement. The scale-coupled approach is fully validated in calculating the apparent radiative behaviors of metal foams composed of very absorbing to very reflective struts and that composed of very rough to very smooth struts. This new approach leads to a reduction in computational time by approximately one order of magnitude compared to the direct discrete-scale approach. Meanwhile, it can offer information on the local geometry-dependent feature and at the same time the equivalent feature in an integrated simulation. This new approach is promising to combine the advantages of the continuous-scale approach (rapid calculations) and direct discrete-scale approach (accurate prediction of local radiative quantities).
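The Monte Carlo ray-tracing procedure used for the absorbed radiative fluxes can be illustrated with a toy example: a purely absorbing, homogeneous slab under normal incidence. This assumed geometry is far simpler than the foam structures in the study, and the function below is a generic sketch, not the paper's code:

```python
import math
import random

def absorbed_fraction(kappa, thickness, n_rays=100_000, seed=1):
    """Monte Carlo estimate of the fraction of normally incident rays
    absorbed in a purely absorbing homogeneous slab. Free path lengths
    are sampled from the Beer-Lambert distribution, s = -ln(U) / kappa."""
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(n_rays):
        u = 1.0 - rng.random()                # uniform in (0, 1], avoids log(0)
        s = -math.log(u) / kappa              # sampled absorption distance
        if s < thickness:                     # ray absorbed inside the slab
            absorbed += 1
    return absorbed / n_rays
```

The estimate should approach the analytic value 1 - exp(-kappa * L) to within Monte Carlo error; the discrete-scale approach in the paper replaces this idealized slab with the real tomography-based foam geometry and surface reflection models.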
NASA Astrophysics Data System (ADS)
McKane, R. B.; M, S.; F, P.; Kwiatkowski, B. L.; Rastetter, E. B.
2006-12-01
Federal and state agencies responsible for protecting water quality rely mainly on statistically-based methods to assess and manage risks to the nation's streams, lakes and estuaries. Although statistical approaches provide valuable information on current trends in water quality, process-based simulation models are essential for understanding and forecasting how changes in human activities across complex landscapes impact the transport of nutrients and contaminants to surface waters. To address this need, we developed a broadly applicable, process-based watershed simulator that links a spatially-explicit hydrologic model and a terrestrial biogeochemistry model (MEL). See Stieglitz et al. and Pan et al., this meeting, for details on the design and verification of this simulator. Here we apply the watershed simulator to a generalized agricultural setting to demonstrate its potential for informing policy and management decisions concerning water quality. This demonstration specifically explores the effectiveness of riparian buffers for reducing the transport of nitrogenous fertilizers from agricultural fields to streams. The interaction of hydrologic and biogeochemical processes represented in our simulator allows several important questions to be addressed. (1) For a range of upland fertilization rates, to what extent do riparian buffers reduce nitrogen inputs to streams? (2) How does buffer effectiveness change over time as the plant-soil system approaches N-saturation? (3) How can buffers be managed to increase their effectiveness, e.g., through periodic harvest and replanting? The model results illustrate that, while the answers to these questions depend to some extent on site factors (climatic regime, soil properties and vegetation type), in all cases riparian buffers have a limited capacity to reduce nitrogen inputs to streams where fertilization rates approach those typically used for intensive agriculture (e.g., 200 kg N per ha per year for corn in the U.S.A. 
Midwestern states). We also discuss how the insights gained from our approach cannot be achieved with modeling tools that are not both spatially explicit and process-based.
NASA Technical Reports Server (NTRS)
Richardson, Brian; Kenny, Jeremy
2015-01-01
Injector design is a critical part of the development of a rocket Thrust Chamber Assembly (TCA). Proper detailed injector design can maximize propulsion efficiency while minimizing the potential for failures in the combustion chamber. Traditional design and analysis methods for hydrocarbon-fuel injector elements are based heavily on empirical data and models developed from heritage hardware tests. Using this limited set of data produces challenges when trying to design a new propulsion system where the operating conditions may greatly differ from heritage applications. Time-accurate, Three-Dimensional (3-D) Computational Fluid Dynamics (CFD) modeling of combusting flows inside of injectors has long been a goal of the fluid analysis group at Marshall Space Flight Center (MSFC) and the larger CFD modeling community. CFD simulation can provide insight into the design and function of an injector that cannot be obtained easily through testing or empirical comparisons to existing hardware. However, the traditional finite-rate chemistry modeling approach utilized to simulate combusting flows for complex fuels, such as Rocket Propellant-2 (RP-2), is prohibitively expensive and time consuming even with a large amount of computational resources. MSFC has been working, in partnership with Streamline Numerics, Inc., to develop a computationally efficient, flamelet-based approach for modeling complex combusting flow applications. In this work, a flamelet modeling approach is used to simulate time-accurate, 3-D, combusting flow inside a single Gas Centered Swirl Coaxial (GCSC) injector using the flow solver, Loci-STREAM. CFD simulations were performed for several different injector geometries. Results of the CFD analysis helped guide the design of the injector from an initial concept to a tested prototype. 
The results of the CFD analysis are compared to data gathered from several hot-fire, single element injector tests performed in the Air Force Research Lab EC-1 test facility located at Edwards Air Force Base.
Collaborative voxel-based surgical virtual environments.
Acosta, Eric; Muniz, Gilbert; Armonda, Rocco; Bowyer, Mark; Liu, Alan
2008-01-01
Virtual Reality-based surgical simulators can utilize Collaborative Virtual Environments (C-VEs) to provide team-based training. To support real-time interactions, C-VEs are typically replicated on each user's local computer and a synchronization method helps keep all local copies consistent. This approach does not work well for voxel-based C-VEs since large and frequent volumetric updates make synchronization difficult. This paper describes a method that allows multiple users to interact within a voxel-based C-VE for a craniotomy simulator being developed. Our C-VE method requires smaller update sizes and provides faster synchronization update rates than volumetric-based methods. Additionally, we address network bandwidth/latency issues to simulate networked haptic and bone drilling tool interactions with a voxel-based skull C-VE.
NASA Technical Reports Server (NTRS)
Leonard, J. I.; White, R. J.; Rummel, J. A.
1980-01-01
An approach was developed to aid in the integration of many of the biomedical findings of space flight, using systems analysis. The mathematical tools used in accomplishing this task include an automated data base, a biostatistical and data analysis system, and a wide variety of mathematical simulation models of physiological systems. A keystone of this effort was the evaluation of physiological hypotheses using the simulation models and the prediction of the consequences of these hypotheses on many physiological quantities, some of which were not amenable to direct measurement. This approach led to improvements in the model, refinements of the hypotheses, a tentative integrated hypothesis for adaptation to weightlessness, and specific recommendations for new flight experiments.
Simulation methods to estimate design power: an overview for applied research.
Arnold, Benjamin F; Hogan, Daniel R; Colford, John M; Hubbard, Alan E
2011-06-20
Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research.
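The simulation-based power estimation the article reviews can be sketched for the simplest case: a two-arm individually randomized trial with a normal outcome, analyzed with a known-variance z-test. The design and numbers are illustrative, not the article's sanitation/nutrition example:

```python
import math
import random
from statistics import NormalDist

def simulated_power(n_per_arm, effect, sd, alpha=0.05, n_sims=2000, seed=0):
    """Estimate power as the fraction of simulated trials in which a
    known-variance z-test rejects the null of no difference at level alpha."""
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    se = sd * math.sqrt(2.0 / n_per_arm)      # SE of the difference in means
    rejections = 0
    for _ in range(n_sims):
        ctrl = [rng.gauss(0.0, sd) for _ in range(n_per_arm)]
        trt = [rng.gauss(effect, sd) for _ in range(n_per_arm)]
        diff = sum(trt) / n_per_arm - sum(ctrl) / n_per_arm
        if abs(diff) / se > z_crit:
            rejections += 1
    return rejections / n_sims
```

For n_per_arm = 64, effect = 0.5, sd = 1 the closed-form power is about 0.81, and the simulated estimate agrees to within Monte Carlo error; the same loop extends to cluster-randomized designs by adding a shared cluster-level random effect to each arm's draws, which is where simulation pays off over closed-form equations.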
Simulation methods to estimate design power: an overview for applied research
2011-01-01
Background Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. Methods We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. Results We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Conclusions Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research. PMID:21689447
Atomic-level simulation of ferroelectricity in perovskite solid solutions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sepliarsky, M.; Instituto de Fisica Rosario, CONICET-UNR, Rosario,; Phillpot, S. R.
2000-06-26
Building on the insights gained from electronic-structure calculations and from experience obtained with an earlier atomic-level method, we developed an atomic-level simulation approach based on the traditional Buckingham potential with a shell model which correctly reproduces the ferroelectric phase behavior and the dielectric and piezoelectric properties of KNbO3. This approach now enables the simulation of solid solutions and defected systems; we illustrate this capability by elucidating the ferroelectric properties of a KTa0.5Nb0.5O3 random solid solution. (c) 2000 American Institute of Physics.
Hierarchical analytical and simulation modelling of human-machine systems with interference
NASA Astrophysics Data System (ADS)
Braginsky, M. Ya; Tarakanov, D. V.; Tsapko, S. G.; Tsapko, I. V.; Baglaeva, E. A.
2017-01-01
The article considers the principles of building an analytical and simulation model of the human operator and of industrial control system hardware and software. E-networks, an extension of Petri nets, are used as the mathematical apparatus. This approach allows simulating complex parallel distributed processes in human-machine systems. A structural-hierarchical approach is used to build the mathematical model of the human operator: the upper level is a logical dynamic model of decision making based on E-networks, while the lower level reflects the psychophysiological characteristics of the human operator.
NASA Astrophysics Data System (ADS)
Yusof, Muhammad Mat; Sulaiman, Tajularipin; Khalid, Ruzelan; Hamid, Mohamad Shukri Abdul; Mansor, Rosnalini
2014-12-01
In professional sporting events, rating competitors before a tournament starts is a well-known approach to distinguishing the favorite team from the weaker teams, and various methodologies are used to produce such ratings. In this paper, we explore four ways to rate competitors: least squares rating, maximum likelihood strength ratio, standing points in a large round robin simulation, and previous league rank position. The tournament metric we use to evaluate the different rating approaches is the tournament outcome characteristic measure, defined as the probability that a particular team in the top 100q pre-tournament rank percentile progresses beyond round R, for all q and R. Based on the simulation results, we found that different rating approaches produce different effects for the teams. For eight teams participating in a knockout tournament with standard seeding, our simulations show that Perak has the highest probability of winning a tournament that uses the least squares rating approach, PKNS has the highest probability of winning under the maximum likelihood strength ratio and the large round robin simulation approaches, and Perak has the highest probability of winning a tournament using the previous league season approach.
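The least squares rating named in this abstract can be sketched as a Massey-style system: choose ratings that minimize the squared difference between rating gaps and observed score margins, with a sum-to-zero constraint to pin down the solution. This is a generic sketch of the technique, not the paper's exact formulation, and any game data fed to it here would be hypothetical:

```python
import numpy as np

def least_squares_ratings(games, n_teams):
    """Massey-style least-squares ratings: find r minimizing
    sum over games of (r[i] - r[j] - margin_ij)^2, subject to
    sum(r) = 0. `games` is a list of (i, j, margin) triples,
    meaning team i outscored team j by `margin` points."""
    M = np.zeros((n_teams, n_teams))   # normal-equation matrix
    b = np.zeros(n_teams)
    for i, j, margin in games:
        M[i, i] += 1; M[j, j] += 1
        M[i, j] -= 1; M[j, i] -= 1
        b[i] += margin; b[j] -= margin
    M[-1, :] = 1.0                     # replace last row: ratings sum to zero
    b[-1] = 0.0
    return np.linalg.solve(M, b)
```

For a set of mutually consistent results the system recovers exact rating gaps; with real, noisy league data it returns the least-squares compromise, which can then seed a knockout simulation like the one described above.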
Genetic Algorithms and Their Application to the Protein Folding Problem
1993-12-01
and symbolic methods, random methods such as Monte Carlo simulation and simulated annealing, distance geometry, and molecular dynamics. Many of these...calculated energies with those obtained using the molecular simulation software package called CHARMm. 9) Test both the simple and parallel simple genetic...homology-based, and simplification techniques. 3.21 Molecular Dynamics. Perhaps the most natural approach is to actually simulate the folding process. This
Technical Note: Approximate Bayesian parameterization of a complex tropical forest model
NASA Astrophysics Data System (ADS)
Hartig, F.; Dislich, C.; Wiegand, T.; Huth, A.
2013-08-01
Inverse parameter estimation of process-based models is a long-standing problem in ecology and evolution. A key problem of inverse parameter estimation is to define a metric that quantifies how well model predictions fit to the data. Such a metric can be expressed by general cost or objective functions, but statistical inversion approaches are based on a particular metric, the probability of observing the data given the model, known as the likelihood. Deriving likelihoods for dynamic models requires making assumptions about the probability for observations to deviate from mean model predictions. For technical reasons, these assumptions are usually derived without explicit consideration of the processes in the simulation. Only in recent years have new methods become available that allow generating likelihoods directly from stochastic simulations. Previous applications of these approximate Bayesian methods have concentrated on relatively simple models. Here, we report on the application of a simulation-based likelihood approximation for FORMIND, a parameter-rich individual-based model of tropical forest dynamics. We show that approximate Bayesian inference, based on a parametric likelihood approximation placed in a conventional MCMC, performs well in retrieving known parameter values from virtual field data generated by the forest model. We analyze the results of the parameter estimation, examine the sensitivity towards the choice and aggregation of model outputs and observed data (summary statistics), and show results from using this method to fit the FORMIND model to field data from an Ecuadorian tropical forest. Finally, we discuss differences of this approach to Approximate Bayesian Computing (ABC), another commonly used method to generate simulation-based likelihood approximations. 
Our results demonstrate that simulation-based inference, which offers considerable conceptual advantages over more traditional methods for inverse parameter estimation, can successfully be applied to process-based models of high complexity. The methodology is particularly suited to heterogeneous and complex data structures and can easily be adjusted to other model types, including most stochastic population and individual-based models. Our study therefore provides a blueprint for a fairly general approach to parameter estimation of stochastic process-based models in ecology and evolution.
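The parametric likelihood approximation placed inside a conventional MCMC can be sketched with a toy stochastic model. For each proposed parameter, the model is run repeatedly, a normal is fitted to the simulated summary statistic, and the observed summary is scored under that normal inside a random-walk Metropolis step. FORMIND itself is far richer; the toy model and all settings below are illustrative assumptions:

```python
import math
import random

def synthetic_loglik(theta, observed_summary, simulate, n_reps=50, rng=None):
    """Parametric likelihood approximation: run the stochastic model n_reps
    times at theta, fit a normal to the simulated summaries, and score the
    observed summary under that fitted normal."""
    rng = rng or random.Random(0)
    sims = [simulate(theta, rng) for _ in range(n_reps)]
    mu = sum(sims) / n_reps
    var = sum((s - mu) ** 2 for s in sims) / (n_reps - 1) + 1e-12
    return -0.5 * math.log(2 * math.pi * var) - (observed_summary - mu) ** 2 / (2 * var)

def metropolis(observed, simulate, theta0, n_iter=2000, step=0.2, seed=1):
    """Random-walk Metropolis on theta using the approximate likelihood
    (flat prior on theta > 0; toy tuning constants)."""
    rng = random.Random(seed)
    theta = theta0
    ll = synthetic_loglik(theta, observed, simulate, rng=rng)
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)
        if prop > 0:
            ll_prop = synthetic_loglik(prop, observed, simulate, rng=rng)
            if math.log(rng.random()) < ll_prop - ll:
                theta, ll = prop, ll_prop
        chain.append(theta)
    return chain

def simulate(theta, rng):
    """Toy stochastic 'model': mean of 30 draws approximating Poisson(theta)
    by a normal with matching mean and variance, for brevity."""
    return sum(rng.gauss(theta, math.sqrt(theta)) for _ in range(30)) / 30
```

With virtual data generated at a known theta, the chain settles around the true value, which mirrors the paper's virtual-field-data check before fitting real observations.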
Improving Protocols for Protein Mapping through Proper Comparison to Crystallography Data
Lexa, Katrina W.; Carlson, Heather A.
2013-01-01
Computational approaches to fragment-based drug design (FBDD) can complement experiments and facilitate the identification of potential hot spots along the protein surface. However, the evaluation of computational methods for mapping binding sites frequently focuses upon the ability to reproduce crystallographic coordinates to within a low RMSD threshold. This dependency on the deposited coordinate data overlooks the original electron density from the experiment, thus techniques may be developed based upon subjective - or even erroneous - atomic coordinates. This can become a significant drawback in applications to systems where the location of hot spots is unknown. Based on comparison to crystallographic density, we previously showed that mixed-solvent molecular dynamics (MixMD) accurately identifies the active site for HEWL, with acetonitrile as an organic solvent. Here, we concentrated on the influence of protic solvent on simulation and refined the optimal MixMD approach for extrapolation of the method to systems without established sites. Our results establish an accurate approach for comparing simulations to experiment. We have outlined the most efficient strategy for MixMD, based on simulation length and number of runs. The development outlined here makes MixMD a robust method which should prove useful across a broad range of target structures. Lastly, our results with MixMD match experimental data so well that consistency between simulations and density may be a useful way to aid the identification of probes vs waters during the refinement of future MSCS crystallographic structures. PMID:23327200
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, Zhiling; Wei, Wei; Turlapaty, Anish
2012-07-01
At the United States Army's test sites, fired penetrators made of Depleted Uranium (DU) have been buried underground and become hazardous waste. Previously, we developed techniques for detecting buried radioactive targets. We also developed approaches for locating buried paramagnetic metal objects by utilizing electromagnetic induction (EMI) sensor data. In this paper, we apply data fusion techniques to combine results from both the radiation detection and the EMI detection, so that we can further distinguish among DU penetrators, DU oxide, and non-DU metal debris. We develop a two-step fusion approach for the task and test it with survey data collected on simulation targets. In this work, we explored radiation and EMI data fusion for detecting DU, DU oxides, and non-DU metals. We developed a two-step fusion approach based on majority voting and a set of decision rules. With this approach, we fuse results from radiation detection based on the RX algorithm and EMI detection based on a three-step analysis. Our fusion approach has been tested successfully with data collected on simulation targets. In the future, we will need to further verify the effectiveness of this fusion approach with field data. (authors)
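As a rough illustration of what a two-step fusion based on majority voting plus decision rules might look like, consider the sketch below. The vote labels, class names, and rules are invented for illustration and are not the study's actual algorithm.

```python
def fuse_detections(radiation_votes, emi_votes):
    """Two-step fusion sketch: (1) majority vote within each modality's
    detector outputs, (2) decision rules combining the two modalities."""
    def majority(votes):
        # Most frequent label among the votes.
        return max(set(votes), key=votes.count)

    rad = majority(radiation_votes)   # hypothetical: 'radioactive' / 'clean'
    emi = majority(emi_votes)         # hypothetical: 'metal' / 'no-metal'

    # Step 2: decision rules separating the target classes.
    if rad == 'radioactive' and emi == 'metal':
        return 'DU penetrator'
    if rad == 'radioactive' and emi == 'no-metal':
        return 'DU oxide'
    if rad == 'clean' and emi == 'metal':
        return 'non-DU metal debris'
    return 'no target'

label = fuse_detections(['radioactive', 'radioactive', 'clean'],
                        ['metal', 'metal', 'metal'])
```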
NASA Astrophysics Data System (ADS)
Vasco, D. W.
2018-04-01
Following an approach used in quantum dynamics, an exponential representation of the hydraulic head transforms the diffusion equation governing pressure propagation into an equivalent set of ordinary differential equations. Using a reservoir simulator to determine one set of dependent variables leaves a reduced set of equations for the path of a pressure transient. Unlike the current approach for computing the path of a transient, based on a high-frequency asymptotic solution, the trajectories resulting from this new formulation are valid for arbitrary spatial variations in aquifer properties. For a medium containing interfaces and layers with sharp boundaries, the trajectory mechanics approach produces paths that are compatible with travel time fields produced by a numerical simulator, while the asymptotic solution produces paths that bend too strongly into high permeability regions. The breakdown of the conventional asymptotic solution, due to the presence of sharp boundaries, has implications for model parameter sensitivity calculations and the solution of the inverse problem. For example, near an abrupt boundary, trajectories based on the asymptotic approach deviate significantly from regions of high sensitivity observed in numerical computations. In contrast, paths based on the new trajectory mechanics approach coincide with regions of maximum sensitivity to permeability changes.
3D Simulation of External Flooding Events for the RISMC Pathway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prescott, Steven; Mandelli, Diego; Sampath, Ramprasad
2015-09-01
Incorporating 3D simulations as part of the Risk-Informed Safety Margin Characterization (RISMC) Toolkit allows analysts to obtain a more complete picture of complex system behavior for events including external plant hazards. External events such as flooding have become more important recently; however, these can be analyzed with existing and validated simulated-physics toolkits. In this report, we describe these approaches specific to flooding-based analysis using an approach called Smoothed Particle Hydrodynamics. The theory, validation, and example applications of the 3D flooding simulation are described. Integrating these 3D simulation methods into computational risk analysis adds a spatial/visual aspect to the design, improves the realism of results, and can provide visual understanding to validate the flooding analysis.
Dataflow computing approach in high-speed digital simulation
NASA Technical Reports Server (NTRS)
Ercegovac, M. D.; Karplus, W. J.
1984-01-01
New computational tools and methodologies for the digital simulation of continuous systems were explored, and programmability and cost-effective performance in multiprocessor organizations for real-time simulation were investigated. The approach is based on functional-style languages and data flow computing principles, which allow for the natural representation of parallelism in algorithms and provide a suitable basis for the design of cost-effective, high-performance distributed systems. The objectives of this research are to: (1) perform a comparative evaluation of several existing data flow languages and develop an experimental data flow language suitable for real-time simulation using multiprocessor systems; (2) investigate the main issues that arise in the architecture and organization of data flow multiprocessors for real-time simulation; and (3) develop and apply performance evaluation models in typical applications.
Model-based multiple patterning layout decomposition
NASA Astrophysics Data System (ADS)
Guo, Daifeng; Tian, Haitong; Du, Yuelin; Wong, Martin D. F.
2015-10-01
As one of the most promising next-generation lithography technologies, multiple patterning lithography (MPL) plays an important role in the attempts to keep pace with the 10 nm technology node and beyond. As feature sizes keep shrinking, it has become impossible to print dense layouts within one single exposure. As a result, MPL techniques such as double patterning lithography (DPL) and triple patterning lithography (TPL) have been widely adopted. There is a large volume of literature on DPL/TPL layout decomposition, and the current approach is to formulate the problem as a classical graph-coloring problem: layout features (polygons) are represented by vertices in a graph G, and there is an edge between two vertices if and only if the distance between the two corresponding features is less than a minimum distance threshold value dmin. The problem is to color the vertices of G using k colors (k = 2 for DPL, k = 3 for TPL) such that no two vertices connected by an edge are given the same color. This is a rule-based approach, which imposes a geometric distance as a minimum constraint and simply decomposes polygons within that distance into different masks. It is not desirable in practice because this criterion cannot completely capture the behavior of the optics. For example, it lacks sufficient information such as the optical source characteristics and the effects between polygons outside the minimum distance. To remedy this deficiency, a model-based layout decomposition approach that bases the decomposition criteria on simulation results was first introduced at SPIE 2013.1 However, that algorithm1 relies on simplified assumptions about the optical simulation model, and therefore its usage on real layouts is limited. Recently, ASML2 also proposed a model-based approach to layout decomposition that iteratively simulates the layout, which requires excessive computational resources and may lead to sub-optimal solutions. That approach2 also potentially generates too many stitches.
In this paper, we propose a model-based MPL layout decomposition method using a pre-simulated library of frequent layout patterns. Instead of using the graph G in the standard graph-coloring formulation, we build an expanded graph H where each vertex represents a group of adjacent features together with a coloring solution. By utilizing the library and running sophisticated graph algorithms on H, our approach can obtain optimal decomposition results efficiently. Our model-based solution can achieve a practical mask design which significantly improves the lithography quality on the wafer compared to the rule based decomposition.
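The standard graph-coloring formulation described above can be sketched directly. The point features, distance threshold, and greedy coloring order below are illustrative stand-ins for real polygon layouts and for the paper's optimal decomposition algorithms.

```python
from itertools import combinations

def build_conflict_graph(features, dmin):
    """Vertices are layout features (points standing in for polygons);
    an edge joins any pair closer than the coloring distance dmin."""
    edges = set()
    for (i, (xi, yi)), (j, (xj, yj)) in combinations(enumerate(features), 2):
        if (xi - xj) ** 2 + (yi - yj) ** 2 < dmin ** 2:
            edges.add((i, j))
    return edges

def greedy_k_color(n, edges, k):
    """Assign each vertex one of k masks (colors) so that no conflicting
    pair shares a mask; returns None if the greedy order gets stuck."""
    adj = {v: set() for v in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    colors = {}
    for v in range(n):
        used = {colors[u] for u in adj[v] if u in colors}
        free = [c for c in range(k) if c not in used]
        if not free:
            return None  # would need a stitch or an extra mask
        colors[v] = free[0]
    return colors

features = [(0, 0), (1, 0), (2, 0), (0, 3)]
edges = build_conflict_graph(features, dmin=1.5)
masks = greedy_k_color(len(features), edges, k=2)  # DPL: k = 2
```

The greedy pass shows the rule-based criterion in its purest form; the model-based approach of the paper replaces the fixed dmin edge rule with simulation-derived conflict information.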
Menshutkin, V V; Kazanskiĭ, A B; Levchenko, V F
2010-01-01
The history of the rise and development of evolutionary methods in the Saint Petersburg school of biological modelling is traced and analyzed. Some pioneering works in the simulation of ecological and evolutionary processes performed in the St. Petersburg school became exemplary for many followers in Russia and abroad. The individual-based approach became the crucial point in the history of the school, as an adequate instrument for constructing models of biological evolution. This approach is natural for simulating the evolution of life-history parameters and adaptive processes in populations and communities. In some cases, the simulated evolutionary process was used to solve an inverse problem, i.e., to estimate uncertain life-history parameters of a population. Evolutionary computation is one more application of this approach in a great many fields. The problems and prospects of ecological and evolutionary modelling in general are discussed.
Liu, Peter X.; Lai, Pinhua; Xu, Shaoping; Zou, Yanni
2018-01-01
To date, the majority of implemented virtual surgery simulation systems have been based on either a mesh or a meshless strategy for soft tissue modelling. To take full advantage of both mesh and meshless models, a novel coupled soft tissue cutting model is proposed. Specifically, the reconstructed virtual soft tissue consists of two essential components: one is a surface mesh that is convenient for surface rendering, and the other is a set of internal meshless point elements used to calculate the force feedback during cutting. To combine the two components in a seamless way, virtual points are introduced. During the simulation of cutting, a Bezier curve is used to characterize a smooth and vivid incision on the surface mesh. At the same time, the deformation of internal soft tissue caused by the cutting operation can be treated as displacements of the internal point elements. Furthermore, we discuss and prove the stability and convergence of the proposed approach theoretically. Real biomechanical tests verified the validity of the introduced model, and simulation experiments show that the proposed approach offers high computational efficiency and good visual effect, enabling cutting of soft tissue with high stability. PMID:29850006
Virtual commissioning of automated micro-optical assembly
NASA Astrophysics Data System (ADS)
Schlette, Christian; Losch, Daniel; Haag, Sebastian; Zontar, Daniel; Roßmann, Jürgen; Brecher, Christian
2015-02-01
In this contribution, we present a novel approach to enable virtual commissioning for process developers in micro-optical assembly. Our approach aims at supporting micro-optics experts in effectively developing assisted or fully automated assembly solutions without detailed prior experience in programming, while at the same time enabling them to easily implement their own libraries of expert schemes and algorithms for handling optical components. Virtual commissioning is enabled by a 3D simulation and visualization system in which the functionalities and properties of automated systems are modeled, simulated, and controlled based on multi-agent systems. For process development, our approach supports event-, state- and time-based visual programming techniques for the agents and allows for their kinematic motion simulation in combination with looped-in simulation results for the optical components. First results have been achieved by simply switching the agents to command the real hardware setup after successful process implementation and validation in the virtual environment. We evaluated and adapted our system to meet the requirements set by industrial partners: laser manufacturers as well as hardware suppliers of assembly platforms. The concept is applied to the automated assembly of optical components for optically pumped semiconductor lasers and the positioning of optical components for beam shaping.
Service-based analysis of biological pathways
Zheng, George; Bouguettaya, Athman
2009-01-01
Background Computer-based pathway discovery is concerned with two important objectives: pathway identification and analysis. Conventional mining and modeling approaches aimed at pathway discovery are often effective at achieving either objective, but not both. Such limitations can be effectively tackled by leveraging a Web service-based modeling and mining approach. Results Inspired by molecular recognition and drug discovery processes, we developed a Web service mining tool, named PathExplorer, to discover potentially interesting biological pathways linking service models of biological processes. The tool uses an innovative approach to identify useful pathways based on graph-based hints and service-based simulation verifying the user's hypotheses. Conclusion Web service modeling of biological processes allows the easy access and invocation of these processes on the Web. Web service mining techniques described in this paper enable the discovery of biological pathways linking these process service models. Algorithms presented in this paper for automatically highlighting interesting subgraphs within an identified pathway network enable users to formulate hypotheses, which can then be tested using our simulation algorithm, also described in this paper. PMID:19796403
Graphic and haptic simulation for transvaginal cholecystectomy training in NOTES.
Pan, Jun J; Ahn, Woojin; Dargar, Saurabh; Halic, Tansel; Li, Bai C; Sankaranarayanan, Ganesh; Roberts, Kurt; Schwaitzberg, Steven; De, Suvranu
2016-04-01
Natural Orifice Transluminal Endoscopic Surgery (NOTES) is an emerging surgical technique that usually requires a long learning curve for surgeons. Virtual reality (VR) medical simulators with visual and haptic feedback can offer an efficient and cost-effective alternative to traditional training approaches, without risk to patients. With this motivation, we developed the first virtual reality simulator for transvaginal cholecystectomy in NOTES (VTEST™). This VR-based surgical simulator aims to simulate the hybrid NOTES approach to cholecystectomy. We use a 6-DOF haptic device and a tracking sensor to construct the core hardware components of the simulator. For software, an innovative approach based on inner spheres is presented to deform the organs in real time. To handle the frequent collisions between soft tissue and surgical instruments, an adaptive GPU-based collision detection method is designed and implemented. To give a realistic visual rendering of gallbladder fat tissue removal by a cautery hook, a multi-layer hexahedral model is presented to simulate the electric dissection of fat tissue. Experimental results show that trainees can operate in real time with a high degree of stability and fidelity. A preliminary study was also performed to evaluate the realism and usefulness of this hybrid NOTES simulator. The prototype simulation system has been verified by surgeons through a pilot study, in which several aspects of its visual performance and utility were rated highly by the participants. It exhibits the potential to improve the surgical skills of trainees and effectively shorten their learning curve. Copyright © 2016 Elsevier Inc. All rights reserved.
Physics-based simulation models for EBSD: advances and challenges
NASA Astrophysics Data System (ADS)
Winkelmann, A.; Nolze, G.; Vos, M.; Salvat-Pujol, F.; Werner, W. S. M.
2016-02-01
EBSD has evolved into an effective tool for microstructure investigations in the scanning electron microscope. The purpose of this contribution is to give an overview of various simulation approaches for EBSD Kikuchi patterns and to discuss some of the underlying physical mechanisms.
CUBE: Information-optimized parallel cosmological N-body simulation code
NASA Astrophysics Data System (ADS)
Yu, Hao-Ran; Pen, Ue-Li; Wang, Xin
2018-05-01
CUBE, written in Coarray Fortran, is a particle-mesh based parallel cosmological N-body simulation code. The memory usage of CUBE can be as low as 6 bytes per particle. A particle-pairwise (PP) force, cosmological neutrinos, and a spherical overdensity (SO) halo finder are included.
Simulation-based robust optimization for signal timing and setting.
DOT National Transportation Integrated Search
2009-12-30
The performance of signal timing plans obtained from traditional approaches for pre-timed (fixed-time or actuated) control systems is often unstable under fluctuating traffic conditions. This report develops a general approach for optimizing the ...
Dynamic simulation of a reverse Brayton refrigerator
NASA Astrophysics Data System (ADS)
Peng, N.; Lei, L. L.; Xiong, L. Y.; Tang, J. C.; Dong, B.; Liu, L. Q.
2014-01-01
A test refrigerator based on a modified reverse Brayton cycle has recently been developed at the Chinese Academy of Sciences. To study the behavior of this test refrigerator, a dynamic simulation has been carried out. The numerical model comprises the typical components of the test refrigerator: compressor, valves, heat exchangers, expander, and heater. The simulator is based on an object-oriented approach, and each component is represented by a set of differential and algebraic equations. The control system of the test refrigerator is also simulated, which can be used to optimize the control strategies. This paper describes all the models and shows the simulation results. Comparisons between simulation results and experimental data are also presented; experimental validation on the test refrigerator gives satisfactory results.
Corrected simulations for one-dimensional diffusion processes with naturally occurring boundaries.
Shafiey, Hassan; Gan, Xinjun; Waxman, David
2017-11-01
To simulate a diffusion process, a usual approach is to discretize the time in the associated stochastic differential equation. This is the approach used in the Euler method. In the present work we consider a one-dimensional diffusion process where the terms occurring, within the stochastic differential equation, prevent the process entering a region. The outcome is a naturally occurring boundary (which may be absorbing or reflecting). A complication occurs in a simulation of this situation. The term involving a random variable, within the discretized stochastic differential equation, may take a trajectory across the boundary into a "forbidden region." The naive way of dealing with this problem, which we refer to as the "standard" approach, is simply to reset the trajectory to the boundary, based on the argument that crossing the boundary actually signifies achieving the boundary. In this work we show, within the framework of the Euler method, that such resetting introduces a spurious force into the original diffusion process. This force may have a significant influence on trajectories that come close to a boundary. We propose a corrected numerical scheme, for simulating one-dimensional diffusion processes with naturally occurring boundaries. This involves correcting the standard approach, so that an exact property of the diffusion process is precisely respected. As a consequence, the proposed scheme does not introduce a spurious force into the dynamics. We present numerical test cases, based on exactly soluble one-dimensional problems with one or two boundaries, which suggest that, for a given value of the discrete time step, the proposed scheme leads to substantially more accurate results than the standard approach. Alternatively, the standard approach needs considerably more computation time to obtain a comparable level of accuracy to the proposed scheme, because the standard approach requires a significantly smaller time step.
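The "standard" boundary-resetting scheme criticized above can be written down in a few lines of Euler-Maruyama code. The drift, noise amplitude, and boundary location below are illustrative; the paper's corrected scheme is more involved and is not reproduced here.

```python
import math
import random

def euler_with_reset(x0, drift, sigma, dt, n_steps, boundary=0.0, rng=random):
    """Euler--Maruyama simulation of dX = a(X) dt + sigma(X) dW with a lower
    boundary. The 'standard' fix shown here resets any trajectory that
    crosses into the forbidden region back to the boundary -- the very
    practice the paper shows introduces a spurious force near the boundary."""
    x = x0
    path = [x]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))          # Wiener increment
        x = x + drift(x) * dt + sigma(x) * dw       # discretized SDE step
        if x < boundary:                            # entered forbidden region
            x = boundary                            # naive reset to boundary
        path.append(x)
    return path

random.seed(0)
# Illustrative process: pull toward 1.0, boundary at 0 acting as reflecting.
path = euler_with_reset(x0=0.5, drift=lambda x: 1.0 - x,
                        sigma=lambda x: 0.5, dt=0.01, n_steps=1000)
```

The reset keeps every sample in the allowed region, which is exactly why the bias it introduces is easy to overlook.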
Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier
2009-01-01
The increasing capability of high-resolution airborne imaging sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is toward the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and fuzzy clustering. DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process, thanks to the annealing scheme. It outperforms the simple classifiers used in the combination as well as some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989
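A minimal sketch of annealing-based classifier combination follows. It uses a stochastic acceptance rule and invented two-class probability outputs, so it illustrates the energy-minimization idea rather than the paper's exact deterministic (mean-field) DSA formulation.

```python
import math
import random

def anneal_labels(p_bayes, p_fuzzy, neighbors, beta=1.0,
                  t0=2.0, cooling=0.95, n_sweeps=50, rng=random):
    """Toy combination of two classifiers by annealing. The energy sums a
    data term (disagreement with each classifier's class probabilities)
    and a smoothness term over neighboring sites; the annealing schedule
    lets early uphill moves escape local minima."""
    n = len(p_bayes)
    labels = [0] * n                               # two classes: 0 and 1

    def energy(lbls):
        e = 0.0
        for i, l in enumerate(lbls):
            e -= math.log(p_bayes[i][l]) + math.log(p_fuzzy[i][l])
        for i, j in neighbors:                     # penalize label changes
            e += beta * (lbls[i] != lbls[j])       # between neighbors
        return e

    t = t0
    for _ in range(n_sweeps):
        for i in range(n):
            new = labels[:]
            new[i] = 1 - labels[i]                 # propose a single flip
            de = energy(new) - energy(labels)
            # accept downhill always, uphill with annealed probability
            if de < 0 or rng.random() < math.exp(-de / t):
                labels = new
        t *= cooling
    return labels

random.seed(3)
p_bayes = [(0.9, 0.1), (0.8, 0.2), (0.2, 0.8), (0.1, 0.9)]
p_fuzzy = [(0.85, 0.15), (0.7, 0.3), (0.3, 0.7), (0.2, 0.8)]
labels = anneal_labels(p_bayes, p_fuzzy, neighbors=[(0, 1), (1, 2), (2, 3)])
```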
Okamoto, Takuma; Sakaguchi, Atsushi
2017-03-01
Generating acoustically bright and dark zones using loudspeakers is gaining attention as one of the most important acoustic communication techniques, with uses such as personal sound systems and multilingual guide services. Although most conventional methods are based on numerical solutions, an analytical approach based on the spatial Fourier transform with a linear loudspeaker array has been proposed, and its effectiveness relative to conventional acoustic energy difference maximization has been demonstrated in computer simulations. To establish the effectiveness of the proposal in actual environments, this paper experimentally validates the proposed approach with rectangular and Hann windows and compares it with three conventional methods: simple delay-and-sum beamforming, contrast maximization, and least squares-based pressure matching, using a linear array of 64 loudspeakers implemented in an anechoic chamber. The results of both the computer simulations and the actual experiments show that the proposed approach with a Hann window controls the bright and dark zones more accurately than the conventional methods.
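The simplest of the baseline methods, delay-and-sum beamforming, can be sketched directly: each loudspeaker is given a phase that cancels its propagation delay to a focus point, so contributions add coherently there. The array geometry, frequency, and free-field monopole model below are illustrative assumptions, not the paper's 64-loudspeaker setup.

```python
import cmath
import math

def delay_and_sum_weights(n_src, spacing, focus, c=343.0, f=1000.0):
    """Complex weights for a linear array on the x-axis: each source gets a
    phase advance k*r that cancels its propagation delay r/c to the focus."""
    k = 2 * math.pi * f / c                        # wavenumber at frequency f
    weights = []
    for i in range(n_src):
        x = (i - (n_src - 1) / 2) * spacing        # source position on x-axis
        r = math.hypot(focus[0] - x, focus[1])     # distance to focus point
        weights.append(cmath.exp(1j * k * r))
    return weights

def field_magnitude(weights, spacing, point, c=343.0, f=1000.0):
    """|pressure| at a point, summing free-field monopole contributions."""
    k = 2 * math.pi * f / c
    n_src = len(weights)
    p = 0j
    for i, w in enumerate(weights):
        x = (i - (n_src - 1) / 2) * spacing
        r = math.hypot(point[0] - x, point[1])
        p += w * cmath.exp(-1j * k * r) / r        # monopole propagation
    return abs(p)

w = delay_and_sum_weights(n_src=16, spacing=0.1, focus=(0.0, 2.0))
bright = field_magnitude(w, 0.1, (0.0, 2.0))       # on-focus point
dark = field_magnitude(w, 0.1, (2.0, 2.0))         # off to the side
```

At the focus the phases align and the field is large; away from it the contributions partially cancel, which is the bright/dark contrast the more sophisticated methods then optimize.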
Haji, Faizal A; Da Silva, Celina; Daigle, Delton T; Dubrowski, Adam
2014-08-01
Presently, health care simulation research is largely conducted on a study-by-study basis. Although such "project-based" research generates a plethora of evidence, it can be chaotic and contradictory. A move toward sustained, thematic, theory-based programs of research is necessary to advance knowledge in the field. Recognizing that simulation is a complex intervention, we present a framework for developing research programs in simulation-based education adapted from the Medical Research Council (MRC) guidance. This framework calls for an iterative approach to developing, refining, evaluating, and implementing simulation interventions. The adapted framework guidance emphasizes: (1) identification of theory and existing evidence; (2) modeling and piloting interventions to clarify active ingredients and identify mechanisms linking the context, intervention, and outcomes; and (3) evaluation of intervention processes and outcomes in both the laboratory and real-world setting. The proposed framework will aid simulation researchers in developing more robust interventions that optimize simulation-based education and advance our understanding of simulation pedagogy.
A functional language approach in high-speed digital simulation
NASA Technical Reports Server (NTRS)
Ercegovac, M. D.; Lu, S.-L.
1983-01-01
A functional programming approach for a multi-microprocessor architecture is presented. The language, based on Backus FP, its intermediate form and the translation process are discussed and illustrated with an example. The approach allows performance analysis to be performed at a high level as an aid in program partitioning.
Topping, Anne; Bøje, Rikke Buus; Rekola, Leena; Hartvigsen, Tina; Prescott, Stephen; Bland, Andrew; Hope, Angela; Haho, Paivi; Hannula, Leena
2015-11-01
This paper presents the results of a systemised rapid review and synthesis of the literature undertaken to identify competencies required by nurse educators to facilitate simulation-based learning (SBL). An international collaboration undertook a protocol-based search, retrieval and critical review of Web of Science, PubMed, CINAHL Plus, PsycInfo, ERIC, the Cochrane Library and Science Direct. The search was limited to articles published in English, 2002-2012. The search terms used were: nurse*, learn*, facilitator, simula*, lecturer, competence, skill*, qualificat*, educator, health care, "patient simulation", "nursing education" and "faculty". The search yielded 2156 hits; following a review of the abstracts, 72 full-text articles were extracted. These were screened against predetermined inclusion/exclusion criteria, and nine articles were retained. Following critical appraisal, the articles were analyzed using an inductive approach to extract statements for categorization and synthesis as competency statements. This review confirmed that there was only a modest amount of empirical evidence on which to base a competency framework. Those papers that provided descriptions of educator preparation identified simulation-based workshops, or experiential training, as the most common approaches for enhancing skills. SBL was not associated with any one theoretical perspective. Delivery of SBL appeared to demand competencies associated with planning and designing simulations, facilitating learning in "safe" environments, expert nursing knowledge based on credible clinical realism, reference to evidence-based knowledge, and demonstration of professional values and identity. This review derived a preliminary competency framework, which needs further development as a model for educators delivering SBL as part of nursing curricula. Copyright © 2015. Published by Elsevier Ltd.
Model-based surgical planning and simulation of cranial base surgery.
Abe, M; Tabuchi, K; Goto, M; Uchino, A
1998-11-01
Plastic skull models of seven individual patients were fabricated by stereolithography from three-dimensional data based on computed tomography bone images. Skull models were utilized for neurosurgical planning and simulation in the seven patients with cranial base lesions that were difficult to remove. Surgical approaches and areas of craniotomy were evaluated using the fabricated skull models. In preoperative simulations, hand-made models of the tumors, major vessels and nerves were placed in the skull models. Step-by-step simulation of surgical procedures was performed using actual surgical tools. The advantages of using skull models to plan and simulate cranial base surgery include a better understanding of anatomic relationships, preoperative evaluation of the proposed procedure, increased understanding by the patient and family, and improved educational experiences for residents and other medical staff. The disadvantages of using skull models include the time and cost of making the models. The skull models provide a more realistic tool that is easier to handle than computer-graphic images. Surgical simulation using models facilitates difficult cranial base surgery and may help reduce surgical complications.
NASA Astrophysics Data System (ADS)
Khodabakhshi, M.; Jafarpour, B.
2013-12-01
Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and the facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions, which can be computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach with a data feedback mechanism based on the sampling history. In this approach, after a short burn-in period in which unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples, to increase the acceptance rate and the efficiency of the conditioning.
This conditioning process can be viewed as an optimization approach where each new sample is proposed based on the sampling history to improve the data mismatch objective function. We extend the application of this adaptive conditioning approach to the case where multiple training images are proposed to describe the geologic scenario in a given formation. We discuss the advantages and limitations of the proposed adaptive conditioning scheme and use numerical experiments from fluvial channel formations to demonstrate its applicability and performance compared to non-adaptive conditioning techniques.
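The burn-in/accept/update loop described in this abstract can be sketched in a few lines of Python. Everything concrete here is a hypothetical stand-in: a binary facies vector replaces the MPS realization, and a simple fraction-of-cells mismatch replaces the flow-data misfit.

```python
import random

def simulate_facies(prob_map, rng):
    # Draw a binary facies realization cell by cell from the current
    # probability map (a stand-in for the MPS simulation step).
    return [1 if rng.random() < p else 0 for p in prob_map]

def adaptive_conditioning(observed, n_iter, burn_in, tol, seed=0):
    rng = random.Random(seed)
    n_cells = len(observed)
    prob_map = [0.5] * n_cells        # uninformative initial map
    accepted = []
    for it in range(n_iter):
        sample = simulate_facies(prob_map, rng)
        mismatch = sum(s != o for s, o in zip(sample, observed)) / n_cells
        if mismatch <= tol:           # acceptance/rejection test
            accepted.append(sample)
        if it >= burn_in and accepted:
            # Feed the sampling history back: each cell's probability is
            # its facies frequency in the chain of accepted samples.
            prob_map = [sum(col) / len(accepted) for col in zip(*accepted)]
    return prob_map, accepted
```

As in the paper's scheme, the acceptance rate rises as the probability map concentrates on the features shared by accepted samples.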
Traffic and Driving Simulator Based on Architecture of Interactive Motion
Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza
2015-01-01
This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid mesomicroscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination. PMID:26491711
NASA Astrophysics Data System (ADS)
Cui, Z.; Welty, C.; Maxwell, R. M.
2011-12-01
Lagrangian, particle-tracking models are commonly used to simulate solute advection and dispersion in aquifers. They are computationally efficient and suffer from much less numerical dispersion than grid-based techniques, especially in heterogeneous and advectively-dominated systems. Although particle-tracking models are capable of simulating geochemical reactions, these reactions are often simplified to first-order decay and/or linear, first-order kinetics. Nitrogen transport and transformation in aquifers involves both biodegradation and higher-order geochemical reactions. In order to take advantage of the particle-tracking approach, we have enhanced an existing particle-tracking code SLIM-FAST, to simulate nitrogen transport and transformation in aquifers. The approach we are taking is a hybrid one: the reactive multispecies transport process is operator split into two steps: (1) the physical movement of the particles including the attachment/detachment to solid surfaces, which is modeled by a Lagrangian random-walk algorithm; and (2) multispecies reactions including biodegradation are modeled by coupling multiple Monod equations with other geochemical reactions. The coupled reaction system is solved by an ordinary differential equation solver. In order to solve the coupled system of equations, after step 1, the particles are converted to grid-based concentrations based on the mass and position of the particles, and after step 2 the newly calculated concentration values are mapped back to particles. The enhanced particle-tracking code is capable of simulating subsurface nitrogen transport and transformation in a three-dimensional domain with variably saturated conditions. Potential application of the enhanced code is to simulate subsurface nitrogen loading to the Chesapeake Bay and its tributaries. 
Implementation details, verification results of the enhanced code with one-dimensional analytical solutions and other existing numerical models will be presented in addition to a discussion of implementation challenges.
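A minimal 1-D sketch of the operator-split scheme (the species, rate constants, and explicit Euler reaction update are illustrative stand-ins; the enhanced SLIM-FAST code is three-dimensional and couples multiple Monod equations through an ODE solver):

```python
import math
import random

def transport_step(positions, velocity, dispersion, dt, rng):
    # Step 1: Lagrangian random walk -- advective displacement plus a
    # Gaussian jump whose variance matches the dispersion coefficient.
    return [x + velocity * dt + rng.gauss(0.0, math.sqrt(2 * dispersion * dt))
            for x in positions]

def reaction_step(conc, dt, vmax=1.0, km=0.5):
    # Step 2: Monod-type degradation of a single species, integrated
    # here with an explicit Euler update in place of a full ODE solver.
    return max(conc - dt * vmax * conc / (km + conc), 0.0)

def particles_to_concentration(positions, x0, x1, n_bins, particle_mass):
    # Between the two steps, particle masses are binned onto a grid so
    # the reactions can operate on concentrations.
    dx = (x1 - x0) / n_bins
    conc = [0.0] * n_bins
    for x in positions:
        i = int((x - x0) / dx)
        if 0 <= i < n_bins:
            conc[i] += particle_mass / dx
    return conc
```

After the reaction step, the updated concentrations would be mapped back onto the particles, completing one split time step.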
Comparison of two head-up displays in simulated standard and noise abatement night visual approaches
NASA Technical Reports Server (NTRS)
Cronn, F.; Palmer, E. A., III
1975-01-01
Situation and command head-up displays were evaluated for both standard and two segment noise abatement night visual approaches in a fixed base simulation of a DC-8 transport aircraft. The situation display provided glide slope and pitch attitude information. The command display provided glide slope information and flight path commands to capture a 3 deg glide slope. Landing approaches were flown in both zero wind and wind shear conditions. For both standard and noise abatement approaches, the situation display provided greater glidepath accuracy in the initial phase of the landing approaches, whereas the command display was more effective in the final approach phase. Glidepath accuracy was greater for the standard approaches than for the noise abatement approaches in all phases of the landing approach. Most of the pilots preferred the command display and the standard approach. Substantial agreement was found between each pilot's judgment of his performance and his actual performance.
Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Zhang, He
2016-11-20
Angular velocity information is a requisite for a spacecraft guidance, navigation, and control system. In this paper, an approach for angular velocity estimation based merely on star vector measurement with an improved current statistical model Kalman filter is proposed. High-precision angular velocity estimation can be achieved under dynamic conditions. The amount of calculation is also reduced compared to a Kalman filter. Different trajectories are simulated to test this approach, and experiments with real starry sky observation are implemented for further confirmation. The estimation accuracy is proved to be better than 10^-4 rad/s under various conditions. Both the simulation and the experiment demonstrate that the described approach is effective and shows an excellent performance under both static and dynamic conditions.
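The geometric relation underlying star-vector-based rate estimation can be illustrated with a least-squares sketch (this does not reproduce the paper's improved current statistical model Kalman filter; it assumes exact star vectors and their body-frame rates are available):

```python
def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, 3):
            f = M[r][c] / M[c][c]
            for k in range(c, 4):
                M[r][k] -= f * M[c][k]
    w = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        w[r] = (M[r][3] - sum(M[r][k] * w[k] for k in range(r + 1, 3))) / M[r][r]
    return w

def angular_velocity_from_stars(stars, rates):
    # An inertially fixed unit star vector s drifts in the body frame as
    # ds/dt = s x w.  Stacking all stars gives the normal equations
    #   sum_k (I - s_k s_k^T) w = sum_k (ds_k/dt x s_k),
    # solvable once at least two non-collinear stars are tracked.
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for s, r in zip(stars, rates):
        c = cross(r, s)
        for i in range(3):
            b[i] += c[i]
            for j in range(3):
                A[i][j] += (1.0 if i == j else 0.0) - s[i] * s[j]
    return solve3(A, b)
```

A filter such as the paper's would then smooth these instantaneous estimates over time.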
Real-time simulation of biological soft tissues: a PGD approach.
Niroomandi, S; González, D; Alfaro, I; Bordeu, F; Leygue, A; Cueto, E; Chinesta, F
2013-05-01
We introduce here a novel approach for the numerical simulation of nonlinear, hyperelastic soft tissues at kilohertz feedback rates necessary for haptic rendering. This approach is based upon the use of proper generalized decomposition techniques, a generalization of PODs. Proper generalized decomposition techniques can be considered as a means of a priori model order reduction and provide a physics-based meta-model without the need for prior computer experiments. The suggested strategy is thus composed of an offline phase, in which a general meta-model is computed, and an online evaluation phase in which the results are obtained in real time. Results are provided that show the potential of the proposed technique, together with some benchmark tests that show the accuracy of the method. Copyright © 2013 John Wiley & Sons, Ltd.
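A toy separated-representation fit gives the flavor of the offline/online split. Note the hedge: genuine PGD enriches modes from the weak form of the PDE with no snapshot matrix; here, for illustration only, the modes are fitted to a known field U by alternating updates.

```python
def separated_modes(U, n_modes, n_sweeps=50):
    # Greedy separated representation u(x, p) ~ sum_i F_i(x) * G_i(p):
    # each enrichment step fits one space mode F and one parameter mode G
    # by alternating fixed-point updates, then deflates the residual.
    nx, np_ = len(U), len(U[0])
    R = [row[:] for row in U]
    modes = []
    for _ in range(n_modes):
        F, G = [1.0] * nx, [1.0] * np_
        for _ in range(n_sweeps):
            G = [sum(R[i][j] * F[i] for i in range(nx)) /
                 sum(f * f for f in F) for j in range(np_)]
            F = [sum(R[i][j] * G[j] for j in range(np_)) /
                 sum(g * g for g in G) for i in range(nx)]
        modes.append((F, G))
        for i in range(nx):
            for j in range(np_):
                R[i][j] -= F[i] * G[j]
    return modes

def evaluate(modes, i, j):
    # Online phase: summing a handful of precomputed modes is cheap
    # enough for kilohertz-rate haptic feedback loops.
    return sum(F[i] * G[j] for F, G in modes)
```

The offline phase (mode construction) is expensive; the online phase is a short sum, which is what makes real-time evaluation feasible.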
Conradsen, Isa; Beniczky, Sandor; Wolf, Peter; Terney, Daniella; Sams, Thomas; Sorensen, Helge B D
2009-01-01
Many epilepsy patients cannot call for help during a seizure, because they are unconscious or because of the impairment of their motor system or speech function. This can lead to injuries, medical complications and at worst death. An alarm system setting off at seizure onset could help to avoid hazards. Today no reliable alarm systems are available. A Multi-modal Intelligent Seizure Acquisition (MISA) system based on full body motion data seems to be a good approach towards detection of epileptic seizures. The system is the first to provide a full body description for epilepsy applications. Three test subjects were used for this pilot project. Each subject simulated 15 seizures and in addition performed some predefined normal activities, during a 4-hour monitoring with electromyography (EMG), accelerometer, magnetometer and gyroscope (AMG), electrocardiography (ECG), electroencephalography (EEG) and audio and video recording. The results showed that a non-subject specific MISA system developed on data from the modalities: accelerometer (ACM), gyroscope and EMG is able to detect 98% of the simulated seizures and at the same time mistakes only 4 of the normal movements for seizures. If the system is individualized (subject specific) it is able to detect all simulated seizures with a maximum of 1 false positive. Based on the results from the simulated seizures and normal movements the MISA system seems to be a promising approach to seizure detection.
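A single-axis energy detector captures the idea of flagging high-motion windows (the window length and threshold are hypothetical; the MISA system fuses EMG, accelerometer, gyroscope, and other modalities with a trained classifier):

```python
def motion_energy(samples):
    # Variance of the window: mean squared deviation from the window mean.
    m = sum(samples) / len(samples)
    return sum((s - m) ** 2 for s in samples) / len(samples)

def detect_events(signal, window, threshold):
    # Slide a non-overlapping window over a single motion channel and
    # flag any window whose energy exceeds the threshold as a candidate
    # seizure; one flag per window.
    flags = []
    for start in range(0, len(signal) - window + 1, window):
        w = signal[start:start + window]
        flags.append(motion_energy(w) > threshold)
    return flags
```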
NASA Astrophysics Data System (ADS)
Schöttl, Peter; Bern, Gregor; van Rooyen, De Wet; Heimsath, Anna; Fluri, Thomas; Nitz, Peter
2017-06-01
A transient simulation methodology for cavity receivers for Solar Tower Central Receiver Systems with molten salt as heat transfer fluid is described. Absorbed solar radiation is modeled with ray tracing and a sky discretization approach to reduce computational effort. Solar radiation re-distribution in the cavity as well as thermal radiation exchange are modeled based on view factors, which are also calculated with ray tracing. An analytical approach is used to represent convective heat transfer in the cavity. Heat transfer fluid flow is simulated with a discrete tube model, where the boundary conditions at the outer tube surface mainly depend on inputs from the previously mentioned modeling aspects. A specific focus is put on the integration of optical and thermo-hydraulic models. Furthermore, aiming point and control strategies are described, which are used during the transient performance assessment. Eventually, the developed simulation methodology is used for the optimization of the aperture opening size of a PS10-like reference scenario with cavity receiver and heliostat field. The objective function is based on the cumulative gain of one representative day. Results include optimized aperture opening size, transient receiver characteristics and benefits of the implemented aiming point strategy compared to a single aiming point approach. Future work will include annual simulations, cost assessment and optimization of a larger range of receiver parameters.
Toward economic flood loss characterization via hazard simulation
NASA Astrophysics Data System (ADS)
Czajkowski, Jeffrey; Cunha, Luciana K.; Michel-Kerjan, Erwann; Smith, James A.
2016-08-01
Among all natural disasters, floods have historically been the primary cause of human and economic losses around the world. Improving flood risk management requires a multi-scale characterization of the hazard and associated losses—the flood loss footprint. But this is typically not yet available in a precise and timely manner. To overcome this challenge, we propose a novel and multidisciplinary approach which relies on a computationally efficient hydrological model that simulates streamflow for scales ranging from small creeks to large rivers. We adopt a normalized index, the flood peak ratio (FPR), to characterize flood magnitude across multiple spatial scales. The simulated FPR is then shown to be a key statistical driver for associated economic flood losses represented by the number of insurance claims. Importantly, because it is based on a simulation procedure that utilizes generally readily available physically-based data, our flood simulation approach has the potential to be broadly utilized, even for ungauged and poorly gauged basins, thus providing the necessary information for public and private sector actors to effectively reduce flood losses and save lives.
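The FPR itself is a one-line computation once a reference discharge is chosen (the reference value and the severity bucketing below are hypothetical; the paper relates FPR to insurance-claim counts statistically):

```python
def flood_peak_ratio(hydrograph, reference_peak):
    # Normalize the simulated event peak by a basin-specific reference
    # discharge so magnitudes compare across creek and river scales.
    return max(hydrograph) / reference_peak

def severity_bucket(fpr, breakpoints=(0.5, 1.0, 2.0)):
    # Toy severity bucketing of the FPR (0 = minor ... 3 = extreme).
    return sum(fpr >= b for b in breakpoints)
```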
Peter, Emanuel K; Shea, Joan-Emma; Pivkin, Igor V
2016-05-14
In this paper, we present a coarse replica exchange molecular dynamics (REMD) approach, based on kinetic Monte Carlo (kMC). The new development can significantly reduce the number of replicas and the computational cost needed to enhance sampling in protein simulations. We introduce 2 different methods which primarily differ in the exchange scheme between the parallel ensembles. We apply this approach to the folding of 2 different β-stranded peptides: the C-terminal β-hairpin fragment of GB1 and TrpZip4. Additionally, we use the new simulation technique to study the folding of TrpCage, a small fast folding α-helical peptide. Subsequently, we apply the new methodology to conformation changes in signaling of the light-oxygen voltage (LOV) sensitive domain from Avena sativa (AsLOV2). Our results agree well with data reported in the literature. In simulations of dialanine, we compare the statistical sampling of the 2 techniques with conventional REMD and analyze their performance. The new techniques can reduce the computational cost of REMD significantly and can be used in enhanced sampling simulations of biomolecules.
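For orientation, the standard replica-exchange acceptance rule that any REMD variant builds on can be written as follows (the paper's kMC-driven scheme differs in how swap attempts are scheduled, not shown here):

```python
import math
import random

def attempt_swap(e1, e2, beta1, beta2, rng):
    # Standard replica-exchange (parallel-tempering) rule: swap the
    # configurations of two replicas with probability
    # min(1, exp((beta1 - beta2) * (e1 - e2))).
    delta = (beta1 - beta2) * (e1 - e2)
    return delta >= 0.0 or rng.random() < math.exp(delta)
```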
García-Magariño, Iván; Lacuesta, Raquel; Lloret, Jaime
2018-03-27
Smart communication protocols are becoming a key mechanism for improving communication performance in networks such as wireless sensor networks. However, the literature lacks mechanisms for simulating smart communication protocols in precision agriculture for decreasing production costs. In this context, the current work presents an agent-based simulator of smart communication protocols for efficiently managing pesticides. The simulator considers the needs of electric power, crop health, percentage of alive bugs and pesticide consumption. The current approach is illustrated with three different communication protocols respectively called (a) broadcast, (b) neighbor and (c) low-cost neighbor. The low-cost neighbor protocol obtained a statistically-significant reduction in the need of electric power over the neighbor protocol, with a very large difference according to the common interpretations about the Cohen's d effect size. The presented simulator is called ABS-SmartComAgri and is freely distributed as open-source from a public research data repository. It ensures the reproducibility of experiments and allows other researchers to extend the current approach. PMID:29584703
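The cost contrast between the broadcast and neighbor protocols can be sketched by counting messages (a deliberate simplification; ABS-SmartComAgri also tracks crop health, bug mortality, and pesticide consumption):

```python
def broadcast_messages(n_nodes):
    # Broadcast protocol: every sensor node relays its reading to every
    # other node in the field.
    return n_nodes * (n_nodes - 1)

def neighbor_messages(adjacency):
    # Neighbor protocol: nodes only message adjacent plots, which is the
    # kind of saving behind the reduced electric-power need.
    return sum(len(neigh) for neigh in adjacency.values())
```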
Towards inverse modeling of turbidity currents: The inverse lock-exchange problem
NASA Astrophysics Data System (ADS)
Lesshafft, Lutz; Meiburg, Eckart; Kneller, Ben; Marsden, Alison
2011-04-01
A new approach is introduced for turbidite modeling, leveraging the potential of computational fluid dynamics methods to simulate the flow processes that led to turbidite formation. The practical use of numerical flow simulation for the purpose of turbidite modeling so far is hindered by the need to specify parameters and initial flow conditions that are a priori unknown. The present study proposes a method to determine optimal simulation parameters via an automated optimization process. An iterative procedure matches deposit predictions from successive flow simulations against available localized reference data, as in practice may be obtained from well logs, and aims at convergence towards the best-fit scenario. The final result is a prediction of the entire deposit thickness and local grain size distribution. The optimization strategy is based on a derivative-free, surrogate-based technique. Direct numerical simulations are performed to compute the flow dynamics. A proof of concept is successfully conducted for the simple test case of a two-dimensional lock-exchange turbidity current. The optimization approach is demonstrated to accurately retrieve the initial conditions used in a reference calculation.
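A 1-D caricature of the derivative-free, surrogate-based loop: fit a quadratic surrogate to the best samples, evaluate the expensive forward model at its minimum, repeat (here a closed-form objective stands in for a direct numerical simulation of the turbidity current):

```python
def quad_vertex(p1, p2, p3):
    # Minimum of the parabola interpolating three (x, f) samples.
    (x1, f1), (x2, f2), (x3, f3) = p1, p2, p3
    num = f1 * (x2**2 - x3**2) + f2 * (x3**2 - x1**2) + f3 * (x1**2 - x2**2)
    den = 2.0 * (f1 * (x2 - x3) + f2 * (x3 - x1) + f3 * (x1 - x2))
    return num / den

def surrogate_search(objective, samples, n_iter=8):
    # Surrogate-based loop: each objective call stands for one flow
    # simulation whose deposit is matched against the reference data.
    pts = [(x, objective(x)) for x in samples]
    for _ in range(n_iter):
        pts.sort(key=lambda t: t[1])
        x_new = quad_vertex(*pts[:3])
        if any(abs(x_new - x) < 1e-9 for x, _ in pts):
            break          # surrogate minimum already sampled: converged
        pts.append((x_new, objective(x_new)))
    pts.sort(key=lambda t: t[1])
    return pts[0][0]
```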
Hierarchical Approach to 'Atomistic' 3-D MOSFET Simulation
NASA Technical Reports Server (NTRS)
Asenov, Asen; Brown, Andrew R.; Davies, John H.; Saini, Subhash
1999-01-01
We present a hierarchical approach to the 'atomistic' simulation of aggressively scaled sub-0.1 micron MOSFET's. These devices are so small that their characteristics depend on the precise location of dopant atoms within them, not just on their average density. A full-scale three-dimensional drift-diffusion atomistic simulation approach is first described and used to verify more economical, but restricted, options. To reduce processor time and memory requirements at high drain voltage, we have developed a self-consistent option based on a solution of the current continuity equation restricted to a thin slab of the channel. This is coupled to the solution of the Poisson equation in the whole simulation domain in the Gummel iteration cycles. The accuracy of this approach is investigated in comparison to the full self-consistent solution. At low drain voltage, a single solution of the nonlinear Poisson equation is sufficient to extract the current with satisfactory accuracy. In this case, the current is calculated by solving the current continuity equation in a drift approximation only, also in a thin slab containing the MOSFET channel. The regions of applicability for the different components of this hierarchical approach are illustrated in example simulations covering the random dopant-induced threshold voltage fluctuations, threshold voltage lowering, threshold voltage asymmetry, and drain current fluctuations.
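The randomness that drives these fluctuations is easy to illustrate: sample a Poisson dopant count per device and map it to a threshold-voltage shift (the linear response below is a hypothetical stand-in for the drift-diffusion solution):

```python
import math
import random

def poisson(lam, rng):
    # Knuth's method: count uniform draws until their product drops
    # below exp(-lam).
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def threshold_spread(mean_dopants, sensitivity, n_devices, seed=0):
    # Monte Carlo over nominally identical devices: each gets a Poisson
    # dopant count, and the threshold-voltage shift is taken linear in
    # the relative dopant-number fluctuation.
    rng = random.Random(seed)
    shifts = [sensitivity * (poisson(mean_dopants, rng) - mean_dopants)
              / mean_dopants for _ in range(n_devices)]
    mean = sum(shifts) / n_devices
    sigma = (sum((s - mean) ** 2 for s in shifts) / n_devices) ** 0.5
    return mean, sigma
```

Because the Poisson standard deviation grows only as the square root of the count, the relative fluctuation (and hence the spread) worsens as devices shrink.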
Remote Learning for the Manipulation and Control of Robotic Cells
ERIC Educational Resources Information Center
Goldstain, Ofir; Ben-Gal, Irad; Bukchin, Yossi
2007-01-01
This work proposes an approach to remote learning of robotic cells based on internet and simulation tools. The proposed approach, which integrates remote-learning and tele-operation into a generic scheme, is designed to enable students and developers to set-up and manipulate a robotic cell remotely. Its implementation is based on a dedicated…
NASA Astrophysics Data System (ADS)
Leskiw, Donald M.; Zhau, Junmei
2000-06-01
This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
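In scalar form, the Kalman update equations referenced above look like this (p tracks model accuracy and h plays the role of the Measurement Model mapping between levels of detail):

```python
def kalman_update(x_pred, p_pred, z, h, r):
    # Scalar Kalman measurement update: the gain weighs model accuracy
    # (p_pred) against measurement noise (r); h maps the state into the
    # measurement's level of detail.
    gain = p_pred * h / (h * h * p_pred + r)
    x = x_pred + gain * (z - h * x_pred)
    p = (1.0 - gain * h) * p_pred
    return x, p
```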
Berti, Claudio; Gillespie, Dirk; Eisenberg, Robert S; Fiegna, Claudio
2012-02-16
The fast and accurate computation of the electric forces that drive the motion of charged particles at the nanometer scale represents a computational challenge. For this kind of system, where the discrete nature of the charges cannot be neglected, boundary element methods (BEM) represent a better approach than finite differences/finite elements methods. In this article, we compare two different BEM approaches to a canonical electrostatic problem in a three-dimensional space with inhomogeneous dielectrics, emphasizing their suitability for particle-based simulations: the iterative method proposed by Hoyles et al. and the Induced Charge Computation introduced by Boda et al. PMID:22338640
NASA Technical Reports Server (NTRS)
Adeleye, Sanya; Chung, Christopher
2006-01-01
Commercial aircraft undergo a significant number of maintenance and logistical activities during the turnaround operation at the departure gate. By analyzing the sequencing of these activities, more effective turnaround contingency plans may be developed for logistical and maintenance disruptions. Turnaround contingency plans are particularly important as any kind of delay in a hub based system may cascade into further delays with subsequent connections. The contingency sequencing of the maintenance and logistical turnaround activities were analyzed using a combined network and computer simulation modeling approach. Experimental analysis of both current and alternative policies provides a framework to aid in more effective tactical decision making.
Schaefer, David R; Adams, Jimi; Haas, Steven A
2013-10-01
Adolescent smoking and friendship networks are related in many ways that can amplify smoking prevalence. Understanding and developing interventions within such a complex system requires new analytic approaches. We draw on recent advances in dynamic network modeling to develop a technique that explores the implications of various intervention strategies targeted toward micro-level processes. Our approach begins by estimating a stochastic actor-based model using data from one school in the National Longitudinal Study of Adolescent Health. The model provides estimates of several factors predicting friendship ties and smoking behavior. We then use estimated model parameters to simulate the coevolution of friendship and smoking behavior under potential intervention scenarios. Namely, we manipulate the strength of peer influence on smoking and the popularity of smokers relative to nonsmokers. We measure how these manipulations affect smoking prevalence, smoking initiation, and smoking cessation. Results indicate that both peer influence and smoking-based popularity affect smoking behavior and that their joint effects are nonlinear. This study demonstrates how a simulation-based approach can be used to explore alternative scenarios that may be achievable through intervention efforts and offers new hypotheses about the association between friendship and smoking. PMID:24084397
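The micro-level update loop at the heart of such a simulation can be sketched as follows (a deliberately stripped-down caricature; the stochastic actor-based model also updates friendship ties and uses estimated parameters rather than a single influence knob):

```python
import random

def simulate_smoking(friends, smokers, influence, n_steps, seed=0):
    # Each step, a random adolescent re-evaluates smoking based on the
    # fraction of friends who smoke, scaled by the peer-influence
    # strength being manipulated in the experiments.
    rng = random.Random(seed)
    smokers = set(smokers)
    agents = list(friends)
    for _ in range(n_steps):
        a = rng.choice(agents)
        if not friends[a]:
            continue
        frac = sum(f in smokers for f in friends[a]) / len(friends[a])
        if a in smokers:
            if rng.random() < influence * (1.0 - frac):
                smokers.discard(a)   # cessation, likelier among nonsmoking friends
        elif rng.random() < influence * frac:
            smokers.add(a)           # initiation via peer influence
    return smokers
```

Sweeping `influence` (or re-weighting smokers' popularity when forming ties) and recording prevalence is the simulation experiment the abstract describes.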
NASA Astrophysics Data System (ADS)
Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin
2012-08-01
Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise a HPS platform. This research is driven by the issues concerning large-scale simulation resources deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources and highly reliable simulation with fault tolerance. A framework of virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resources modelling, a method for automatically deploying simulation resources for dynamic construction of system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction and (3) achieve fault-tolerant simulation.
Rupture Dynamics and Seismic Radiation on Rough Faults for Simulation-Based PSHA
NASA Astrophysics Data System (ADS)
Mai, P. M.; Galis, M.; Thingbaijam, K. K. S.; Vyas, J. C.; Dunham, E. M.
2017-12-01
Simulation-based ground-motion predictions may augment PSHA studies in data-poor regions or provide additional shaking estimations, including seismic waveforms, for critical facilities. Validation and calibration of such simulation approaches, based on observations and GMPEs, are important for engineering applications, while seismologists push to include the precise physics of the earthquake rupture process and seismic wave propagation in 3D heterogeneous Earth. Geological faults comprise both large-scale segmentation and small-scale roughness that determine the dynamics of the earthquake rupture process and its radiated seismic wavefield. We investigate how different parameterizations of fractal fault roughness affect the rupture evolution and resulting near-fault ground motions. Rupture incoherence induced by fault roughness generates a realistic ω^-2 decay for high-frequency displacement amplitude spectra. Waveform characteristics and GMPE-based comparisons corroborate that these rough-fault rupture simulations generate realistic synthetic seismograms for subsequent engineering application. Since dynamic rupture simulations are computationally expensive, we develop kinematic approximations that emulate the observed dynamics. Simplifying the rough-fault geometry, we find that perturbations in local moment tensor orientation are important, while perturbations in local source location are not. Thus, a planar fault can be assumed if the local strike, dip, and rake are maintained. The dynamic rake angle variations are anti-correlated with local dip angles. Based on a dynamically consistent Yoffe source-time function, we show that the seismic wavefield of the approximated kinematic rupture well reproduces the seismic radiation of the full dynamic source process. Our findings provide an innovative pseudo-dynamic source characterization that captures fault roughness effects on rupture dynamics.
Including the correlations between kinematic source parameters, we present a new pseudo-dynamic rupture modeling approach for computing broadband ground-motion time-histories for simulation-based PSHA.
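A 1-D flavor of the fractal roughness used in these simulations can be generated by spectral synthesis (the exponent convention and amplitude are illustrative; the actual studies use band-limited 2-D self-similar fault surfaces):

```python
import math
import random

def fractal_roughness(n, hurst, amplitude, seed=0):
    # Spectral synthesis of a self-affine 1-D profile: random-phase
    # cosine modes whose amplitudes fall off as k**-(hurst + 0.5), a 1-D
    # counterpart of a fractal fault-surface spectrum.
    rng = random.Random(seed)
    profile = [0.0] * n
    for k in range(1, n // 2):
        amp = amplitude * k ** -(hurst + 0.5)
        phase = rng.uniform(0.0, 2.0 * math.pi)
        for i in range(n):
            profile[i] += amp * math.cos(2.0 * math.pi * k * i / n + phase)
    return profile
```

Perturbing a planar fault's local strike, dip, and rake according to such a profile is the kind of simplification the kinematic approximation above exploits.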
King, Gillian; Shepherd, Tracy A; Servais, Michelle; Willoughby, Colleen; Bolack, Linda; Strachan, Deborah; Moodie, Sheila; Baldwin, Patricia; Knickle, Kerry; Parker, Kathryn; Savage, Diane; McNaughton, Nancy
2016-10-01
This article describes the creation and validation of six simulations concerned with effective listening and interpersonal communication in pediatric rehabilitation. The simulations involved clinicians from various disciplines, were based on clinical scenarios related to client issues, and reflected core aspects of listening/communication. Each simulation had a key learning objective, thus focusing clinicians on specific listening skills. The article outlines the process used to turn written scenarios into digital video simulations, including steps taken to establish content validity and authenticity, and to establish a series of videos based on the complexity of their learning objectives, given contextual factors and associated macrocognitive processes that influence the ability to listen. A complexity rating scale was developed and used to establish a gradient of easy/simple, intermediate, and hard/complex simulations. The development process exemplifies an evidence-based, integrated knowledge translation approach to the teaching and learning of listening and communication skills.
Investigation of Asymmetric Thrust Detection with Demonstration in a Real-Time Simulation Testbed
NASA Technical Reports Server (NTRS)
Chicatelli, Amy; Rinehart, Aidan W.; Sowers, T. Shane; Simon, Donald L.
2015-01-01
The purpose of this effort is to develop, demonstrate, and evaluate three asymmetric thrust detection approaches to aid in the reduction of asymmetric thrust-induced aviation accidents. This paper presents the results from that effort and their evaluation in simulation studies, including those from a real-time flight simulation testbed. Asymmetric thrust is recognized as a contributing factor in several Propulsion System Malfunction plus Inappropriate Crew Response (PSM+ICR) aviation accidents. As an improvement over the state-of-the-art, providing annunciation of asymmetric thrust to alert the crew may hold safety benefits. For this, the reliable detection and confirmation of asymmetric thrust conditions is required. For this work, three asymmetric thrust detection methods are presented along with their results obtained through simulation studies. Representative asymmetric thrust conditions are modeled in simulation based on failure scenarios similar to those reported in aviation incident and accident descriptions. These simulated asymmetric thrust scenarios, combined with actual aircraft operational flight data, are then used to conduct a sensitivity study regarding the detection capabilities of the three methods. Additional evaluation results are presented based on pilot-in-the-loop simulation studies conducted in the NASA Glenn Research Center (GRC) flight simulation testbed. Data obtained from this flight simulation facility are used to further evaluate the effectiveness and accuracy of the asymmetric thrust detection approaches. Generally, the asymmetric thrust conditions are correctly detected and confirmed.
Investigation of Asymmetric Thrust Detection with Demonstration in a Real-Time Simulation Testbed
NASA Technical Reports Server (NTRS)
Chicatelli, Amy K.; Rinehart, Aidan W.; Sowers, T. Shane; Simon, Donald L.
2016-01-01
The purpose of this effort is to develop, demonstrate, and evaluate three asymmetric thrust detection approaches to aid in the reduction of asymmetric thrust-induced aviation accidents. This paper presents the results from that effort and their evaluation in simulation studies, including those from a real-time flight simulation testbed. Asymmetric thrust is recognized as a contributing factor in several Propulsion System Malfunction plus Inappropriate Crew Response (PSM+ICR) aviation accidents. As an improvement over the state-of-the-art, providing annunciation of asymmetric thrust to alert the crew may hold safety benefits. For this, the reliable detection and confirmation of asymmetric thrust conditions is required. For this work, three asymmetric thrust detection methods are presented along with their results obtained through simulation studies. Representative asymmetric thrust conditions are modeled in simulation based on failure scenarios similar to those reported in aviation incident and accident descriptions. These simulated asymmetric thrust scenarios, combined with actual aircraft operational flight data, are then used to conduct a sensitivity study regarding the detection capabilities of the three methods. Additional evaluation results are presented based on pilot-in-the-loop simulation studies conducted in the NASA Glenn Research Center (GRC) flight simulation testbed. Data obtained from this flight simulation facility are used to further evaluate the effectiveness and accuracy of the asymmetric thrust detection approaches. Generally, the asymmetric thrust conditions are correctly detected and confirmed.
ERIC Educational Resources Information Center
Rooney, Donna; Hopwood, Nick; Boud, David; Kelly, Michelle
2015-01-01
The preparation of future professionals for practice is a key focus of higher education institutions. Among a range of approaches is the use of simulation pedagogies. While simulation is often justified as a direct bridge between higher education and professional practice, this paper questions this easy assumption. It develops a conceptually…
System Engineering Approach to Assessing Integrated Survivability
2009-08-01
based response for the above engagements using LS-DYNA for blast modelling, MADYMO for safety and human response, CFD software (Fluent) is used to...Simulation JFAS Joint Force Analysis Simulation JANUS Joint Army Navy Uniform Simulation LS-DYNA Livermore Software-Dynamics MADYMO...management technologies. The "don't be killed" layer of survivability protection accounts for many of the mitigation technologies (i.e. blast
Potter, Julie Elizabeth; Gatward, Jonathan J; Kelly, Michelle A; McKay, Leigh; McCann, Ellie; Elliott, Rosalind M; Perry, Lin
2017-12-01
The approach, communication skills, and confidence of clinicians responsible for raising the option of deceased organ donation may influence families' donation decisions. The aim of this study was to increase the preparedness and confidence of intensive care clinicians allocated to work in a "designated requester" role. We conducted a posttest evaluation of an innovative simulation-based training program. Simulation-based training enabled clinicians to rehearse the "balanced approach" to family donation conversations (FDCs) in the designated requester role. Professional actors played family members in simulated clinical settings using authentic scenarios, with video-assisted reflective debriefing. Participants completed an evaluation after the workshop. Simple descriptive statistical analysis and content analysis were performed. Between January 2013 and July 2015, 25 workshops were undertaken with 86 participants; 82 (95.3%) returned evaluations. Respondents were registered practicing clinicians; over half (44/82; 53.7%) were intensivists. Most attended a single workshop. Evaluations were overwhelmingly positive, with the majority rating workshops as outstanding (64/80; 80%). Scenario fidelity, competence of the actors, opportunity to practice and receive feedback on performance, and feedback from actors, both in and out of character, were particularly valued. Most (76/78; 97.4%) reported feeling more confident about their designated requester role. Simulation-based communication training for the designated requester role in FDCs increased the knowledge and confidence of clinicians to raise the topic of donation.
Ren, Li-Hong; Ding, Yong-Sheng; Shen, Yi-Zhen; Zhang, Xiang-Feng
2008-10-01
Recently, a collective effort from multiple research areas has been made to understand biological systems at the system level. This research requires the ability to simulate particular biological systems such as cells, organs, organisms, and communities. In this paper, a novel bio-network simulation platform is proposed for systems biology studies by combining agent-based approaches. We consider a biological system as a set of active computational components interacting with each other and with an external environment. Then, we propose a bio-network platform for simulating the behaviors of biological systems and modelling them in terms of bio-entities and society-entities. As a demonstration, we discuss how a protein-protein interaction (PPI) network can be seen as a society of autonomous interactive components. From interactions among small PPI networks, a large PPI network can emerge that has a remarkable ability to accomplish a complex function or task. We also simulate the evolution of the PPI networks by using the bio-operators of the bio-entities. Based on the proposed approach, various simulators with different functions can be embedded in the simulation platform, and further research can be done from design to development, including complexity validation of the biological system.
Real-time simulation of contact and cutting of heterogeneous soft-tissues.
Courtecuisse, Hadrien; Allard, Jérémie; Kerfriden, Pierre; Bordas, Stéphane P A; Cotin, Stéphane; Duriez, Christian
2014-02-01
This paper presents a numerical method for interactive (real-time) simulations, which considerably improves the accuracy of the response of heterogeneous soft-tissue models undergoing contact, cutting and other topological changes. We provide an integrated methodology able to deal with the ill-conditioning issues associated with material heterogeneities, with contact boundary conditions, which are one of the main sources of inaccuracies, and with cutting, which is one of the most challenging issues in interactive simulations. Our approach is based on an implicit time integration of a non-linear finite element model. To enable real-time computations, we propose a new preconditioning technique, based on an asynchronous update at low frequency. The preconditioner is not only used to improve the computation of the deformation of the tissues, but also to simulate the contact response of homogeneous and heterogeneous bodies with the same accuracy. We also address the problem of cutting the heterogeneous structures and propose a method to update the preconditioner according to the topological modifications. Finally, we apply our approach to three challenging demonstrators: (i) a simulation of cataract surgery; (ii) a simulation of laparoscopic hepatectomy; and (iii) a brain tumor surgery simulation. Copyright © 2013 Elsevier B.V. All rights reserved.
Wen, Tingxi; Medveczky, David; Wu, Jackie; Wu, Jianhuang
2018-01-25
Colonoscopy plays an important role in the clinical screening and management of colorectal cancer. The traditional 'see one, do one, teach one' training style for such an invasive procedure is resource intensive and ineffective. Given that colonoscopy is difficult and time-consuming to master, the use of virtual reality simulators to train gastroenterologists in colonoscopy operations offers a promising alternative. In this paper, a realistic and real-time interactive simulator for training the colonoscopy procedure is presented, which can even include polypectomy simulation. Our approach models the colonoscope as a thick flexible elastic rod with different resolutions that adapt dynamically to the curvature of the colon. Additional material characteristics of this deformable material are integrated into our discrete model to realistically simulate the behavior of the colonoscope. In addition, we propose a set of key aspects of our simulator that give fast, high fidelity feedback to trainees. We also conducted an initial validation of this colonoscopic simulator to determine its clinical utility and efficacy.
2016-09-07
approach in co-simulation with fluid-dynamics solvers is used. An original variational formulation is developed for the inverse problem of...by the inverse solution meshing. The same approach is used to map the structural and fluid interface kinematics and loads during the fluid-structure...co-simulation. The inverse analysis is verified by reconstructing the deformed solution obtained with a corresponding direct formulation, based on
NASA Astrophysics Data System (ADS)
Eriksen, Trygve E.; Shoesmith, David W.; Jonsson, Mats
2012-01-01
Radiation induced dissolution of uranium dioxide (UO2) nuclear fuel and the consequent release of radionuclides to intruding groundwater are key processes in the safety analysis of future deep geological repositories for spent nuclear fuel. For several decades, these processes have been studied experimentally using both spent fuel and various types of simulated spent fuels. The latter have been employed since it is difficult to draw mechanistic conclusions from real spent nuclear fuel experiments. Several predictive modelling approaches have been developed over the last two decades. These models are largely based on experimental observations. In this work we have performed a critical review of the modelling approaches developed based on the large body of chemical and electrochemical experimental data. The main conclusions are: (1) the use of measured interfacial rate constants gives results in generally good agreement with experimental results compared to simulations where homogeneous rate constants are used; (2) the use of spatial dose rate distributions is particularly important when simulating the behaviour over short time periods; and (3) the steady-state approach (the rate of oxidant consumption is equal to the rate of oxidant production) provides a simple but fairly accurate alternative, but errors in the reaction mechanism and in the kinetic parameters used may not be revealed by simple benchmarking. It is essential to use experimentally determined rate constants and verified reaction mechanisms, irrespective of whether the approach is chemical or electrochemical.
Single pilot scanning behavior in simulated instrument flight
NASA Technical Reports Server (NTRS)
Pennington, J. E.
1979-01-01
A simulation of tasks associated with single-pilot general aviation flight under instrument flight rules was conducted as a baseline for future research studies on advanced flight controls and avionics. The tasks, ranging from simple climbs and turns to an instrument landing system approach, were flown on a fixed-base simulator. During the simulation the control inputs, state variables, and the pilot's visual scan pattern, including point of regard, were measured and recorded.
Shear Elasticity and Shear Viscosity Imaging in Soft Tissue
NASA Astrophysics Data System (ADS)
Yang, Yiqun
In this thesis, a new approach is introduced that provides estimates of shear elasticity and shear viscosity using time-domain measurements of shear waves in viscoelastic media. Simulations of shear wave particle displacements induced by an acoustic radiation force are accelerated significantly by a GPU. The acoustic radiation force is first calculated using the fast near field method (FNM) and the angular spectrum approach (ASA). The shear waves induced by the acoustic radiation force are then simulated in elastic and viscoelastic media using Green's functions. A parallel algorithm is developed to perform these calculations on a GPU, where the shear wave particle displacements at different observation points are calculated in parallel. The resulting speed increase enables rapid evaluation of shear waves at discrete points, in 2D planes, and for push beams with different spatial samplings and for different values of the f-number (f/#). The results of these simulations show that push beams with smaller f/# require a higher spatial sampling rate. The significant amount of acceleration achieved by this approach suggests that shear wave simulations with the Green's function approach are ideally suited for high-performance GPUs. Shear wave elasticity imaging determines the mechanical parameters of soft tissue by analyzing measured shear waves induced by an acoustic radiation force. To estimate the shear elasticity value, the widely used time-of-flight (TOF) method calculates the correlation between shear wave particle velocities at adjacent lateral observation points. Although this method provides accurate estimates of the shear elasticity in purely elastic media, our experience suggests that the TOF method consistently overestimates the shear elasticity values in viscoelastic media because the combined effects of diffraction, attenuation, and dispersion are not considered.
To address this problem, we have developed an approach that directly accounts for all of these effects when estimating the shear elasticity. This new approach simulates shear wave particle velocities using a Green's function-based approach for the Voigt model, where the shear elasticity and viscosity values are estimated using an optimization-based approach that compares measured shear wave particle velocities with simulated shear wave particle velocities in the time-domain. The results are evaluated on a point-by-point basis to generate images. There is good agreement between the simulated and measured shear wave particle velocities, where the new approach yields much better images of the shear elasticity and shear viscosity than the TOF method. The new estimation approach is accelerated with an approximate viscoelastic Green's function model that is evaluated with shear wave data obtained from in vivo human livers. Instead of calculating shear waves with combinations of different shear elasticities and shear viscosities, shear waves are calculated with different shear elasticities on the GPU and then convolved with a viscous loss model, which accelerates the calculation dramatically. The shear elasticity and shear viscosity values are then estimated using an optimization-based approach by minimizing the difference between measured and simulated shear wave particle velocities. Shear elasticity and shear viscosity images are generated at every spatial point in a two-dimensional (2D) field-of-view (FOV). The new approach is applied to measured shear wave data obtained from in vivo human livers, and the results show that this new approach successfully generates shear elasticity and shear viscosity images from this data. 
The results also indicate that the shear elasticity values estimated with this approach are significantly smaller than the values estimated with the conventional TOF method and that the new approach demonstrates more consistent values for these estimates compared with the TOF method. This experience suggests that the new method is an effective approach for estimating the shear elasticity and the shear viscosity in liver and in other soft tissue.
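The conventional TOF estimate that the new approach is compared against can be sketched in a few lines: cross-correlate the particle-velocity traces at two lateral points to find the travel-time delay, then divide the separation by that delay. The function name and all numbers below are illustrative, not from the thesis.

```python
import numpy as np

def tof_shear_speed(v1, v2, dx, fs):
    """Estimate shear wave speed from particle-velocity traces at two
    lateral points separated by dx (m), sampled at fs (Hz)."""
    # Cross-correlate to find the travel-time delay between the points.
    corr = np.correlate(v2 - v2.mean(), v1 - v1.mean(), mode="full")
    lag = corr.argmax() - (len(v1) - 1)   # in samples; positive if v2 lags v1
    dt = lag / fs
    return dx / dt

# Synthetic example: a Gaussian pulse arriving 1 ms later at a point
# 2 mm away, i.e. a true shear speed of 2 m/s.
fs = 10_000.0
t = np.arange(0, 0.02, 1 / fs)
pulse = lambda t0: np.exp(-((t - t0) / 0.001) ** 2)
c = tof_shear_speed(pulse(0.005), pulse(0.006), dx=0.002, fs=fs)  # -> 2.0 m/s
```

For a purely elastic medium the shear modulus then follows from mu = rho * c**2; in viscoelastic media, as the thesis argues, this delay-based estimate is biased because dispersion and attenuation reshape the pulse between the two points.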
THE VALUE OF NUDGING IN THE METEOROLOGY MODEL FOR RETROSPECTIVE CMAQ SIMULATIONS
Using a nudging-based data assimilation approach throughout a meteorology simulation (i.e., as a "dynamic analysis") is considered valuable because it can provide a better overall representation of the meteorology than a pure forecast. Dynamic analysis is often used in...
Introducing sampling entropy in repository based adaptive umbrella sampling
NASA Astrophysics Data System (ADS)
Zheng, Han; Zhang, Yingkai
2009-12-01
Determining free energy surfaces along chosen reaction coordinates is a common and important task in simulating complex systems. Due to the complexity of energy landscapes and the existence of high barriers, one widely pursued objective to develop efficient simulation methods is to achieve uniform sampling among thermodynamic states of interest. In this work, we have demonstrated sampling entropy (SE) as an excellent indicator for uniform sampling as well as for the convergence of free energy simulations. By introducing SE and the concentration theorem into the biasing-potential-updating scheme, we have further improved the adaptivity, robustness, and applicability of our recently developed repository based adaptive umbrella sampling (RBAUS) approach [H. Zheng and Y. Zhang, J. Chem. Phys. 128, 204106 (2008)]. Besides simulations of one dimensional free energy profiles for various systems, the generality and efficiency of this new RBAUS-SE approach have been further demonstrated by determining two dimensional free energy surfaces for the alanine dipeptide in gas phase as well as in water.
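The abstract does not give the exact formula for the sampling entropy, but one plausible form of a uniformity indicator over thermodynamic states is the normalized Shannon entropy of the bin visit counts along the reaction coordinate, which reaches 1.0 only for perfectly uniform sampling. The sketch below is an assumption-laden illustration, not the RBAUS-SE definition.

```python
import numpy as np

def sampling_entropy(counts):
    """Normalized Shannon entropy of bin visit counts along a reaction
    coordinate: 1.0 for perfectly uniform sampling, near 0 when the
    simulation is trapped in a few bins."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    nz = p[p > 0]                       # 0 * log(0) is taken as 0
    return float(-(nz * np.log(nz)).sum() / np.log(len(p)))

uniform = sampling_entropy([100, 100, 100, 100])   # -> 1.0
trapped = sampling_entropy([397, 1, 1, 1])         # close to 0
```

An adaptive scheme could monitor such an indicator and keep updating the biasing potential until it approaches 1, which is the spirit of using SE both as a uniformity and a convergence criterion.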
Xu, Rosalind J; Blasiak, Bartosz; Cho, Minhaeng; Layfield, Joshua P; Londergan, Casey H
2018-05-17
A quantitative connection between molecular dynamics simulations and vibrational spectroscopy of probe-labeled systems would enable direct translation of experimental data into structural and dynamical information. To constitute this connection, all-atom molecular dynamics (MD) simulations were performed for two SCN probe sites (solvent-exposed and buried) in a calmodulin-target peptide complex. Two frequency calculation approaches with substantial nonelectrostatic components, a quantum mechanics/molecular mechanics (QM/MM)-based technique and a solvatochromic fragment potential (SolEFP) approach, were used to simulate the infrared probe line shapes. While QM/MM results disagreed with experiment, SolEFP results matched experimental frequencies and line shapes and revealed the physical and dynamic bases for the observed spectroscopic behavior. The main determinant of the CN probe frequency is the exchange repulsion between the probe and its local structural neighbors, and there is a clear dynamic explanation for the relatively broad probe line shape observed at the "buried" probe site. This methodology should be widely applicable to vibrational probes in many environments.

NASA Technical Reports Server (NTRS)
Miller, G. K., Jr.
1981-01-01
The effect of reduced control authority, both in symmetric spoiler travel and thrust level, on the effectiveness of a decoupled longitudinal control system was examined during the approach and landing of the NASA terminal configured vehicle (TCV) aft flight deck simulator in the presence of wind shear. The evaluation was conducted in a fixed-base simulator that represented the TCV aft cockpit. There were no statistically significant effects of reduced spoiler and thrust authority on pilot performance during approach and landing. Increased wind severity degraded approach and landing performance by an amount that was often significant. However, every attempted landing was completed safely regardless of the wind severity. There were statistically significant differences in performance between subjects, but the differences were generally restricted to the control wheel and control-column activity during the approach.
Wiśniowska, Barbara; Polak, Sebastian
2016-11-01
A Quantitative Systems Pharmacology approach was utilized to predict the cardiac consequences of drug-drug interaction (DDI) at the population level. The Simcyp in vitro-in vivo correlation and physiologically based pharmacokinetic platform was used to predict the pharmacokinetic profile of terfenadine following co-administration of the drug. Electrophysiological effects were simulated using the Cardiac Safety Simulator. The modulation of ion channel activity was dependent on the inhibitory potential of drugs on the main cardiac ion channels and a simulated free heart tissue concentration. ten Tusscher's human ventricular cardiomyocyte model was used to simulate the pseudo-ECG traces and further predict the pharmacodynamic consequences of DDI. Consistent with clinical observations, predicted plasma concentration profiles of terfenadine show considerable intra-subject variability, with recorded Cmax values below 5 ng/mL for most virtual subjects. The pharmacokinetic and pharmacodynamic effects of inhibitors were predicted with reasonable accuracy. In all cases, a combination of the physiologically based pharmacokinetic and physiology-based pharmacodynamic models was able to differentiate between the terfenadine alone and terfenadine + inhibitor scenarios. The range of QT prolongation was comparable in the clinical and virtual studies. The results indicate that mechanistic in vitro-in vivo correlation can be applied to predict the clinical effects of DDI even without comprehensive knowledge of all mechanisms contributing to the interaction. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Dependability analysis of parallel systems using a simulation-based approach. M.S. Thesis
NASA Technical Reports Server (NTRS)
Sawyer, Darren Charles
1994-01-01
The analysis of dependability in large, complex, parallel systems executing real applications or workloads is examined in this thesis. To effectively demonstrate the wide range of dependability problems that can be analyzed through simulation, the analysis of three case studies is presented. For each case, the organization of the simulation model used is outlined, and the results from simulated fault injection experiments are explained, showing the usefulness of this method in dependability modeling of large parallel systems. The simulation models are constructed using DEPEND and C++. Where possible, methods to increase dependability are derived from the experimental results. Another interesting facet of all three cases is the presence of some kind of workload or application executing in the simulation while faults are injected. This provides a completely new dimension to this type of study, not possible to model accurately with analytical approaches.
Analog track angle error displays improve simulated GPS approach performance
DOT National Transportation Integrated Search
1996-01-01
Pilots flying non-precision instrument approaches traditionally rely on a course deviation indicator (CDI) analog display of cross track error (XTE) information. The new generation of GPS based area navigation (RNAV) receivers can also compute accura...
Assessing risk-adjustment approaches under non-random selection.
Luft, Harold S; Dudley, R Adams
2004-01-01
Various approaches have been proposed to adjust for differences in enrollee risk in health plans. Because risk-selection strategies may have different effects on enrollment, we simulated three types of selection--dumping, skimming, and stinting. Concurrent diagnosis-based risk adjustment, and a hybrid using concurrent adjustment for about 8% of the cases and prospective adjustment for the rest, perform markedly better than prospective or demographic adjustments, both in terms of R2 and the extent to which plans experience unwarranted gains or losses. The simulation approach offers a valuable tool for analysts in assessing various risk-adjustment strategies under different selection situations.
Acceleration techniques for dependability simulation. M.S. Thesis
NASA Technical Reports Server (NTRS)
Barnette, James David
1995-01-01
As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
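Random variate generation, one of the supporting techniques mentioned above, is classically done by inverse-transform sampling: applying the inverse CDF to a uniform random number. A minimal sketch for exponential interarrival times, the standard workload model in discrete event simulation (all names here are illustrative, not from the thesis):

```python
import math
import random

def exponential_variate(rate, u=None):
    """Inverse-transform sampling: if U ~ Uniform(0,1), then
    -ln(1 - U) / rate follows an Exponential(rate) distribution,
    a common interarrival-time model in discrete-event simulation."""
    u = random.random() if u is None else u
    return -math.log(1.0 - u) / rate

# Empirical check: the sample mean should approach 1/rate.
random.seed(1)
samples = [exponential_variate(2.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)   # close to 0.5
```

The same inverse-CDF idea extends to any distribution whose inverse CDF is computable, which is why it pairs naturally with the statistics-gathering machinery of a process-based simulator.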
Equilibrium simulations of proteins using molecular fragment replacement and NMR chemical shifts.
Boomsma, Wouter; Tian, Pengfei; Frellsen, Jes; Ferkinghoff-Borg, Jesper; Hamelryck, Thomas; Lindorff-Larsen, Kresten; Vendruscolo, Michele
2014-09-23
Methods of protein structure determination based on NMR chemical shifts are becoming increasingly common. The most widely used approaches adopt the molecular fragment replacement strategy, in which structural fragments are repeatedly reassembled into different complete conformations in molecular simulations. Although these approaches are effective in generating individual structures consistent with the chemical shift data, they do not enable the sampling of the conformational space of proteins with correct statistical weights. Here, we present a method of molecular fragment replacement that makes it possible to perform equilibrium simulations of proteins, and hence to determine their free energy landscapes. This strategy is based on the encoding of the chemical shift information in a probabilistic model in Markov chain Monte Carlo simulations. First, we demonstrate that with this approach it is possible to fold proteins to their native states starting from extended structures. Second, we show that the method satisfies the detailed balance condition and hence it can be used to carry out an equilibrium sampling from the Boltzmann distribution corresponding to the force field used in the simulations. Third, by comparing the results of simulations carried out with and without chemical shift restraints we describe quantitatively the effects that these restraints have on the free energy landscapes of proteins. Taken together, these results demonstrate that the molecular fragment replacement strategy can be used in combination with chemical shift information to characterize not only the native structures of proteins but also their conformational fluctuations.
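The detailed-balance property the abstract emphasizes is exactly what the Metropolis acceptance rule guarantees. The toy sampler below illustrates the principle on a one-dimensional harmonic "force field" rather than on fragment moves with chemical shift restraints, so it is a generic MCMC sketch, not the paper's method.

```python
import math
import random

def metropolis_sample(energy, x0, n_steps, step=1.0, beta=1.0, seed=0):
    """Metropolis Monte Carlo: accepting a move with probability
    min(1, exp(-beta * dE)) satisfies detailed balance, so the chain
    samples the Boltzmann distribution exp(-beta * E(x))."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        dE = energy(x_new) - energy(x)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            x = x_new                      # accept; otherwise keep x
        samples.append(x)
    return samples

# For E(x) = x^2 / 2 at beta = 1, equilibrium variance should be 1.
xs = metropolis_sample(lambda x: 0.5 * x * x, x0=0.0, n_steps=200_000)
var = sum(x * x for x in xs) / len(xs)   # close to 1.0
```

In the fragment-replacement setting, the proposal is a fragment substitution and the energy includes the probabilistic chemical-shift model; the acceptance rule, and hence the equilibrium guarantee, is unchanged.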
NASA Astrophysics Data System (ADS)
Karimi, Hossein; Nikmehr, Saeid; Khodapanah, Ehsan
2016-09-01
In this paper, we develop a B-spline finite-element method (FEM) based on locally modal wave propagation with anisotropic perfectly matched layers (PMLs), for the first time, to simulate nonlinear and lossy plasmonic waveguides. Conventional approaches such as the beam propagation method inherently omit the wave spectrum and do not provide physical insight into nonlinear modes, especially in plasmonic applications, where nonlinear modes are constructed by linear modes with very close propagation constants. Our locally modal B-spline finite element method (LMBS-FEM) does not suffer from the weaknesses of the conventional approaches. To validate our method, we first simulate wave propagation in various kinds of linear, nonlinear, lossless and lossy materials of metal-insulator plasmonic structures using LMBS-FEM in MATLAB, and make comparisons with the FEM-BPM module of the COMSOL Multiphysics simulator and the B-spline finite-element finite-difference wide angle beam propagation method (BSFEFD-WABPM). The comparisons show that our numerical approach is not only more accurate and computationally efficient than conventional approaches but also provides physical insight into the nonlinear nature of the propagation modes.
Bivariate discrete beta Kernel graduation of mortality data.
Mazza, Angelo; Punzo, Antonio
2015-07-01
Various parametric and nonparametric techniques have been proposed in the literature to graduate mortality data as a function of age. Nonparametric approaches, such as kernel smoothing regression, are often preferred because they do not assume any particular mortality law. Among the existing kernel smoothing approaches, the recently proposed (univariate) discrete beta kernel smoother has been shown to provide some benefits. Bivariate graduation, over age and calendar years or durations, is common practice in demography and actuarial sciences. In this paper, we generalize the discrete beta kernel smoother to the bivariate case, and we introduce an adaptive bandwidth variant that may provide additional benefits when data on exposures to the risk of death are available; furthermore, we outline a cross-validation procedure for bandwidth selection. Using simulation studies, we compare the bivariate approach proposed here with its corresponding univariate formulation and with two popular nonparametric bivariate graduation techniques, based on Epanechnikov kernels and on P-splines. To make the simulations realistic, a bivariate dataset, based on probabilities of dying recorded for US males, is used. The simulations have confirmed the gain in performance of the new bivariate approach with respect to both the univariate and the bivariate competitors.
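Kernel graduation of the kind discussed above is, at its core, a weighted average of crude rates at neighbouring ages. The sketch below uses a generic Gaussian-kernel Nadaraya-Watson smoother as a stand-in; the paper's discrete beta kernel differs in that it places its mass only on the admissible age range, which mitigates boundary bias.

```python
import numpy as np

def kernel_graduate(ages, crude_rates, bandwidth=2.0):
    """Nadaraya-Watson kernel smoothing of crude mortality rates over
    age (illustrative stand-in for the discrete beta kernel smoother)."""
    ages = np.asarray(ages, dtype=float)
    rates = np.asarray(crude_rates, dtype=float)
    out = np.empty_like(rates)
    for i, a in enumerate(ages):
        # Gaussian weights centred on the target age.
        w = np.exp(-0.5 * ((ages - a) / bandwidth) ** 2)
        out[i] = (w * rates).sum() / w.sum()
    return out

# Hypothetical crude death probabilities for ages 60-65.
crude = [0.010, 0.013, 0.011, 0.016, 0.015, 0.020]
smooth = kernel_graduate(range(60, 66), crude)
```

The bivariate extension replaces the one-dimensional weight with a product of an age kernel and a calendar-year kernel, and the adaptive-bandwidth variant lets the bandwidth shrink where exposures are large.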
Calibrating random forests for probability estimation.
Dankowski, Theresa; Ziegler, Andreas
2016-09-30
Probabilities can be consistently estimated using random forests. It is, however, unclear how random forests should be updated to make predictions for other centers or at different time points. In this work, we present two approaches for updating random forests for probability estimation. The first method has been proposed by Elkan and may be used for updating any machine learning approach yielding consistent probabilities, so-called probability machines. The second approach is a new strategy specifically developed for random forests. Using the terminal nodes, which represent conditional probabilities, the random forest is first translated to logistic regression models. These are, in turn, used for re-calibration. The two updating strategies were compared in a simulation study and are illustrated with data from the German Stroke Study Collaboration. In most simulation scenarios, both methods led to similar improvements. In the simulation scenario in which the stricter assumptions of Elkan's method were not met, the logistic regression-based re-calibration approach for random forests outperformed Elkan's method. It also performed better on the stroke data than Elkan's method. The strength of Elkan's method is its general applicability to any probability machine. However, if the strict assumptions underlying this approach are not met, the logistic regression-based approach is preferable for updating random forests for probability estimation. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
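The flavour of logistic-regression-based re-calibration can be shown with a much simpler, Platt-style sketch: regress the new-center outcomes on the logit of the forest's predicted probabilities and map predictions through the fitted curve. This is an illustrative simplification, not the paper's terminal-node translation; all data below are synthetic.

```python
import numpy as np

def recalibrate(p_old, y_new, lr=0.1, n_iter=5000):
    """Fit y ~ sigmoid(a + b * logit(p_old)) by gradient ascent on the
    log-likelihood, returning a function that maps old probabilities
    to re-calibrated ones (a simplified logistic re-calibration)."""
    z = np.log(p_old / (1 - p_old))          # logit of original predictions
    a, b = 0.0, 1.0
    for _ in range(n_iter):
        p = 1 / (1 + np.exp(-(a + b * z)))
        a += lr * np.mean(y_new - p)         # score equations of the
        b += lr * np.mean((y_new - p) * z)   # 2-parameter logistic model
    return lambda q: 1 / (1 + np.exp(-(a + b * np.log(q / (1 - q)))))

# A "forest" that systematically overestimates risk at a new center:
rng = np.random.default_rng(0)
true_p = rng.uniform(0.05, 0.5, 2000)
y = (rng.random(2000) < true_p).astype(float)
p_forest = np.clip(true_p * 1.6, 0.01, 0.99)   # biased predictions
cal = recalibrate(p_forest, y)
```

At the optimum the score equation forces the mean re-calibrated probability to match the observed event rate, which is exactly the miscalibration the updating strategies above are meant to repair.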
A simulator evaluation of an automatic terminal approach system
NASA Technical Reports Server (NTRS)
Hinton, D. A.
1983-01-01
The automatic terminal approach system (ATAS) is a concept for improving the pilot/machine interface with cockpit automation. The ATAS can automatically fly a published instrument approach by using stored instrument approach data to automatically tune airplane avionics, control the airplane's autopilot, and display status information to the pilot. A piloted simulation study was conducted to determine the feasibility of an ATAS, determine pilot acceptance, and examine pilot/ATAS interaction. Seven instrument-rated pilots each flew four instrument approaches with a baseline heading-select autopilot mode. The ATAS runs resulted in lower flight technical error, lower pilot workload, and fewer blunders than with the baseline autopilot. The ATAS status display enabled the pilots to maintain situational awareness during the automatic approaches. The system was well accepted by the pilots.
Ensor, Joie; Burke, Danielle L; Snell, Kym I E; Hemming, Karla; Riley, Richard D
2018-05-18
Researchers and funders should consider the statistical power of planned Individual Participant Data (IPD) meta-analysis projects, as they are often time-consuming and costly. We propose simulation-based power calculations utilising a two-stage framework, and illustrate the approach for a planned IPD meta-analysis of randomised trials with continuous outcomes where the aim is to identify treatment-covariate interactions. The simulation approach has four steps: (i) specify an underlying (data generating) statistical model for trials in the IPD meta-analysis; (ii) use readily available information (e.g. from publications) and prior knowledge (e.g. number of studies promising IPD) to specify model parameter values (e.g. control group mean, intervention effect, treatment-covariate interaction); (iii) simulate an IPD meta-analysis dataset of a particular size from the model, and apply a two-stage IPD meta-analysis to obtain the summary estimate of interest (e.g. interaction effect) and its associated p-value; (iv) repeat the previous step (e.g. thousands of times), then estimate the power to detect a genuine effect by the proportion of summary estimates with a significant p-value. In a planned IPD meta-analysis of lifestyle interventions to reduce weight gain in pregnancy, 14 trials (1183 patients) promised their IPD to examine a treatment-BMI interaction (i.e. whether baseline BMI modifies intervention effect on weight gain). Using our simulation-based approach, a two-stage IPD meta-analysis has < 60% power to detect a reduction of 1 kg weight gain for a 10-unit increase in BMI. Additional IPD from ten other published trials (containing 1761 patients) would improve power to over 80%, but only if a fixed-effect meta-analysis was appropriate. Pre-specified adjustment for prognostic factors would increase power further. Incorrect dichotomisation of BMI would reduce power by over 20%, similar to immediately throwing away IPD from ten trials. 
Simulation-based power calculations could inform the planning and funding of IPD projects, and should be used routinely.
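A stripped-down version of steps (i)-(iv) for a treatment-BMI interaction might look like the sketch below. The trial sizes, effect values, error variance, and the use of a fixed-effect (inverse-variance) second stage are illustrative assumptions, not the paper's actual inputs.

```python
import numpy as np

def ipd_interaction_power(n_trials, n_per_trial, interaction, n_sims=200,
                          sigma=2.0, bmi_sd=4.0, seed=1):
    """Two-stage IPD meta-analysis power for a treatment-covariate interaction.

    Stage 1: per-trial OLS of y on [1, treat, bmi, treat*bmi].
    Stage 2: fixed-effect (inverse-variance) pooling of the interaction term,
             tested two-sided at the 5% level.
    """
    rng = np.random.default_rng(seed)
    z_crit = 1.959963984540054           # two-sided 5% normal critical value
    hits = 0
    for _ in range(n_sims):
        ests, variances = [], []
        for _ in range(n_trials):
            treat = rng.integers(0, 2, n_per_trial).astype(float)
            bmi = rng.normal(0.0, bmi_sd, n_per_trial)     # centred BMI
            y = (1.0 * treat + interaction * treat * bmi
                 + rng.normal(0.0, sigma, n_per_trial))
            X = np.column_stack([np.ones(n_per_trial), treat, bmi, treat * bmi])
            XtX_inv = np.linalg.inv(X.T @ X)
            coef = XtX_inv @ X.T @ y
            resid = y - X @ coef
            s2 = resid @ resid / (n_per_trial - 4)
            ests.append(coef[3])                 # interaction estimate
            variances.append(s2 * XtX_inv[3, 3]) # and its sampling variance
        w = 1.0 / np.asarray(variances)
        pooled = np.sum(w * np.asarray(ests)) / w.sum()
        if abs(pooled) * np.sqrt(w.sum()) > z_crit:
            hits += 1
    return hits / n_sims
```

The same loop also exposes the dichotomisation penalty the abstract mentions: replacing `bmi` with `(bmi > 0)` in the design matrix and rerunning shows the drop in power directly.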
Virtual reality-based simulation training for ventriculostomy: an evidence-based approach.
Schirmer, Clemens M; Elder, J Bradley; Roitberg, Ben; Lobel, Darlene A
2013-10-01
Virtual reality (VR) simulation-based technologies play an important role in neurosurgical resident training. The Congress of Neurological Surgeons (CNS) Simulation Committee developed a simulation-based curriculum incorporating VR simulators to train residents in the management of common neurosurgical disorders. The objective was to enhance neurosurgical resident training for ventriculostomy placement through simulation-based training. A course-based neurosurgical simulation curriculum was introduced at the Neurosurgical Simulation Symposium at the 2011 and 2012 CNS annual meetings. A trauma module was developed to teach ventriculostomy placement as one of the neurosurgical procedures commonly performed in the management of traumatic brain injury. The course offered both didactic and simulator-based instruction, incorporating written and practical pretests and posttests and questionnaires to assess improvement in skill level and to validate the simulators as teaching tools. Fourteen trainees participated in the didactic component of the trauma module. Written scores improved significantly from pretest (75%) to posttest (87.5%; P < .05). Seven participants completed the ventriculostomy simulation. Significant improvements were observed in anatomy (P < .04), burr hole placement (P < .03), final location of the catheter (P = .05), and procedure completion time (P < .004). Senior residents planned a significantly better trajectory (P < .01); junior participants improved most in terms of identifying the relevant anatomy (P < .03) and the time required to complete the procedure (P < .04). VR ventriculostomy placement as part of the CNS simulation trauma module complements standard training techniques for residents in the management of neurosurgical trauma. Improvement in didactic and hands-on knowledge by course participants demonstrates the usefulness of the VR simulator as a training tool.
NASA Technical Reports Server (NTRS)
Grantham, W. D.; Smith, P. M.; Neely, W. R., Jr.; Deal, P. L.; Yenni, K. R.
1985-01-01
Six-degree-of-freedom ground-based and in-flight simulator studies were conducted to evaluate the low-speed flight characteristics of a twin-fuselage passenger transport airplane and to compare these characteristics with those of a large, single-fuselage (reference) transport configuration similar to the Lockheed C-5A airplane. The primary piloting task was the approach and landing task. The results of this study indicated that the twin-fuselage transport concept had acceptable but unsatisfactory longitudinal and lateral-directional low-speed flight characteristics, and that stability and control augmentation would be required in order to improve the handling qualities. Through the use of rate-command/attitude-hold augmentation in the pitch and roll axes, and the use of several turn coordination features, the handling qualities of the simulated transport were improved appreciably. The in-flight test results showed excellent agreement with those of the six-degree-of-freedom ground-based simulator handling qualities tests. As a result of the in-flight simulation study, a roll-control-induced normal-acceleration criterion was developed. The handling qualities of the augmented twin-fuselage passenger transport airplane exhibited an improvement over the handling characteristics of the reference (single-fuselage) transport.
NASA Astrophysics Data System (ADS)
Erkyihun, Solomon Tassew; Rajagopalan, Balaji; Zagona, Edith; Lall, Upmanu; Nowak, Kenneth
2016-05-01
A model to generate stochastic streamflow projections conditioned on quasi-oscillatory climate indices such as the Pacific Decadal Oscillation (PDO) and the Atlantic Multi-decadal Oscillation (AMO) is presented. Recognizing that each climate index has underlying band-limited components that contribute most of the energy of the signal, we first pursue a wavelet decomposition of the signals to identify and reconstruct these features from annually resolved historical data and proxy-based paleoreconstructions of each climate index covering the period from 1650 to 2012. A K-Nearest Neighbor block bootstrap approach is then developed to simulate the total signal of each of these climate index series while preserving its time-frequency structure and marginal distribution. Finally, given the simulated climate signal time series, a K-Nearest Neighbor bootstrap is used to simulate annual streamflow series conditional on the joint state space defined by the simulated climate indices for each year. We demonstrate this method by applying it to simulation of streamflow at the Lees Ferry gauge on the Colorado River using indices of two large-scale climate forcings, the PDO and the AMO, which are known to modulate Colorado River Basin (CRB) hydrology at multidecadal time scales. Skill in stochastic simulation of multidecadal projections of flow using this approach is demonstrated.
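The final conditional resampling step can be sketched as follows: a minimal K-nearest-neighbour bootstrap with the usual 1/rank weights, assuming a one-dimensional climate-index state for simplicity (the paper conditions on the joint PDO-AMO state and adds the wavelet decomposition and block bootstrap upstream).

```python
import numpy as np

def knn_conditional_bootstrap(sim_index, hist_index, hist_flow, k=5, seed=0):
    """Simulate an annual flow for each simulated climate-index value.

    For each simulated index value, find its k nearest neighbours among the
    historical index values and resample one neighbour's flow, with the
    standard kernel weights w_j proportional to 1/j (j = distance rank).
    """
    rng = np.random.default_rng(seed)
    weights = 1.0 / np.arange(1, k + 1)
    weights /= weights.sum()
    flows = np.empty(len(sim_index))
    for t, v in enumerate(sim_index):
        # indices of the k historical years closest to the simulated state
        order = np.argsort(np.abs(hist_index - v))[:k]
        flows[t] = hist_flow[rng.choice(order, p=weights)]
    return flows
```

Because flows are resampled from the historical record rather than drawn from a fitted distribution, the marginal distribution and index-flow dependence of the observations are preserved by construction.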
Spatial Evaluation and Verification of Earthquake Simulators
NASA Astrophysics Data System (ADS)
Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.
2017-06-01
In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and, following past simulator and forecast-model verification methods, we address the central challenge of applying spatial forecast verification to simulators: namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
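The power-law smoothing step can be sketched as follows. This is a simplified kernel in the spirit of the ETAS spatial decay; the exponent, distance offset, and grid are illustrative, and the paper's actual kernel and normalisation may differ.

```python
import numpy as np

def power_law_rate_map(event_xy, grid_xy, d=2.0, q=1.5):
    """Spread each simulated earthquake over all grid cells.

    Each event contributes a rate decaying as (r + d)^(-q) with epicentral
    distance r, normalised so that every event adds a total rate of 1 to the
    map.  This lets on-fault simulated seismicity be compared against
    off-fault observed epicenters.

    event_xy : (n_events, 2) epicentral coordinates of simulated events
    grid_xy  : (n_cells, 2) coordinates of the test-region grid cells
    """
    rates = np.zeros(len(grid_xy))
    for ex, ey in event_xy:
        r = np.hypot(grid_xy[:, 0] - ex, grid_xy[:, 1] - ey)
        kern = (r + d) ** (-q)
        rates += kern / kern.sum()
    return rates
```

An ROC curve then follows by sweeping a threshold over the rate map and counting, at each threshold, how many observed epicenters fall in cells above it (hits) versus how many above-threshold cells contain no event (false alarms).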
A manifold learning approach to data-driven computational materials and processes
NASA Astrophysics Data System (ADS)
Ibañez, Ruben; Abisset-Chavanne, Emmanuelle; Aguado, Jose Vicente; Gonzalez, David; Cueto, Elias; Duval, Jean Louis; Chinesta, Francisco
2017-10-01
Standard simulation in classical mechanics is based on the use of two very different types of equations. The first, of axiomatic character, is related to balance laws (momentum, mass, energy, …), whereas the second consists of models that scientists have extracted from collected natural or synthetic data. In this work we propose a new method, able to link data directly to computations in order to perform numerical simulations. These simulations employ universal laws while minimizing the need for explicit, often phenomenological, models, and are based on manifold learning methodologies.
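One common building block behind such manifold-learning simulation schemes is locally linear reconstruction: a new state is expressed as an affine combination of its nearest neighbours in the data, and the same weights are applied to the quantity to be predicted. The sketch below does this for a toy constitutive dataset; the neighbourhood size, regularisation, and the strain/stress framing are illustrative assumptions, not the specific scheme of the paper.

```python
import numpy as np

def locally_linear_predict(x_query, X, Y, k=4, reg=1e-8):
    """Predict Y at x_query from data pairs (X, Y) via LLE-style weights.

    Finds the k nearest neighbours of x_query in X (e.g. strain states),
    solves for affine weights w (summing to 1) that minimise
    ||x_query - sum_i w_i X_i||, and returns sum_i w_i Y_i (e.g. stresses).
    """
    d2 = np.sum((X - x_query) ** 2, axis=1)
    nbrs = np.argsort(d2)[:k]
    Z = X[nbrs] - x_query                  # shift neighbours to the query
    C = Z @ Z.T                            # local Gram matrix
    C += reg * np.trace(C) * np.eye(k)     # regularise (C may be singular)
    w = np.linalg.solve(C, np.ones(k))
    w /= w.sum()                           # enforce the affine constraint
    return w @ Y[nbrs]
```

Used this way, the data cloud itself plays the role of the constitutive model: balance laws are enforced by the solver, while the material response is interpolated on the manifold sampled by the data.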