Investigation of Propagation in Foliage Using Simulation Techniques
2011-12-01
Simulation models provide a rough approximation to radiowave propagation in an actual rainforest environment. Based on the simulated results, the path ...
Conducting Simulation Studies in the R Programming Environment.
Hallgren, Kevin A
2013-10-12
Simulation studies allow researchers to answer specific questions about data analysis, statistical power, and best-practices for obtaining accurate results in empirical research. Despite the benefits that simulation research can provide, many researchers are unfamiliar with available tools for conducting their own simulation studies. The use of simulation studies need not be restricted to researchers with advanced skills in statistics and computer programming, and such methods can be implemented by researchers with a variety of abilities and interests. The present paper provides an introduction to methods used for running simulation studies using the R statistical programming environment and is written for individuals with minimal experience running simulation studies or using R. The paper describes the rationale and benefits of using simulations and introduces R functions relevant for many simulation studies. Three examples illustrate different applications for simulation studies, including (a) the use of simulations to answer a novel question about statistical analysis, (b) the use of simulations to estimate statistical power, and (c) the use of simulations to obtain confidence intervals of parameter estimates through bootstrapping. Results and fully annotated syntax from these examples are provided.
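The workflow described above is easy to make concrete. The paper's examples are written in R; the sketch below illustrates the same two ideas (Monte Carlo power estimation and a percentile bootstrap confidence interval) in Python/NumPy, with the sample sizes, effect size, and replication counts chosen arbitrarily for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def power_two_sample_t(n, effect_size, alpha=0.05, n_sim=5000):
    """Fraction of simulated experiments in which the t-test rejects H0."""
    rejections = 0
    for _ in range(n_sim):
        a = rng.normal(0.0, 1.0, n)            # control group
        b = rng.normal(effect_size, 1.0, n)    # treatment group with shifted mean
        _, p = stats.ttest_ind(a, b)
        rejections += p < alpha
    return rejections / n_sim

def bootstrap_ci_mean_diff(a, b, n_boot=5000, level=0.95):
    """Percentile bootstrap confidence interval for mean(b) - mean(a)."""
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        diffs[i] = (rng.choice(b, b.size, replace=True).mean()
                    - rng.choice(a, a.size, replace=True).mean())
    tail = (1.0 - level) / 2.0 * 100.0
    return tuple(np.percentile(diffs, [tail, 100.0 - tail]))

print("estimated power (n=30 per group, d=0.5):", power_two_sample_t(30, 0.5))
a = rng.normal(0.0, 1.0, 40)
b = rng.normal(0.4, 1.0, 40)
print("bootstrap 95% CI for the mean difference:", bootstrap_ci_mean_diff(a, b))
```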
Ventilator caregiver education through the use of high-fidelity pediatric simulators: a pilot study.
Tofil, Nancy M; Rutledge, Chrystal; Zinkan, J Lynn; Youngblood, Amber Q; Stone, Julie; Peterson, Dawn Taylor; Slayton, Donna; Makris, Chris; Magruder, Terri; White, Marjorie Lee
2013-11-01
Introduction. Home ventilator programs (HVP) have been developed to train parents of critically ill children. Simulators are used in health care, but not often for parents. We added simulation to our HVP and assessed parents' response. Methods. In July 2008, the HVP at Children's of Alabama added simulation to parent training. Debriefing was provided after the training session to reinforce correct skills and critical thinking. Follow-up surveys were completed after training. Results. Fifteen families participated. All parents were confident in changing tracheostomies, knowing signs of breathing difficulties, and responding to alarms. 71% strongly agreed that simulation left them feeling better prepared to care for their child. 86% felt that simulation improved their confidence in taking care of their child. Conclusion. Simulators provide a crucial transition between learned skills and application. This novel use of simulation-based education improves parents' confidence in emergencies and may lead to shortened training, resulting in cost savings.
NASA Technical Reports Server (NTRS)
Kawamura, K.; Beale, G. O.; Schaffer, J. D.; Hsieh, B. J.; Padalkar, S.; Rodriguez-Moscoso, J. J.
1985-01-01
A reference manual is provided for NESS, a simulation expert system. The manual gives user information on starting and operating the NASA expert simulation system (NESS). This expert system provides an intelligent interface to a generic simulation program for spacecraft attitude control problems. A menu of the functions the system can perform is provided, and control returns to this menu after each user request is executed.
Advanced Space Shuttle simulation model
NASA Technical Reports Server (NTRS)
Tatom, F. B.; Smith, S. R.
1982-01-01
A non-recursive model (based on von Karman spectra) for atmospheric turbulence along the flight path of the shuttle orbiter was developed. It provides for simulation of instantaneous vertical and horizontal gusts at the vehicle center-of-gravity, and also for simulation of instantaneous gust gradients. Based on this model, the time series for both gusts and gust gradients were generated and stored on a series of magnetic tapes, entitled Shuttle Simulation Turbulence Tapes (SSTT). The time series are designed to represent atmospheric turbulence from ground level to an altitude of 120,000 meters. A description of the turbulence generation procedure is provided. The results of validating the simulated turbulence are described. Conclusions and recommendations are presented. One-dimensional von Karman spectra are tabulated, and the minimum frequency simulated is discussed. The results of spectral and statistical analyses of the SSTT are presented.
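As a rough illustration of such a non-recursive approach, the sketch below shapes Gaussian white noise in the frequency domain so that its one-sided power spectral density follows a commonly quoted longitudinal von Karman form, then inverse-FFTs it into a gust time series. The spectral constants, airspeed, length scale, and sampling choices are assumptions for illustration and are not the SSTT model parameters.

```python
import numpy as np

def von_karman_psd(f, sigma=1.0, L=200.0, V=100.0):
    """One commonly quoted longitudinal von Karman gust PSD, one-sided, in (m/s)^2 per Hz.
    sigma: gust standard deviation (m/s), L: length scale (m), V: airspeed (m/s)."""
    omega = 2.0 * np.pi * f / V                     # spatial frequency, rad/m
    return (4.0 * sigma**2 * L / V) / (1.0 + (1.339 * L * omega) ** 2) ** (5.0 / 6.0)

def synthesize_gusts(n=2**15, dt=0.01, seed=0, **psd_kw):
    """Non-recursive synthesis: scale complex Gaussian noise by sqrt(PSD), inverse FFT."""
    rng = np.random.default_rng(seed)
    f = np.fft.rfftfreq(n, dt)
    s = von_karman_psd(f, **psd_kw)
    # One-sided periodogram convention: S(f_k) ~ (2*dt/n) * |X_k|^2
    amp = np.sqrt(s * n / (2.0 * dt))
    x_k = amp * (rng.standard_normal(f.size) + 1j * rng.standard_normal(f.size)) / np.sqrt(2.0)
    x_k[0] = 0.0                                    # force a zero-mean gust record
    return np.fft.irfft(x_k, n)

gust = synthesize_gusts()
print("simulated gust standard deviation (should be close to sigma=1):", round(gust.std(), 3))
```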
NeuroManager: a workflow analysis based simulation management engine for computational neuroscience
Stockton, David B.; Santamaria, Fidel
2015-01-01
We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175
A seat cushion to provide realistic acceleration cues for aircraft simulators
NASA Technical Reports Server (NTRS)
Ashworth, B. R.
1976-01-01
A seat cushion to provide acceleration cues for aircraft simulator pilots was built, performance tested, and evaluated. The four cell seat, using a thin air cushion with highly responsive pressure control, attempts to reproduce the same events which occur in an aircraft seat under acceleration loading. The pressure controller provides seat cushion responses which are considered adequate for current high performance aircraft simulations. The initial tests of the seat cushions have resulted in excellent pilot opinion of the cushion's ability to provide realistic and useful cues to the simulator pilot.
Neuronvisio: A Graphical User Interface with 3D Capabilities for NEURON.
Mattioni, Michele; Cohen, Uri; Le Novère, Nicolas
2012-01-01
The NEURON simulation environment is a commonly used tool to perform electrical simulation of neurons and neuronal networks. The NEURON User Interface, based on the now discontinued InterViews library, provides some limited facilities to explore models and to plot their simulation results. Other limitations include the inability to generate three-dimensional visualizations, the lack of a standard means to save simulation results, and the inability to store the model geometry with the results. Neuronvisio (http://neuronvisio.org) aims to address these deficiencies through a set of well-designed Python APIs and provides an improved UI, allowing users to explore and interact with the model. Neuronvisio also facilitates access to previously published models, allowing users to browse, download, and locally run NEURON models stored in ModelDB. Neuronvisio uses the matplotlib library to plot simulation results and uses the HDF standard format to store simulation results. Neuronvisio can be viewed as an extension of NEURON, facilitating typical user workflows such as model browsing, selection, download, compilation, and simulation. The 3D viewer simplifies the exploration of complex model structure, while matplotlib permits the plotting of high-quality graphs. The newly introduced ability to save numerical results allows users to perform additional analysis on their previous simulations.
A novel approach for connecting temporal-ontologies with blood flow simulations.
Weichert, F; Mertens, C; Walczak, L; Kern-Isberner, G; Wagner, M
2013-06-01
In this paper an approach for developing a temporal domain ontology for biomedical simulations is introduced. The ideas are presented in the context of simulations of blood flow in aneurysms using the Lattice Boltzmann Method. The advantages of using ontologies are manifold: on the one hand, ontologies have been proven to provide specialized medical knowledge, e.g., key parameters for simulations. On the other hand, based on a set of rules and the usage of a reasoner, a system for checking the plausibility as well as tracking the outcome of medical simulations can be constructed. Likewise, results of simulations, including data derived from them, can be stored and communicated in a way that can be understood by computers. Later on, this set of results can be analyzed. At the same time, the ontologies provide a way to exchange knowledge between researchers. Lastly, this approach can be seen as a black-box abstraction of the internals of the simulation for the biomedical researcher as well. This approach is able to provide the complete parameter sets for simulations, part of the corresponding results and part of their analysis, as well as, e.g., geometry and boundary conditions. These inputs can be transferred to different simulation methods for comparison. Variations on the provided parameters can be automatically used to drive these simulations. Using a rule base, unphysical inputs or outputs of the simulation can be detected and communicated to the physician in a suitable and familiar way. An example for an instantiation of the blood flow simulation ontology and exemplary rules for plausibility checking are given. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Struzenberg, L. L.; West, J. S.
2011-01-01
This paper describes the use of targeted Loci/CHEM CFD simulations to evaluate the effects of a dual-engine first-stage hot-fire test on an evolving integrated launch pad/test article design. This effort was undertaken as a part of the NESC Independent Assessment of the Taurus II Stage Test Series. The underlying conceptual model included development of a series of computational models and simulations to analyze the plume-induced environments on the pad, facility structures, and test article. A pathfinder simulation was first developed, capable of providing quick-turnaround evaluation of plume impingement pressures on the flame deflector. Results from this simulation were available in time to provide data for an ongoing structural assessment of the deflector. The resulting recommendation was available in a timely manner and was incorporated into the construction schedule for the new launch stand under construction at Wallops Flight Facility. A series of Reynolds-Averaged Navier-Stokes (RANS) quasi-steady simulations representative of various key elements of the test profile was performed to identify potential concerns with the test configuration and test profile. As required, unsteady Hybrid-RANS/LES simulations were performed to provide additional insight into critical aspects of the test sequence. Modifications to the test-specific hardware and the thermal protection of facility structures, as well as modifications to the planned hot-fire test profile, were implemented based on these simulation results.
Predicting mesoscale microstructural evolution in electron beam welding
Rodgers, Theron M.; Madison, Jonathan D.; Tikare, Veena; ...
2016-03-16
Using the kinetic Monte Carlo simulator, Stochastic Parallel PARticle Kinetic Simulator, from Sandia National Laboratories, a user routine has been developed to simulate mesoscale predictions of a grain structure near a moving heat source. Here, we demonstrate the use of this user routine to produce voxelized, synthetic, three-dimensional microstructures for electron-beam welding by comparing them with experimentally produced microstructures. When simulation input parameters are matched to experimental process parameters, qualitative and quantitative agreement for both grain size and grain morphology are achieved. The method is capable of simulating both single- and multipass welds. As a result, the simulations provide an opportunity for not only accelerated design but also the integration of simulation and experiments in design such that simulations can receive parameter bounds from experiments and, in turn, provide predictions of a resultant microstructure.
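The following toy sketch conveys the flavor of a Potts-style kinetic Monte Carlo grain model driven by a moving heat source: spin flips toward neighboring grain IDs are accepted only where a Gaussian mobility profile around the traversing source is high. It is a minimal stand-in, not the SPPARKS user routine from the paper, and the lattice size, mobility radius, and pass speed are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
NX, NY, Q = 100, 30, 50                      # lattice size and number of grain IDs (made up)
spins = rng.integers(0, Q, size=(NX, NY))
NBRS = ((1, 0), (-1, 0), (0, 1), (0, -1))

def unlike_neighbors(x, y, q):
    """Potts energy term: number of nearest neighbors whose grain ID differs from q."""
    return sum(q != spins[(x + dx) % NX, (y + dy) % NY] for dx, dy in NBRS)

def mobility(x, y, cx, cy, radius=6.0):
    """Boundary mobility: ~1 inside the moving heat-affected zone, ~0 far away from it."""
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / radius**2)

def boundary_fraction():
    """Fraction of lattice bonds lying on a grain boundary (a simple coarsening metric)."""
    return 0.5 * ((spins != np.roll(spins, 1, axis=0)).mean()
                  + (spins != np.roll(spins, 1, axis=1)).mean())

print("boundary fraction before:", round(boundary_fraction(), 3))
n_sweeps = 100
for sweep in range(n_sweeps):                # heat source traverses the domain once
    cx, cy = sweep * NX / n_sweeps, NY / 2.0
    for _ in range(NX * NY):
        x, y = rng.integers(NX), rng.integers(NY)
        dx, dy = NBRS[rng.integers(4)]
        proposed = spins[(x + dx) % NX, (y + dy) % NY]
        dE = unlike_neighbors(x, y, proposed) - unlike_neighbors(x, y, spins[x, y])
        # zero-temperature Metropolis flip, gated by the local (moving) mobility
        if dE <= 0 and rng.random() < mobility(x, y, cx, cy):
            spins[x, y] = proposed
print("boundary fraction after: ", round(boundary_fraction(), 3))
```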
Cashin, Cheryl; Phuong, Nguyen Khanh; Shain, Ryan; Oanh, Tran Thi Mai; Thuy, Nguyen Thi
2015-01-01
Vietnam is currently considering a revision of its 2008 Health Insurance Law, including the regulation of provider payment methods. This study uses a simple spreadsheet-based, micro-simulation model to analyse the potential impacts of different provider payment reform scenarios on resource allocation across health care providers in three provinces in Vietnam, as well as on the total expenditure of the provincial branches of the public health insurance agency (Provincial Social Security [PSS]). The results show that currently more than 50% of PSS spending is concentrated at the provincial level with less than half at the district level. There is also a high degree of financial risk on district hospitals with the current fund-holding arrangement. Results of the simulation model show that several alternative scenarios for provider payment reform could improve the current payment system by reducing the high financial risk currently borne by district hospitals without dramatically shifting the current level and distribution of PSS expenditure. The results of the simulation analysis provided an empirical basis for health policy-makers in Vietnam to assess different provider payment reform options and make decisions about new models to support health system objectives.
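A micro-simulation of this kind reduces to simple arithmetic over scenario assumptions. The sketch below contrasts a fee-for-service baseline with a hypothetical district-level capitation scenario; every number in it (visit volumes, fees, enrollment, capitation rate) is an invented placeholder, not data from the Vietnam study.

```python
levels = ["provincial", "district", "commune"]
visits = {"provincial": 120_000, "district": 300_000, "commune": 180_000}   # hypothetical
fee_per_visit = {"provincial": 40.0, "district": 15.0, "commune": 5.0}      # hypothetical

def fee_for_service():
    """Baseline: the insurer reimburses every visit at the level-specific fee."""
    return {lv: visits[lv] * fee_per_visit[lv] for lv in levels}

def district_capitation(enrollees=500_000, rate=10.0):
    """Scenario: district facilities receive a per-enrollee payment instead of fees."""
    spend = fee_for_service()
    spend["district"] = enrollees * rate
    return spend

for name, scenario in (("baseline fee-for-service", fee_for_service()),
                       ("district capitation", district_capitation())):
    total = sum(scenario.values())
    shares = {lv: round(100.0 * v / total, 1) for lv, v in scenario.items()}
    print(f"{name}: total spending = {total:,.0f}; share by level (%) = {shares}")
```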
NASA Astrophysics Data System (ADS)
Elbakary, Mohamed I.; Iftekharuddin, Khan M.; Papelis, Yiannis; Newman, Brett
2017-05-01
Air Traffic Management (ATM) concepts are commonly tested in simulation to obtain preliminary results and validate the concepts before adoption. Researchers have recently found that simulation alone is not sufficient because of the complexity associated with ATM concepts. In other words, full-scale tests must eventually take place to provide compelling performance evidence before adopting full implementation. Testing using full-scale aircraft is a high-cost approach that yields high-confidence results, whereas simulation provides a low-risk/low-cost approach with reduced confidence in the results. One possible approach to increase the confidence of the results and simultaneously reduce the risk and the cost is using unmanned sub-scale aircraft in testing new concepts for ATM. This paper presents the simulation results of using unmanned sub-scale aircraft in implementing ATM concepts compared to full-scale aircraft. The results of the simulation show that the performance of the sub-scale aircraft is quite comparable to that of the full-scale aircraft, which validates the use of sub-scale aircraft in testing new ATM concepts.
Genetic Adaptive Control for PZT Actuators
NASA Technical Reports Server (NTRS)
Kim, Jeongwook; Stover, Shelley K.; Madisetti, Vijay K.
1995-01-01
A piezoelectric transducer (PZT) is capable of providing linear motion if controlled correctly and could provide a replacement for traditional heavy and large servo systems using motors. This paper focuses on a genetic model reference adaptive control technique (GMRAC) for a PZT that moves a mirror, where the goal is to keep the mirror velocity constant. Genetic Algorithms (GAs) are an integral part of the GMRAC technique, acting as the search engine for an optimal PID controller. Two methods are suggested to control the actuator in this research. The first is to change the PID parameters and the other is to add an additional reference input to the system. The simulation results of these two methods are compared. Simulated Annealing (SA) is also used to solve the problem, and its simulation results are compared with those of the GAs; the GAs show the best results. The entire model is designed using the MathWorks Simulink tool.
Sechopoulos, Ioannis; Ali, Elsayed S M; Badal, Andreu; Badano, Aldo; Boone, John M; Kyprianou, Iacovos S; Mainegra-Hing, Ernesto; McMillan, Kyle L; McNitt-Gray, Michael F; Rogers, D W O; Samei, Ehsan; Turner, Adam C
2015-10-01
The use of Monte Carlo simulations in diagnostic medical imaging research is widespread due to its flexibility and ability to estimate quantities that are challenging to measure empirically. However, any new Monte Carlo simulation code needs to be validated before it can be used reliably. The type and degree of validation required depends on the goals of the research project, but, typically, such validation involves either comparison of simulation results to physical measurements or to previously published results obtained with established Monte Carlo codes. The former is complicated due to nuances of experimental conditions and uncertainty, while the latter is challenging due to typical graphical presentation and lack of simulation details in previous publications. In addition, entering the field of Monte Carlo simulations in general involves a steep learning curve. It is not a simple task to learn how to program and interpret a Monte Carlo simulation, even when using one of the publicly available code packages. This Task Group report provides a common reference for benchmarking Monte Carlo simulations across a range of Monte Carlo codes and simulation scenarios. In the report, all simulation conditions are provided for six different Monte Carlo simulation cases that involve common x-ray based imaging research areas. The results obtained for the six cases using four publicly available Monte Carlo software packages are included in tabular form. In addition to a full description of all simulation conditions and results, a discussion and comparison of results among the Monte Carlo packages and the lessons learned during the compilation of these results are included. This abridged version of the report includes only an introductory description of the six cases and a brief example of the results of one of the cases. This work provides an investigator the necessary information to benchmark his/her Monte Carlo simulation software against the reference cases included here before performing his/her own novel research. In addition, an investigator entering the field of Monte Carlo simulations can use these descriptions and results as a self-teaching tool to ensure that he/she is able to perform a specific simulation correctly. Finally, educators can assign these cases as learning projects as part of course objectives or training programs.
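A minimal example of the kind of self-check such benchmarks support is sketched below: estimating primary (uninteracted) photon transmission through a uniform slab by Monte Carlo and comparing it with the analytic Beer-Lambert value. The attenuation coefficient and slab thickness are stand-in values, not one of the Task Group's six reference cases.

```python
import numpy as np

rng = np.random.default_rng(42)
mu = 0.2          # linear attenuation coefficient (1/cm), stand-in value
thickness = 10.0  # slab thickness (cm), stand-in value
n_photons = 1_000_000

# Distance to first interaction is exponentially distributed with mean 1/mu;
# a photon is transmitted without interacting if that distance exceeds the slab.
free_paths = rng.exponential(scale=1.0 / mu, size=n_photons)
mc_estimate = np.count_nonzero(free_paths > thickness) / n_photons
analytic = np.exp(-mu * thickness)

print(f"Monte Carlo primary transmission: {mc_estimate:.5f}")
print(f"Analytic exp(-mu*t):              {analytic:.5f}")
```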
1993-12-01
... when subjected to large and small cycle slips. Results of the simulations indicate that the PNRS can provide an improved navigation solution over ...
Incorporating Haptic Feedback in Simulation for Learning Physics
ERIC Educational Resources Information Center
Han, Insook; Black, John B.
2011-01-01
The purpose of this study was to investigate the effectiveness of a haptic augmented simulation in learning physics. The results indicate that haptic augmented simulations, both the force and kinesthetic and the purely kinesthetic simulations, were more effective than the equivalent non-haptic simulation in providing perceptual experiences and…
Modified current follower-based immittance function simulators
NASA Astrophysics Data System (ADS)
Alpaslan, Halil; Yuce, Erkan
2017-12-01
In this paper, four immittance function simulators consisting of a single modified current follower with a single Z- terminal and a minimum number of passive components are proposed. The first proposed circuit can provide +L parallel with +R and the second proposed one can realise -L parallel with -R. The third proposed structure can provide +L series with +R and the fourth proposed one can realise -L series with -R. However, all the proposed immittance function simulators need a single resistive matching constraint. Parasitic impedance effects on all the proposed immittance function simulators are investigated. A second-order current-mode (CM) high-pass filter derived from the first proposed immittance function simulator is given as an application example. Also, a second-order CM low-pass filter derived from the third proposed immittance function simulator is given as an application example. A number of simulation results based on the SPICE programme and an experimental test result are given to verify the theory.
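To make the realized port behavior concrete, the sketch below evaluates the ideal impedances of the two positive-element cases (+L parallel with +R and +L series with +R) over frequency; the component values are arbitrary and the expressions are the textbook ideal forms, not the parasitic-aware models analyzed in the paper.

```python
import numpy as np

R, L = 1_000.0, 1e-3            # arbitrary component values: 1 kOhm, 1 mH
f = np.logspace(2, 7, 6)        # 100 Hz .. 10 MHz
w = 2.0 * np.pi * f
ZL = 1j * w * L

Z_parallel = (R * ZL) / (R + ZL)    # ideal +L in parallel with +R
Z_series = R + ZL                   # ideal +L in series with +R

for fi, zp, zs in zip(f, Z_parallel, Z_series):
    print(f"f = {fi:10.0f} Hz   |Z_parallel| = {abs(zp):8.2f} ohm   |Z_series| = {abs(zs):10.2f} ohm")
```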
Modular, high power, variable R dynamic electrical load simulator
NASA Technical Reports Server (NTRS)
Joncas, K. P.
1974-01-01
The design of a previously developed basic variable R load simulator was extended to increase its power dissipation and transient handling capabilities. The delivered units satisfy all design requirements and provide a high-power, modular simulation capability uniquely suited to the simulation of complex load responses. In addition to presenting conclusions, recommendations, and pertinent background information, the report covers program accomplishments; describes the simulator basic circuits, transfer characteristic, protective features, assembly, and specifications; indicates the results of simulator evaluation, including burn-in and acceptance testing; provides acceptance test data; and summarizes the monthly progress reports.
Computer Simulation of the Circulation Subsystem of a Library
ERIC Educational Resources Information Center
Shaw, W. M., Jr.
1975-01-01
When circulation data are used as input parameters for a computer simulation of a library's circulation subsystem, the results of the simulation provide information on book availability and delays. The model may be used to simulate alternative loan policies. (Author/LS)
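A circulation-subsystem simulation of this sort can be sketched in a few lines: requests for a title arrive at random, copies circulate for random loan periods, and the outputs are availability and delay. The arrival rate, loan length, and number of copies below are hypothetical.

```python
import heapq
import numpy as np

rng = np.random.default_rng(7)
n_copies = 3
arrival_rate = 0.15         # requests per day for this title (hypothetical)
mean_loan = 14.0            # mean days a borrower keeps a copy (hypothetical)
n_requests = 50_000

returns = []                # min-heap of times at which loaned copies come back
t = 0.0
satisfied, delays = 0, []
for _ in range(n_requests):
    t += rng.exponential(1.0 / arrival_rate)        # next request arrives
    while returns and returns[0] <= t:              # shelve copies returned by now
        heapq.heappop(returns)
    if len(returns) < n_copies:                     # a copy is on the shelf
        satisfied += 1
        start = t
    else:                                           # wait for the earliest return
        start = heapq.heappop(returns)
        delays.append(start - t)
    heapq.heappush(returns, start + rng.exponential(mean_loan))

print(f"immediate availability: {satisfied / n_requests:.3f}")
if delays:
    print(f"mean delay when no copy is available: {np.mean(delays):.1f} days")
```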
NASA Technical Reports Server (NTRS)
1979-01-01
The requirements for a new research aircraft to provide in-flight V/STOL simulation were reviewed. The required capabilities were based on known limitations of ground based simulation and past/current experience with V/STOL inflight simulation. Results indicate that V/STOL inflight simulation capability is needed to aid in the design and development of high performance V/STOL aircraft. Although a new research V/STOL aircraft is preferred, an interim solution can be provided by use of the X-22A, the CH-47B, or the 4AV-8B aircraft modified for control/display flight research.
Efficient scatter model for simulation of ultrasound images from computed tomography data
NASA Astrophysics Data System (ADS)
D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.
2015-12-01
Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Due to the high value of specialized low-cost training for healthcare professionals, there is a growing interest in the use of this technology and the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run either on notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. This simulator is based on ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computational efficiency of scattering-map generation was improved, which allowed a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe quality and performance metrics to validate these results, with a performance of up to 55 fps achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
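The scatter model described above (multiplicative noise followed by convolution with a point spread function) can be sketched as follows. The synthetic phantom, Rayleigh-distributed noise, and Gaussian-times-carrier PSF are illustrative assumptions standing in for the CT-derived maps and the transducer-specific PSFs used by the authors.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(3)
ny, nx = 256, 256
phantom = np.full((ny, nx), 0.3)                            # background echogenicity
yy, xx = np.mgrid[0:ny, 0:nx]
phantom[(yy - 128) ** 2 + (xx - 128) ** 2 < 40**2] = 0.8    # bright circular inclusion

noise = rng.rayleigh(scale=1.0, size=phantom.shape)         # multiplicative scatter term

# Assumed separable PSF: axial Gaussian modulated by a carrier, lateral Gaussian.
ax = np.arange(-8, 9)
axial = np.exp(-ax**2 / 8.0) * np.cos(2.0 * np.pi * ax / 4.0)
lateral = np.exp(-ax**2 / 18.0)
psf = np.outer(axial, lateral)

rf = fftconvolve(phantom * noise, psf, mode="same")         # speckled RF-like image
envelope = np.abs(rf)                                       # crude envelope detection
bmode = 20.0 * np.log10(envelope / envelope.max() + 1e-6)   # log-compressed display image
print("simulated image dynamic range (dB):", round(bmode.min(), 1), "to", round(bmode.max(), 1))
```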
Free-space optical channel simulator for weak-turbulence conditions.
Bykhovsky, Dima
2015-11-01
Free-space optical (FSO) communication may be severely influenced by the inevitable turbulence effect that results in channel gain fluctuations and fading. The objective of this paper is to provide a simple and effective simulator of the weak-turbulence FSO channel that emulates the influence of the temporal covariance effect. Specifically, the proposed model is based on lognormal distributed samples with a corresponding correlation time. The simulator is based on the solution of the first-order stochastic differential equation (SDE). The results of the provided SDE analysis reveal its efficacy for turbulent channel modeling.
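A minimal version of this idea is an Ornstein-Uhlenbeck (first-order) SDE for the log-amplitude, integrated with Euler-Maruyama and exponentiated to give lognormal channel gains with the desired correlation time. The scintillation strength, correlation time, and step size below are illustrative, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
dt = 1e-4                 # time step (s), illustrative
n = 200_000
tau_c = 1e-2              # channel correlation time (s), illustrative
sigma_x = 0.3             # std of the log-amplitude process, illustrative
mu_x = -sigma_x**2 / 2.0  # chosen so that the mean channel gain E[h] = 1

x = np.empty(n)
x[0] = mu_x
for k in range(n - 1):    # Euler-Maruyama integration of the first-order (OU) SDE
    x[k + 1] = (x[k] - (x[k] - mu_x) * dt / tau_c
                + sigma_x * np.sqrt(2.0 * dt / tau_c) * rng.standard_normal())

h = np.exp(x)             # correlated lognormal channel gain samples
lag = int(tau_c / dt)
rho = np.corrcoef(x[:-lag], x[lag:])[0, 1]
print(f"mean gain {h.mean():.3f} (target 1), correlation at lag tau_c {rho:.2f} (theory ~{np.exp(-1.0):.2f})")
```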
Jones, Jana L; Rinehart, Jim; Spiegel, Jacqueline Jordan; Englar, Ryane E; Sidaway, Brian K; Rowles, Joie
2018-01-01
Anesthesia simulations have been used in pre-clinical medical training for decades to help learners gain confidence and expertise in an operating room environment without danger to a live patient. The authors describe a veterinary anesthesia simulation environment (VASE) with anesthesia scenarios developed to provide a re-creation of a veterinarian's task environment while performing anesthesia. The VASE uses advanced computer technology with simulator inputs provided from standard monitoring equipment in common use during veterinary anesthesia and a commercial canine training mannequin that allows intubation, ventilation, and venous access. The simulation outputs are determined by a script that outlines routine anesthesia scenarios and describes the consequences of students' hands-on actions and interventions during preestablished anesthetic tasks and critical incidents. Patients' monitored physiologic parameters may be changed according to predetermined learner events and students' interventions to provide immediate learner feedback and clinical realism. A total of 96 students from the pre-clinical anesthesia course participated in the simulations and the pre- and post-simulation surveys evaluating students' perspectives. Results of the surveys and comparisons of overall categorical cumulative responses in the pre- and post-simulation surveys indicated improvement in learners' perceived preparedness and confidence as a result of the simulated anesthesia experience, with significant improvement in the strongly agree, moderately agree, and agree categories (p<.05 at a 95% CI). These results suggest that anesthesia simulations in the VASE may complement traditional teaching methods through experiential learning and may help foster classroom-to-clinic transference of knowledge and skills without harm to an animal.
Passive scalar entrainment and mixing in a forced, spatially-developing mixing layer
NASA Technical Reports Server (NTRS)
Lowery, P. S.; Reynolds, W. C.; Mansour, N. N.
1987-01-01
Numerical simulations are performed for the forced, spatially-developing plane mixing layer in two and three dimensions. Transport of a passive scalar field is included in the computation. This, together with the allowance for spatial development in the simulations, affords the opportunity for study of the asymmetric entrainment of irrotational fluid into the layer. The inclusion of a passive scalar field provides a means for simulating the effect of this entrainment asymmetry on the generation of 'products' from a 'fast' chemical reaction. Further, the three-dimensional simulations provide useful insight into the effect of streamwise structures on these entrainment and 'fast' reaction processes. Results from a two-dimensional simulation indicate that 1.22 parts of high-speed fluid are entrained for every one part of low-speed fluid. Inclusion of streamwise vortices at the inlet plane of a three-dimensional simulation indicates a further increase in asymmetric entrainment, to 1.44:1. Results from a final three-dimensional simulation are presented. In this case, a random velocity perturbation is imposed at the inlet plane. The results indicate the 'natural' development of the large spanwise structures characteristic of the mixing layer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stimpson, Shane G; Powers, Jeffrey J; Clarno, Kevin T
The Consortium for Advanced Simulation of Light Water Reactors (CASL) aims to provide high-fidelity, multiphysics simulations of light water reactors (LWRs) by coupling a variety of codes within the Virtual Environment for Reactor Analysis (VERA). One of the primary goals of CASL is to predict local cladding failure through pellet-clad interaction (PCI). This capability is currently being pursued through several different approaches, such as with Tiamat, which is a simulation tool within VERA that more tightly couples the MPACT neutron transport solver, the CTF thermal hydraulics solver, and the MOOSE-based Bison-CASL fuel performance code. However, the process in this paper focuses on running fuel performance calculations with Bison-CASL to predict PCI using the multicycle output data from coupled neutron transport/thermal hydraulics simulations. In recent work within CASL, Watts Bar Unit 1 has been simulated over 12 cycles using the VERA core simulator capability based on MPACT and CTF. Using the output from these simulations, Bison-CASL results can be obtained without rerunning all 12 cycles, while providing some insight into PCI indicators. Multi-cycle Bison-CASL results are presented and compared against results from the FRAPCON fuel performance code. There are several quantities of interest in considering PCI and subsequent fuel rod failures, such as the clad hoop stress and maximum centerline fuel temperature, particularly as a function of time. Bison-CASL performs single-rod simulations using representative power and temperature distributions, providing high-resolution results for these and a number of other quantities. This will assist in identifying fuel rods as potential failure locations for use in further analyses.
Simulation-based Education to Ensure Provider Competency Within the Health Care System.
Griswold, Sharon; Fralliccardi, Alise; Boulet, John; Moadel, Tiffany; Franzen, Douglas; Auerbach, Marc; Hart, Danielle; Goswami, Varsha; Hui, Joshua; Gordon, James A
2018-02-01
The acquisition and maintenance of individual competency is a critical component of effective emergency care systems. This article summarizes consensus working group deliberations and recommendations focusing on the topic "Simulation-based education to ensure provider competency within the healthcare system." The authors presented this work for discussion and feedback at the 2017 Academic Emergency Medicine Consensus Conference on "Catalyzing System Change Through Healthcare Simulation: Systems, Competency, and Outcomes," held on May 16, 2017, in Orlando, Florida. Although simulation-based training is a quality and safety imperative in other high-reliability professions such as aviation, nuclear power, and the military, health care professions still lag behind in applying simulation more broadly. This is likely a result of a number of factors, including cost, assessment challenges, and resistance to change. This consensus subgroup focused on identifying current gaps in knowledge and process related to the use of simulation for developing, enhancing, and maintaining individual provider competency. The resulting product is a research agenda informed by expert consensus and literature review. © 2017 by the Society for Academic Emergency Medicine.
Let's get honest about sampling.
Mobley, David L
2012-01-01
Molecular simulations see widespread and increasing use in computation and molecular design, especially within the area of molecular simulations applied to biomolecular binding and interactions, our focus here. However, force field accuracy remains a concern for many practitioners, and it is often not clear what level of accuracy is really needed for payoffs in a discovery setting. Here, I argue that despite limitations of today's force fields, current simulation tools and force fields now provide the potential for real benefits in a variety of applications. However, these same tools also provide irreproducible results which are often poorly interpreted. Continued progress in the field requires more honesty in assessment and care in evaluation of simulation results, especially with respect to convergence.
Hyper-X Stage Separation Trajectory Validation Studies
NASA Technical Reports Server (NTRS)
Tartabini, Paul V.; Bose, David M.; McMinn, John D.; Martin, John G.; Strovers, Brian K.
2003-01-01
An independent twelve degree-of-freedom simulation of the X-43A separation trajectory was created with the Program to Optimize Simulated trajectories (POST II). This simulation modeled the multi-body dynamics of the X-43A and its booster and included the effect of two pyrotechnically actuated pistons used to push the vehicles apart as well as aerodynamic interaction forces and moments between the two vehicles. The simulation was developed to validate trajectory studies conducted with a 14 degree-of-freedom simulation created early in the program using the Automatic Dynamic Analysis of Mechanics Systems (ADAMS) simulation software. The POST simulation was less detailed than the official ADAMS-based simulation used by the Project, but was simpler, more concise and ran faster, while providing similar results. The increase in speed provided by the POST simulation provided the Project with an alternate analysis tool. This tool was ideal for performing separation control logic trade studies that required the running of numerous Monte Carlo trajectories.
Magnetized Mini-Disk Simulations about Binary Black Holes
NASA Astrophysics Data System (ADS)
Noble, Scott; Bowen, Dennis B.; d'Ascoli, Stephane; Mewes, Vassilios; Campanelli, Manuela; Krolik, Julian
2018-01-01
Accretion disks around supermassive binary black holes offer a rare opportunity to probe the strong-field limit of dynamical gravity by using the ambient matter as a lighthouse. Accurate simulations of these systems using a variety of configurations will be critical to interpreting future observations of them. We have performed the first 3-d general relativistic magnetohydrodynamic simulations of mini-disks about a pair of equal mass black holes in the inspiral regime of their orbit. In this talk, we will present our latest results of 3-d general relativistic magnetohydrodynamic supercomputer simulations of accreting binary black holes during the post-Newtonian inspiral phase of their evolution. The goal of our work is to explore whether these systems provide a unique means to identify and characterize them with electromagnetic observations. We will provide a brief summary of the known electromagnetic signatures, in particular spectra and images obtained from post-process ray-tracing calculations of our simulation data. We will also provide a context for our results and describe our future avenues of exploration.
System analysis for the Huntsville Operational Support Center distributed computer system
NASA Technical Reports Server (NTRS)
Ingels, F. M.; Mauldin, J.
1984-01-01
The Huntsville Operations Support Center (HOSC) is a distributed computer system used to provide real-time data acquisition, analysis, and display during NASA space missions and to perform simulation and study activities during non-mission times. The primary purpose is to provide a HOSC system simulation model that is used to investigate the effects of various HOSC system configurations. Such a model would be valuable in planning the future growth of HOSC and in ascertaining the effects of data rate variations, update table broadcasting, and smart display terminal data requirements on the HOSC HYPERchannel network system. A simulation model was developed in PASCAL and results of the simulation model for various system configurations were obtained. A tutorial of the model is presented and the results of simulation runs are presented. Some very high data rate situations were simulated to observe the effects of the HYPERchannel switch-over from contention to priority mode under high channel loading.
Modelling rollover behaviour of excavator-based forest machines
M.W. Veal; S.E. Taylor; Robert B. Rummer
2003-01-01
This poster presentation provides results from analytical and computer simulation models of rollover behaviour of hydraulic excavators. These results are being used as input to the operator protective structure standards development process. Results from rigid body mechanics and computer simulation methods agree well with field rollover test data. These results show...
A Flexible System for Simulating Aeronautical Telecommunication Network
NASA Technical Reports Server (NTRS)
Maly, Kurt; Overstreet, C. M.; Andey, R.
1998-01-01
At Old Dominion University, we have built an Aeronautical Telecommunication Network (ATN) simulator, with NASA as the funding provider. It provides a means to evaluate the impact of modified router scheduling algorithms on network efficiency, to perform capacity studies on various network topologies, and to monitor and study various aspects of the ATN through a graphical user interface (GUI). In this paper we briefly describe the proposed ATN model and our abstraction of this model. We then describe our simulator architecture, highlighting some of the design specifications, scheduling algorithms, and the user interface. Finally, we provide the results of performance studies conducted on this simulator.
NASA Astrophysics Data System (ADS)
Devanand, Anjana; Ghosh, Subimal; Paul, Supantha; Karmakar, Subhankar; Niyogi, Dev
2018-06-01
Regional simulations of the seasonal Indian summer monsoon rainfall (ISMR) require an understanding of the model sensitivities to physics and resolution, and its effect on the model uncertainties. It is also important to quantify the added value in the simulated sub-regional precipitation characteristics by a regional climate model (RCM), when compared to coarse resolution rainfall products. This study presents regional model simulations of ISMR at seasonal scale using the Weather Research and Forecasting (WRF) model with the synoptic scale forcing from ERA-interim reanalysis, for three contrasting monsoon seasons, 1994 (excess), 2002 (deficit) and 2010 (normal). Impact of four cumulus schemes, viz., Kain-Fritsch (KF), Betts-Janjić-Miller, Grell 3D and modified Kain-Fritsch (KFm), and two micro physical parameterization schemes, viz., WRF Single Moment Class 5 scheme and Lin et al. scheme (LIN), with eight different possible combinations are analyzed. The impact of spectral nudging on model sensitivity is also studied. In WRF simulations using spectral nudging, improvement in model rainfall appears to be consistent in regions with topographic variability such as Central Northeast and Konkan Western Ghat sub-regions. However the results are also dependent on choice of cumulus scheme used, with KF and KFm providing relatively good performance and the eight member ensemble mean showing better results for these sub-regions. There is no consistent improvement noted in Northeast and Peninsular Indian monsoon regions. Results indicate that the regional simulations using nested domains can provide some improvements on ISMR simulations. Spectral nudging is found to improve upon the model simulations in terms of reducing the intra ensemble spread and hence the uncertainty in the model simulated precipitation. The results provide important insights regarding the need for further improvements in the regional climate simulations of ISMR for various sub-regions and contribute to the understanding of the added value in seasonal simulations by RCMs.
Interactive Simulations as Implicit Support for Guided-Inquiry
ERIC Educational Resources Information Center
Moore, Emily B.; Herzog, Timothy A.; Perkins, Katherine K.
2013-01-01
We present the results of a study designed to provide insight into interactive simulation use during guided-inquiry activities in chemistry classes. The PhET Interactive Simulations project at the University of Colorado develops interactive simulations that utilize implicit--rather than explicit--scaffolding to support student learning through…
NASA Technical Reports Server (NTRS)
Kubat, Gregory
2016-01-01
This report addresses a deliverable to the UAS-in-the-NAS project: recommendations for the integration of CNPC and ATC communications, based on analysis results from modeled radio system and NAS-wide UA communication architecture simulations. For each recommendation, a brief explanation of the rationale for its consideration is provided, along with any supporting results obtained or observed in our simulation activity.
Classical Molecular Dynamics Simulation of Nuclear Fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Devanathan, Ram; Krack, Matthias; Bertolus, Marjorie
2015-10-10
Molecular dynamics simulation is well suited to study primary damage production by irradiation, defect interactions with fission gas atoms, gas bubble nucleation, grain boundary effects on defect and gas bubble evolution in nuclear fuel, and the resulting changes in thermo-mechanical properties. In these simulations, the forces on the ions are dictated by interaction potentials generated by fitting properties of interest to experimental data. The results obtained from the present generation of potentials are qualitatively similar, but quantitatively different. There is a need to refine existing potentials to provide a better representation of the performance of polycrystalline fuel under a variety of operating conditions, and to develop models that are equipped to handle deviations from stoichiometry. In addition to providing insights into fundamental mechanisms governing the behaviour of nuclear fuel, MD simulations can also provide parameters that can be used as inputs for mesoscale models.
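The basic structure of such a classical MD simulation (pair forces from an interatomic potential, velocity-Verlet time stepping, periodic boundaries) is sketched below. Real fuel simulations use fitted ionic potentials such as Buckingham-plus-Coulomb forms for UO2; here a Lennard-Jones toy in reduced units stands in, purely to show the loop.

```python
import numpy as np

rng = np.random.default_rng(0)
n, box, dt, steps = 64, 6.0, 0.005, 500          # reduced LJ units, toy values

# Start from a perturbed 4x4x4 simple cubic lattice to avoid overlapping atoms.
g = (np.arange(4) + 0.5) * (box / 4.0)
pos = np.array([(a, b, c) for a in g for b in g for c in g])
pos += 0.05 * rng.standard_normal(pos.shape)
vel = rng.normal(0.0, 0.5, (n, 3))
vel -= vel.mean(axis=0)                          # remove center-of-mass drift

def lj_forces(pos):
    """Lennard-Jones forces and potential energy with minimum-image periodic boundaries."""
    f = np.zeros_like(pos)
    pot = 0.0
    for i in range(n - 1):
        d = pos[i + 1:] - pos[i]
        d -= box * np.round(d / box)             # minimum-image convention
        r2 = np.sum(d * d, axis=1)
        inv6 = 1.0 / r2**3
        pot += np.sum(4.0 * (inv6**2 - inv6))
        fmag = (48.0 * inv6**2 - 24.0 * inv6) / r2
        fij = d * fmag[:, None]                  # force on each particle j > i
        f[i] -= fij.sum(axis=0)
        f[i + 1:] += fij
    return f, pot

f, pot = lj_forces(pos)
e0 = 0.5 * np.sum(vel**2) + pot
for _ in range(steps):                           # velocity Verlet integration
    vel += 0.5 * dt * f
    pos = (pos + dt * vel) % box
    f, pot = lj_forces(pos)
    vel += 0.5 * dt * f
e1 = 0.5 * np.sum(vel**2) + pot
print(f"total energy: initial {e0:.2f}, final {e1:.2f} (should stay nearly constant)")
```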
Demonstration of an Aerocapture GN and C System Through Hardware-in-the-Loop Simulations
NASA Technical Reports Server (NTRS)
Masciarelli, James; Deppen, Jennifer; Bladt, Jeff; Fleck, Jeff; Lawson, Dave
2010-01-01
Aerocapture is an orbit insertion maneuver in which a spacecraft flies through a planetary atmosphere one time using drag force to decelerate and effect a hyperbolic to elliptical orbit change. Aerocapture employs a feedback Guidance, Navigation, and Control (GN&C) system to deliver the spacecraft into a precise postatmospheric orbit despite the uncertainties inherent in planetary atmosphere knowledge, entry targeting and aerodynamic predictions. Only small amounts of propellant are required for attitude control and orbit adjustments, thereby providing mass savings of hundreds to thousands of kilograms over conventional all-propulsive techniques. The Analytic Predictor Corrector (APC) guidance algorithm has been developed to steer the vehicle through the aerocapture maneuver using bank angle control. Through funding provided by NASA's In-Space Propulsion Technology Program, the operation of an aerocapture GN&C system has been demonstrated in high-fidelity simulations that include real-time hardware in the loop, thus increasing the Technology Readiness Level (TRL) of aerocapture GN&C. First, a non-real-time (NRT), 6-DOF trajectory simulation was developed for the aerocapture trajectory. The simulation included vehicle dynamics, gravity model, atmosphere model, aerodynamics model, inertial measurement unit (IMU) model, attitude control thruster torque models, and GN&C algorithms (including the APC aerocapture guidance). The simulation used the vehicle and mission parameters from the ST-9 mission. A 2000 case Monte Carlo simulation was performed and results show an aerocapture success rate of greater than 99.7%, greater than 95% of total delta-V required for orbit insertion is provided by aerodynamic drag, and post-aerocapture orbit plane wedge angle error is less than 0.5 deg (3-sigma). Then a real-time (RT), 6-DOF simulation for the aerocapture trajectory was developed which demonstrated the guidance software executing on a flight-like computer, interfacing with a simulated IMU and simulated thrusters, with vehicle dynamics provided by an external simulator. Five cases from the NRT simulations were run in the RT simulation environment. The results compare well to those of the NRT simulation thus verifying the RT simulation configuration. The results of the above described simulations show the aerocapture maneuver using the APC algorithm can be accomplished reliably and the algorithm is now at TRL-6. Flight validation is the next step for aerocapture technology development.
Cognitive simulators for medical education and training.
Kahol, Kanav; Vankipuram, Mithra; Smith, Marshall L
2009-08-01
Simulators for honing procedural skills (such as surgical skills and central venous catheter placement) have proven to be valuable tools for medical educators and students. While such simulations represent an effective paradigm in surgical education, there is an opportunity to add a layer of cognitive exercises to these basic simulations that can facilitate robust skill learning in residents. This paper describes a controlled methodology, inspired by neuropsychological assessment tasks and embodied cognition, to develop cognitive simulators for laparoscopic surgery. These simulators provide psychomotor skill training and offer the additional challenge of accomplishing cognitive tasks in realistic environments. A generic framework for design, development, and evaluation of such simulators is described. The presented framework is generalizable and can be applied to different task domains. It is independent of the types of sensors, simulation environment, and feedback mechanisms that the simulators use. A proof of concept of the framework is provided through developing a simulator that includes cognitive variations to a basic psychomotor task. The results of two pilot studies are presented that show the validity of the methodology in providing effective evaluation and learning environments for surgeons.
Perspectives on the simulation of protein–surface interactions using empirical force field methods
Latour, Robert A.
2014-01-01
Protein–surface interactions are of fundamental importance for a broad range of applications in the fields of biomaterials and biotechnology. Present experimental methods are limited in their ability to provide a comprehensive depiction of these interactions at the atomistic level. In contrast, empirical force field based simulation methods inherently provide the ability to predict and visualize protein–surface interactions with full atomistic detail. These methods, however, must be carefully developed, validated, and properly applied before confidence can be placed in results from the simulations. In this perspectives paper, I provide an overview of the critical aspects that I consider to be of greatest importance for the development of these methods, with a focus on the research that my combined experimental and molecular simulation groups have conducted over the past decade to address these issues. These critical issues include the tuning of interfacial force field parameters to accurately represent the thermodynamics of interfacial behavior, adequate sampling of these types of complex molecular systems to generate results that can be comparable with experimental data, and the generation of experimental data that can be used for simulation results evaluation and validation. PMID:25028242
MaGate Simulator: A Simulation Environment for a Decentralized Grid Scheduler
NASA Astrophysics Data System (ADS)
Huang, Ye; Brocco, Amos; Courant, Michele; Hirsbrunner, Beat; Kuonen, Pierre
This paper presents a simulator of a decentralized modular grid scheduler named MaGate. MaGate's design emphasizes scheduler interoperability by providing intelligent scheduling that serves the grid community as a whole. Each MaGate scheduler instance is able to deal with dynamic scheduling conditions, with continuously arriving grid jobs. Received jobs are either allocated on local resources or delegated to other MaGates for remote execution. The proposed MaGate simulator is based on the GridSim toolkit and the Alea simulator, and abstracts the features and behaviors of complex fundamental grid elements, such as grid jobs, grid resources, and grid users. Simulation of scheduling tasks is supported by a grid network overlay simulator executing distributed ant-based swarm intelligence algorithms to provide services such as group communication and resource discovery. For evaluation, a comparison of the behaviors of different collaborative policies among a community of MaGates is provided. Results support the use of the proposed approach as a functionally ready grid scheduler simulator.
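The local-versus-delegate decision at the heart of such collaborative policies can be caricatured in a few lines, as below: each arriving job is either kept where it arrives or handed to the least-loaded peer. This is not the MaGate/GridSim/Alea API; the scheduler capacities and job sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)

class Scheduler:
    def __init__(self, name, capacity):
        self.name, self.capacity, self.queued_work = name, capacity, 0.0
    def load(self):
        return self.queued_work / self.capacity

def run(community, n_jobs=5000, delegate=True):
    for s in community:
        s.queued_work = 0.0
    for _ in range(n_jobs):
        job = rng.exponential(10.0)                            # job size, invented
        if delegate:
            target = min(community, key=lambda s: s.load())    # hand off to least-loaded peer
        else:
            target = community[rng.integers(len(community))]   # keep the job where it arrived
        target.queued_work += job
    return max(s.load() for s in community)                    # worst backlog per unit capacity

community = [Scheduler("MaGate-A", 10.0), Scheduler("MaGate-B", 20.0), Scheduler("MaGate-C", 5.0)]
for policy in (False, True):
    print(f"delegation={policy}: max normalized backlog = {run(community, delegate=policy):.0f}")
```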
Dual Arm Work Package performance estimates and telerobot task network simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draper, J.V.; Blair, L.M.
1997-02-01
This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations.
Preliminary Study of Image Reconstruction Algorithm on a Digital Signal Processor
2014-03-01
5.2 Comparison of CPU-GPU, CPU-FPGA, and CPU-DSP Designs: The work for implementing a VHDL description of the back-projection algorithm on a physical ... FPGA was not complete. Hence, the DSP implementation results are compared with the simulated results for the VHDL design. Simulating VHDL provides an ... rather than at the software level. Depending on an application's characteristics, FPGA implementations can provide a significant performance ...
A simulation model for probabilistic analysis of Space Shuttle abort modes
NASA Technical Reports Server (NTRS)
Hage, R. T.
1993-01-01
A simulation model which was developed to provide a probabilistic analysis tool to study the various space transportation system abort mode situations is presented. The simulation model is based on Monte Carlo simulation of an event-tree diagram which accounts for events during the space transportation system's ascent and its abort modes. The simulation model considers just the propulsion elements of the shuttle system (i.e., external tank, main engines, and solid boosters). The model was developed to provide a better understanding of the probability of occurrence and successful completion of abort modes during the vehicle's ascent. The results of the simulation runs discussed are for demonstration purposes only; they are not official NASA probability estimates.
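An event-tree Monte Carlo of this kind can be sketched as follows: sample whether and when a failure occurs, map the failure time to an abort-mode window, and sample whether that abort succeeds. Mirroring the abstract's caveat, every probability and time window in the sketch is an invented placeholder, not an official estimate.

```python
import numpy as np

rng = np.random.default_rng(2024)
n_trials = 200_000
ascent_time = 510.0                     # seconds, placeholder
p_engine_failure = 0.01                 # per-flight failure probability, placeholder

# (abort mode, window start, window end, probability the abort succeeds) - all placeholders
modes = [("RTLS", 0.0, 240.0, 0.90),
         ("TAL", 240.0, 400.0, 0.95),
         ("ATO", 400.0, ascent_time, 0.99)]

outcomes = {"nominal": 0, "loss": 0}
outcomes.update({name: 0 for name, *_ in modes})
for _ in range(n_trials):
    if rng.random() > p_engine_failure:
        outcomes["nominal"] += 1
        continue
    t_fail = rng.uniform(0.0, ascent_time)          # failure time, uniform over ascent
    for name, t0, t1, p_success in modes:
        if t0 <= t_fail < t1:
            outcomes[name if rng.random() < p_success else "loss"] += 1
            break

for outcome, count in outcomes.items():
    print(f"{outcome:8s}: {count / n_trials:.5f}")
```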
Pulse Shaped Constant Envelope 8-PSK Modulation Study
NASA Technical Reports Server (NTRS)
Tao, Jianping; Horan, Sheila
1997-01-01
This report provides simulation results for constant envelope pulse shaped 8 Level Phase Shift Keying (8 PSK) modulation for end-to-end system performance. In order to increase bandwidth utilization, pulse shaping is applied to signals before they are modulated. This report provides simulation results of power spectra and measurements of bit errors produced by pulse shaping in a non-linear channel with Additive White Gaussian Noise (AWGN). The pulse shaping filters can be placed before (Type B) or after (Type A) the signals are modulated. Three kinds of baseband filters, 5th order Butterworth, 3rd order Bessel, and Square-Root Raised Cosine with different BTs or roll-off factors, are utilized in the simulations. The simulations were performed on a Signal Processing Worksystem (SPW).
Gal, Gilad Ben; Weiss, Ervin I; Gafni, Naomi; Ziv, Amitai
2011-04-01
Virtual reality force feedback simulators provide a haptic (sense of touch) feedback through the device being held by the user. The simulator's goal is to provide a learning experience resembling reality. A newly developed haptic simulator (IDEA Dental, Las Vegas, NV, USA) was assessed in this study. Our objectives were to assess the simulator's ability to serve as a tool for dental instruction, self-practice, and student evaluation, as well as to evaluate the sensation it provides. A total of thirty-three evaluators were divided into two groups. The first group consisted of twenty-one experienced dental educators; the second consisted of twelve fifth-year dental students. Each participant performed drilling tasks using the simulator and filled out a questionnaire regarding the simulator and potential ways of using it in dental education. The results show that experienced dental faculty members as well as advanced dental students found that the simulator could provide significant potential benefits in the teaching and self-learning of manual dental skills. Development of the simulator's tactile sensation is needed to attune it to genuine sensation. Further studies relating to aspects of the simulator's structure and its predictive validity, its scoring system, and the nature of the performed tasks should be conducted.
NASA Astrophysics Data System (ADS)
Lynch, Cheryl L.; Graham, Geoff M.; Popovic, Milos R.
2011-08-01
Functional electrical stimulation (FES) applications are frequently evaluated in simulation prior to testing in human subjects. Such simulations are usually based on the typical muscle responses to electrical stimulation, which may result in an overly optimistic assessment of likely real-world performance. We propose a novel method for simulating FES applications that includes non-ideal muscle behaviour during electrical stimulation resulting from muscle fatigue, spasms and tremors. A 'non-idealities' block that can be incorporated into existing FES simulations and provides a realistic estimate of real-world performance is described. An implementation example is included, showing how the non-idealities block can be incorporated into a simulation of electrically stimulated knee extension against gravity for both a proportional-integral-derivative controller and a sliding mode controller. The results presented in this paper illustrate that the real-world performance of an FES system may be vastly different from the performance obtained in simulation using nominal muscle models. We believe that our non-idealities block should be included in future simulations that involve muscle response to FES, as this tool will provide neural engineers with a realistic simulation of the real-world performance of FES systems. This simulation strategy will help engineers and organizations save time and money by preventing premature human testing. The non-idealities block will become available free of charge at www.toronto-fes.ca in late 2011.
Ground motion simulations in Marmara (Turkey) region from 3D finite difference method
NASA Astrophysics Data System (ADS)
Aochi, Hideo; Ulrich, Thomas; Douglas, John
2016-04-01
In the framework of the European project MARSite (2012-2016), one of the main contributions from our research team was to provide ground-motion simulations for the Marmara region from various earthquake source scenarios. We adopted a 3D finite difference code, taking into account the 3D structure around the Sea of Marmara (including the bathymetry) and the sea layer. We simulated two moderate earthquakes (about Mw4.5) and found that the 3D structure significantly improves the waveforms compared to a 1D layered model. Simulations were carried out for different earthquakes (moderate point sources and large finite sources) in order to provide shake maps (Aochi and Ulrich, BSSA, 2015), to study the variability of ground-motion parameters (Douglas & Aochi, BSSA, 2016), and to provide synthetic seismograms for the blind inversion tests (Diao et al., GJI, 2016). The results are also planned to be integrated in broadband ground-motion simulations, tsunami generation, and simulations of triggered landslides (in progress by different partners). The simulations are freely shared among the partners via the internet and visualization of the results is provided on the project's homepage. All these simulations should be seen as a reference for this region, as they are based on the latest knowledge obtained during the MARSite project, although refinement and validation of the model parameters and simulations remain a continuing research task relying on ongoing observations. The numerical code used, the models, and the simulations are available on demand.
Monte Carlo simulations in Nuclear Medicine
NASA Astrophysics Data System (ADS)
Loudos, George K.
2007-11-01
Molecular imaging technologies provide unique abilities to localise signs of disease before symptoms appear, assist in drug testing, optimize and personalize therapy, and assess the efficacy of treatment regimes for different types of cancer. Monte Carlo simulation packages are used as an important tool for the optimal design of detector systems. In addition, they have demonstrated potential to improve image quality and acquisition protocols. Many general-purpose codes (MCNP, Geant4, etc.) and dedicated codes (SimSET, etc.) have been developed with the aim of providing accurate and fast results. Special emphasis will be given to the GATE toolkit. The GATE code, currently under development by the OpenGATE collaboration, is the most accurate and promising code for performing realistic simulations. The purpose of this article is to introduce the non-expert reader to the current status of MC simulations in nuclear medicine, to briefly provide examples of currently simulated systems, and to present future challenges that include simulation of clinical studies and dosimetry applications.
Johnson, R.H.; Poeter, E.P.
2007-01-01
Perchloroethylene (PCE) saturations determined from GPR surveys were used as observations for inversion of multiphase flow simulations of a PCE injection experiment (Borden 9 m cell), allowing for the estimation of optimal bulk intrinsic permeability values. The resulting fit statistics and analysis of residuals (observed minus simulated PCE saturations) were used to improve the conceptual model. These improvements included adjustment of the elevation of a permeability contrast, use of the van Genuchten versus Brooks-Corey capillary pressure-saturation curve, and a weighting scheme to account for greater measurement error with larger saturation values. A limitation in determining PCE saturations through one-dimensional GPR modeling is non-uniqueness when multiple GPR parameters are unknown (i.e., permittivity, depth, and gain function). Site knowledge, fixing the gain function, and multiphase flow simulations assisted in evaluating non-unique conceptual models of PCE saturation, where depth and layering were reinterpreted to provide alternate conceptual models. Remaining bias in the residuals is attributed to the violation of assumptions in the one-dimensional GPR interpretation (which assumes flat, infinite, horizontal layering) resulting from multidimensional influences that were not included in the conceptual model. While the limitations and errors in using GPR data as observations for inverse multiphase flow simulations are frustrating and difficult to quantify, simulation results indicate that the error and bias in the PCE saturation values are small enough to still provide reasonable optimal permeability values. The effort to improve model fit and reduce residual bias decreases simulation error even for an inversion based on biased observations and provides insight into alternate GPR data interpretations. Thus, this effort is warranted and provides information on bias in the observation data when this bias is otherwise difficult to assess. © 2006 Elsevier B.V. All rights reserved.
Simulated Consulting Experiences in Counselor Preparation
ERIC Educational Resources Information Center
Panther, Edward E.
1971-01-01
Simulation, using role playing and commercially available materials, was used to provide counselor teacher consultation experience for counselor trainees. The results of the program supported the use of simulation as a technique for counselor education. Implications for counselor education programs are discussed. (Author/CG)
Building Better Planet Populations for EXOSIMS
NASA Astrophysics Data System (ADS)
Garrett, Daniel; Savransky, Dmitry
2018-01-01
The Exoplanet Open-Source Imaging Mission Simulator (EXOSIMS) software package simulates ensembles of space-based direct imaging surveys to provide a variety of science and engineering yield distributions for proposed mission designs. These mission simulations rely heavily on assumed distributions of planetary population parameters including semi-major axis, planetary radius, eccentricity, albedo, and orbital orientation to provide heuristics for target selection and to simulate planetary systems for detection and characterization. The distributions are encoded in PlanetPopulation modules within EXOSIMS which are selected by the user in the input JSON script when a simulation is run. The earliest written PlanetPopulation modules available in EXOSIMS are based on planet population models where the planetary parameters are considered to be independent from one another. While independent parameters allow for quick computation of heuristics and sampling for simulated planetary systems, results from planet-finding surveys have shown that many parameters (e.g., semi-major axis/orbital period and planetary radius) are not independent. We present new PlanetPopulation modules for EXOSIMS which are built on models based on planet-finding survey results where semi-major axis and planetary radius are not independent and provide methods for sampling their joint distribution. These new modules enhance the ability of EXOSIMS to simulate realistic planetary systems and give more realistic science yield distributions.
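The key idea in this abstract is replacing independent draws of semi-major axis and planetary radius with samples from a joint distribution. The sketch below illustrates one way to do that (sample the semi-major-axis marginal, then a radius distribution conditioned on it); it is not the EXOSIMS PlanetPopulation API, and the power-law forms, exponents, and limits are hypothetical, not fitted occurrence rates.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sma(n, a_min=0.1, a_max=30.0, alpha=-0.6):
    """Sample semi-major axis a [AU] from a truncated power law dN/da ~ a**alpha
    via the inverse CDF (illustrative form and exponent, not a published fit)."""
    u = rng.uniform(size=n)
    k = alpha + 1.0
    return (u * (a_max**k - a_min**k) + a_min**k) ** (1.0 / k)

def sample_radius_given_sma(a, r_min=0.5, r_max=16.0):
    """Sample planetary radius R [Earth radii] from a conditional power law whose
    exponent depends on a, so R and a are not independent (again illustrative)."""
    beta = -1.5 - 0.2 * np.log10(a)          # hypothetical coupling to semi-major axis
    u = rng.uniform(size=a.size)
    k = beta + 1.0
    return (u * (r_max**k - r_min**k) + r_min**k) ** (1.0 / k)

a = sample_sma(100_000)
r = sample_radius_given_sma(a)
print(np.corrcoef(np.log10(a), np.log10(r))[0, 1])   # nonzero: the two parameters are coupled
```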
Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas
2011-12-15
The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined.
Reducing EnergyPlus Run Time For Code Compliance Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Athalye, Rahul A.; Gowri, Krishnan; Schultz, Robert W.
2014-09-12
Integration of the EnergyPlus™ simulation engine into performance-based code compliance software raises a concern about simulation run time, which impacts timely feedback of compliance results to the user. EnergyPlus annual simulations for proposed and code baseline building models, and mechanical equipment sizing, result in simulation run times beyond acceptable limits. This paper presents a study that compares the results of a shortened simulation time period using 4 weeks of hourly weather data (one per quarter) to an annual simulation using the full 52 weeks of hourly weather data. Three representative building types based on DOE Prototype Building Models and three climate zones were used for determining the validity of using a shortened simulation run period. Further sensitivity analysis and run time comparisons were made to evaluate the robustness and run time savings of using this approach. The results of this analysis show that the shortened simulation run period provides compliance index calculations within 1% of those predicted using annual simulation results, and typically saves about 75% of simulation run time.
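A minimal sketch of the shortened-run idea described above, assuming each simulated week is scaled by 13 to stand in for its quarter; the weekly energy values and baseline ratio are hypothetical placeholders, not EnergyPlus output or code-compliance rules.

```python
# Annualize results from four representative one-week runs (one per quarter) as an
# approximation to a full 52-week simulation. Values below are hypothetical.
WEEKS_PER_QUARTER = 13

weekly_energy_kwh = {        # simulated site energy for one representative week per quarter
    "Q1": 2150.0,            # winter week
    "Q2": 1480.0,            # spring week
    "Q3": 2620.0,            # summer week (cooling-dominated)
    "Q4": 1890.0,            # autumn week
}

annual_estimate = WEEKS_PER_QUARTER * sum(weekly_energy_kwh.values())

# Compliance index as a ratio of proposed-building to baseline-building energy;
# both are annualized the same way, so much of the scaling error cancels.
baseline_annual_estimate = 1.08 * annual_estimate    # hypothetical baseline
compliance_index = annual_estimate / baseline_annual_estimate
print(round(annual_estimate, 1), round(compliance_index, 3))
```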
PyNN: A Common Interface for Neuronal Network Simulators.
Davison, Andrew P; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre
2008-01-01
Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN.
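As a concrete illustration of the "write the script once, run it on any supported backend" idea, here is a minimal PyNN-style script. It follows the interface documented for later PyNN releases (roughly the 0.8/0.9 API), so argument names may differ from the version described in this paper; switching simulators amounts to changing the import line.

```python
# Minimal PyNN-style script (sketch following the published PyNN interface;
# exact argument names may differ between PyNN versions).
import pyNN.nest as sim        # swap for pyNN.neuron (or another backend) to change simulator

sim.setup(timestep=0.1)        # ms

# 100 integrate-and-fire neurons driven one-to-one by Poisson spike sources.
exc = sim.Population(100, sim.IF_cond_exp(tau_m=20.0), label="excitatory")
noise = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0))

sim.Projection(noise, exc,
               sim.OneToOneConnector(),
               sim.StaticSynapse(weight=0.005, delay=1.0))

exc.record("spikes")
sim.run(1000.0)                # ms

data = exc.get_data().segments[0]       # Neo Block -> first Segment
print(sum(len(st) for st in data.spiketrains), "spikes recorded")
sim.end()
```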
PyNN: A Common Interface for Neuronal Network Simulators
Davison, Andrew P.; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre
2008-01-01
Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN. PMID:19194529
2011-01-01
Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research can be accurately described and combined. PMID:22172142
Establishing a Novel Modeling Tool: A Python-Based Interface for a Neuromorphic Hardware System
Brüderle, Daniel; Müller, Eric; Davison, Andrew; Muller, Eilif; Schemmel, Johannes; Meier, Karlheinz
2008-01-01
Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated. PMID:19562085
Establishing a novel modeling tool: a python-based interface for a neuromorphic hardware system.
Brüderle, Daniel; Müller, Eric; Davison, Andrew; Muller, Eilif; Schemmel, Johannes; Meier, Karlheinz
2009-01-01
Neuromorphic hardware systems provide new possibilities for the neuroscience modeling community. Due to the intrinsic parallelism of the micro-electronic emulation of neural computation, such models are highly scalable without a loss of speed. However, the communities of software simulator users and neuromorphic engineering in neuroscience are rather disjoint. We present a software concept that provides the possibility to establish such hardware devices as valuable modeling tools. It is based on the integration of the hardware interface into a simulator-independent language which allows for unified experiment descriptions that can be run on various simulation platforms without modification, implying experiment portability and a huge simplification of the quantitative comparison of hardware and simulator results. We introduce an accelerated neuromorphic hardware device and describe the implementation of the proposed concept for this system. An example setup and results acquired by utilizing both the hardware system and a software simulator are demonstrated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCaskey, Alexander J.
There is a lack of state-of-the-art quantum computing simulation software that scales on heterogeneous systems like Titan. Tensor Network Quantum Virtual Machine (TNQVM) provides a quantum simulator that leverages a distributed network of GPUs to simulate quantum circuits in a manner that leverages recent results from tensor network theory.
Global Simulation of Aviation Operations
NASA Technical Reports Server (NTRS)
Sridhar, Banavar; Sheth, Kapil; Ng, Hok Kwan; Morando, Alex; Li, Jinhua
2016-01-01
The simulation and analysis of global air traffic is limited by a lack of simulation tools and the difficulty of accessing data sources. This paper provides a global simulation of aviation operations combining flight plans and real air traffic data with historical commercial city-pair aircraft type and schedule data and global atmospheric data. The resulting capability extends the simulation and optimization functions of NASA's Future Air Traffic Management Concept Evaluation Tool (FACET) to global scale. This new capability is used to present results on the evolution of global air traffic patterns from a concentration of traffic inside the US, Europe, and across the Atlantic Ocean to a more diverse traffic pattern across the globe with accelerated growth in Asia, Australia, Africa, and South America. The simulation analyzes seasonal variation in the long-haul wind-optimal traffic patterns in six major regions of the world and provides potential time savings of wind-optimal routes compared with either great-circle routes or, where available, current flight plans.
Method for simulating discontinuous physical systems
Baty, Roy S.; Vaughn, Mark R.
2001-01-01
The mathematical foundations of conventional numerical simulation of physical systems provide no consistent description of the behavior of such systems when subjected to discontinuous physical influences. As a result, the numerical simulation of such problems requires ad hoc encoding of specific experimental results in order to address the behavior of such discontinuous physical systems. In the present invention, these foundations are replaced by a new combination of generalized function theory and nonstandard analysis. The result is a class of new approaches to the numerical simulation of physical systems which allows the accurate and well-behaved simulation of discontinuous and other difficult physical systems, as well as simpler physical systems. Applications of this new class of numerical simulation techniques to process control, robotics, and apparatus design are outlined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le, Hai D.
2017-03-02
SimEngine provides the core functionalities and components that are key to the development of discrete event simulation tools. These include events, activities, event queues, random number generators, and basic result tracking classes. SimEngine was designed for high performance, integrates seamlessly into any Microsoft .Net development environment, and provides a flexible API for simulation developers.
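The components listed above (events, activities, an event queue, random number generation, and result tracking) can be illustrated with a tiny discrete-event core. The Python sketch below is a generic illustration of that structure, not the SimEngine .NET API; the class and function names are hypothetical.

```python
import heapq, random

class Simulator:
    """Tiny discrete-event core: a time-ordered event queue plus result tracking."""
    def __init__(self, seed=42):
        self.now = 0.0
        self._queue = []                 # heap of (time, sequence, callback)
        self._seq = 0                    # tie-breaker so the heap never compares callbacks
        self.rng = random.Random(seed)
        self.results = []                # simple result-tracking store

    def schedule(self, delay, callback):
        heapq.heappush(self._queue, (self.now + delay, self._seq, callback))
        self._seq += 1

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, callback = heapq.heappop(self._queue)
            callback(self)

# Example activity: jobs arrive as a Poisson process with exponential service times.
def arrival(sim):
    service = sim.rng.expovariate(1 / 3.0)                 # mean service time 3.0
    sim.results.append((sim.now, service))
    sim.schedule(sim.rng.expovariate(1 / 5.0), arrival)    # next arrival, mean gap 5.0

sim = Simulator()
sim.schedule(0.0, arrival)
sim.run(until=100.0)
print(len(sim.results), "arrivals by t=100")
```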
Tullis, Terry. E.; Richards-Dinger, Keith B.; Barall, Michael; Dieterich, James H.; Field, Edward H.; Heien, Eric M.; Kellogg, Louise; Pollitz, Fred F.; Rundle, John B.; Sachs, Michael K.; Turcotte, Donald L.; Ward, Steven N.; Yikilmaz, M. Burak
2012-01-01
In order to understand earthquake hazards we would ideally have a statistical description of earthquakes for tens of thousands of years. Unfortunately the ∼100‐year instrumental, several 100‐year historical, and few 1000‐year paleoseismological records are woefully inadequate to provide a statistically significant record. Physics‐based earthquake simulators can generate arbitrarily long histories of earthquakes; thus they can provide a statistically meaningful history of simulated earthquakes. The question is, how realistic are these simulated histories? The purpose of this paper is to begin to answer that question. We compare the results between different simulators and with information that is known from the limited instrumental, historic, and paleoseismological data. As expected, the results from all the simulators show that the observational record is too short to properly represent the system behavior; therefore, although tests of the simulators against the limited observations are necessary, they are not a sufficient test of the simulators’ realism. The simulators appear to pass this necessary test. In addition, the physics‐based simulators show similar behavior even though there are large differences in the methodology. This suggests that they represent realistic behavior. Different assumptions concerning the constitutive properties of the faults do result in enhanced capabilities of some simulators. However, it appears that the similar behavior of the different simulators may result from the fault‐system geometry, slip rates, and assumed strength drops, along with the shared physics of stress transfer. This paper describes the results of running four earthquake simulators that are described elsewhere in this issue of Seismological Research Letters. The simulators ALLCAL (Ward, 2012), VIRTCAL (Sachs et al., 2012), RSQSim (Richards‐Dinger and Dieterich, 2012), and ViscoSim (Pollitz, 2012) were run on our most recent all‐California fault model, allcal2. With the exception of ViscoSim, which ran for 10,000 years, all the simulators ran for 30,000 years. Presentations containing content similar to this paper can be found at http://scec.usc.edu/research/eqsims/.
H2LIFT: global navigation simulation ship tracking and WMD detection in the maritime domain
NASA Astrophysics Data System (ADS)
Wyffels, Kevin
2007-04-01
This paper presents initial results for a tracking simulation of multiple maritime vehicles for use in a data fusion program detecting Weapons of Mass Destruction (WMD). This simulation supports a fusion algorithm (H2LIFT) for collecting and analyzing data, providing a heuristic analysis tool for detecting weapons of mass destruction in the maritime domain. Tools required to develop a navigational simulation fitting a set of project objectives are introduced for integration into the H2LIFT algorithm. Emphasis is placed on the specific requirements of the H2LIFT project; however, the basic equations, algorithms, and methodologies can be used as tools in a variety of scenario simulations. Discussion is focused on track modeling (e.g., position tracking of ships), navigational techniques, WMD detection, and simulation of these models using Matlab and Simulink. Initial results provide absolute ship position data for a given multi-ship maritime scenario with random generation of a given ship containing a WMD. Required coordinate systems, conversions between coordinate systems, Earth modeling techniques, and navigational conventions and techniques are introduced for development of the simulations.
MOCCA code for star cluster simulation: comparison with optical observations using COCOA
NASA Astrophysics Data System (ADS)
Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Olech, Arkadiusz; Hypki, Arkadiusz
2016-02-01
We introduce and present preliminary results from COCOA (Cluster simulatiOn Comparison with ObservAtions) code for a star cluster after 12 Gyr of evolution simulated using the MOCCA code. The COCOA code is being developed to quickly compare results of numerical simulations of star clusters with observational data. We use COCOA to obtain parameters of the projected cluster model. For comparison, a FITS file of the projected cluster was provided to observers so that they could use their observational methods and techniques to obtain cluster parameters. The results show that the similarity of cluster parameters obtained through numerical simulations and observations depends significantly on the quality of observational data and photometric accuracy.
A comparison of solute-transport solution techniques based on inverse modelling results
Mehl, S.; Hill, M.C.
2000-01-01
Five common numerical techniques (finite difference, predictor-corrector, total-variation-diminishing, method-of-characteristics, and modified-method-of-characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using randomly distributed homogeneous blocks of five sand types. This experimental model provides an outstanding opportunity to compare the solution techniques because of the heterogeneous hydraulic conductivity distribution of known structure, and the availability of detailed measurements with which to compare simulated concentrations. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation, given the different methods of simulating solute transport. The results show that simulated peak concentrations, even at very fine grid spacings, varied because of different amounts of numerical dispersion. Sensitivity analysis results were robust in that they were independent of the solution technique. They revealed extreme correlation between hydraulic conductivity and porosity, and that the breakthrough curve data did not provide enough information about the dispersivities to estimate individual values for the five sands. However, estimated hydraulic conductivity values are significantly influenced by both the large possible variations in model dispersion and the amount of numerical dispersion present in the solution technique.
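The finding that simulated peak concentrations vary because of numerical dispersion can be illustrated with a one-dimensional toy: a first-order upwind advection scheme smears a sharp concentration pulse even though the governing equation contains no physical dispersion. The grid, velocity, and time-step values below are arbitrary and unrelated to the sand-tank experiment.

```python
import numpy as np

# 1D advection of a concentration pulse with first-order upwind differencing.
# The smearing of the peak is numerical dispersion, not physical dispersion, and
# its size depends on the scheme and grid spacing.
nx, dx, v, dt, nsteps = 200, 1.0, 0.5, 1.0, 200
courant = v * dt / dx                        # 0.5, stable for the upwind scheme

c = np.zeros(nx)
c[20:30] = 1.0                               # initial rectangular pulse

for _ in range(nsteps):
    c[1:] = c[1:] - courant * (c[1:] - c[:-1])   # upwind update (flow in +x direction)
    c[0] = 0.0                                   # clean inflow boundary

print("peak after advection:", round(c.max(), 3))  # < 1.0 purely from numerical smearing
```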
Modelling the spread of innovation in wild birds.
Shultz, Thomas R; Montrey, Marcel; Aplin, Lucy M
2017-06-01
We apply three plausible algorithms in agent-based computer simulations to recent experiments on social learning in wild birds. Although some of the phenomena are simulated by all three learning algorithms, several manifestations of social conformity bias are simulated by only the approximate majority (AM) algorithm, which has roots in chemistry, molecular biology and theoretical computer science. The simulations generate testable predictions and provide several explanatory insights into the diffusion of innovation through a population. The AM algorithm's success raises the possibility of its usefulness in studying group dynamics more generally, in several different scientific domains. Our differential-equation model matches simulation results and provides mathematical insights into the dynamics of these algorithms. © 2017 The Author(s).
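For readers unfamiliar with the approximate majority (AM) algorithm mentioned above, the sketch below implements the classic three-state AM protocol from population-protocol and chemical-reaction-network theory (two opinions X and Y plus a blank state B). It is a generic illustration of that protocol, not the agent-based learning model used in the paper, and the population sizes are arbitrary.

```python
import random

def approximate_majority(n_x=60, n_y=40, seed=0, max_steps=100_000):
    """Classic three-state approximate-majority protocol: when X meets Y the
    responder goes blank (B); a blank adopts the state of a decided partner.
    The initial majority wins with high probability."""
    rng = random.Random(seed)
    pop = ["X"] * n_x + ["Y"] * n_y
    for _ in range(max_steps):
        i, j = rng.sample(range(len(pop)), 2)        # random ordered interaction pair
        a, b = pop[i], pop[j]
        if {a, b} == {"X", "Y"}:                     # conflict: responder goes blank
            pop[j] = "B"
        elif "B" in (a, b) and a != b:               # blank copies its decided partner
            decided = a if a != "B" else b
            pop[i] = pop[j] = decided
        if "X" not in pop or "Y" not in pop:         # outcome is now decided
            break
    return pop.count("X"), pop.count("Y"), pop.count("B")

print(approximate_majority())                        # expect the initial majority (X) to win
```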
AESS: Accelerated Exact Stochastic Simulation
NASA Astrophysics Data System (ADS)
Jenkins, David D.; Peterson, Gregory D.
2011-12-01
The Stochastic Simulation Algorithm (SSA) developed by Gillespie provides a powerful mechanism for exploring the behavior of chemical systems with small species populations or with important noise contributions. Gene circuit simulations for systems biology commonly employ the SSA method, as do ecological applications. This algorithm tends to be computationally expensive, so researchers seek an efficient implementation of SSA. In this program package, the Accelerated Exact Stochastic Simulation Algorithm (AESS) contains optimized implementations of Gillespie's SSA that improve the performance of individual simulation runs or ensembles of simulations used for sweeping parameters or to provide statistically significant results. Program summary: Program title: AESS. Catalogue identifier: AEJW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: University of Tennessee copyright agreement. No. of lines in distributed program, including test data, etc.: 10 861. No. of bytes in distributed program, including test data, etc.: 394 631. Distribution format: tar.gz. Programming language: C for processors, CUDA for NVIDIA GPUs. Computer: Developed and tested on various x86 computers and NVIDIA C1060 Tesla and GTX 480 Fermi GPUs. The system targets x86 workstations, optionally with multicore processors or NVIDIA GPUs as accelerators. Operating system: Tested under Ubuntu Linux OS and CentOS 5.5 Linux OS. Classification: 3, 16.12. Nature of problem: Simulation of chemical systems, particularly with low species populations, can be accurately performed using Gillespie's method of stochastic simulation. Numerous variations on the original stochastic simulation algorithm have been developed, including approaches that produce results with statistics that exactly match the chemical master equation (CME) as well as other approaches that approximate the CME. Solution method: The Accelerated Exact Stochastic Simulation (AESS) tool provides implementations of a wide variety of popular variations on the Gillespie method. Users can select the specific algorithm considered most appropriate. Comparisons between the methods and with other available implementations indicate that AESS provides the fastest known implementation of Gillespie's method for a variety of test models. Users may wish to execute ensembles of simulations to sweep parameters or to obtain better statistical results, so AESS supports acceleration of ensembles of simulation using parallel processing with MPI, SSE vector units on x86 processors, and/or using NVIDIA GPUs with CUDA.
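For orientation, the sketch below is a minimal Python implementation of Gillespie's direct method, the algorithm that AESS implements in optimized C and CUDA; the reversible dimerization system and its rate constants are illustrative only.

```python
import random

def gillespie_direct(x, stoich, propensity, t_end, seed=0):
    """Gillespie's direct method: draw the time to the next reaction from an
    exponential with rate a0, then pick which reaction fires with probability a_j/a0."""
    rng = random.Random(seed)
    t, history = 0.0, [(0.0, tuple(x))]
    while t < t_end:
        a = [p(x) for p in propensity]
        a0 = sum(a)
        if a0 == 0.0:
            break                                   # no reaction can fire
        t += rng.expovariate(a0)                    # time to next reaction
        r, target = rng.random() * a0, 0.0
        for j, aj in enumerate(a):                  # select which reaction fires
            target += aj
            if r < target:
                x = [xi + s for xi, s in zip(x, stoich[j])]
                break
        history.append((t, tuple(x)))
    return history

# Reversible dimerization A + A <-> A2 with illustrative rate constants.
k1, k2 = 0.005, 0.1
stoich = [(-2, +1), (+2, -1)]                       # state change for [A, A2]
propensity = [lambda x: k1 * x[0] * (x[0] - 1),     # A + A -> A2
              lambda x: k2 * x[1]]                  # A2 -> A + A
print(gillespie_direct([200, 0], stoich, propensity, t_end=10.0)[-1])
```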
Incorporation of a Cumulus Fraction Scheme in the GRAPES_Meso and Evaluation of Its Performance
NASA Astrophysics Data System (ADS)
Zheng, X.
2016-12-01
Accurate simulation of cloud cover fraction is a key and difficult issue in numerical modeling studies. Preliminary evaluations have indicated that cloud fraction is generally underestimated in GRAPES_Meso simulations, while the cloud fraction scheme (CFS) of ECMWF can provide more realistic results. Therefore, the ECMWF cumulus fraction scheme is introduced into GRAPES_Meso to replace the original CFS, and the model performance with the new CFS is evaluated based on simulated three-dimensional cloud fractions and surface temperature. Results indicate that with the new CFS the simulated cloud fractions increase and become more accurate, the simulated vertical cloud structure improves, and errors in the simulated surface temperature decrease. These results suggest that the new CFS has a positive impact on the simulation of cloud fraction and surface temperature.
Flight dynamics analysis and simulation of heavy lift airships. Volume 2: Technical manual
NASA Technical Reports Server (NTRS)
Ringland, R. F.; Tischler, M. B.; Jex, H. R.; Emmen, R. D.; Ashkenas, I. L.
1982-01-01
The mathematical models embodied in the simulation are described in considerable detail and with supporting evidence for the model forms chosen. In addition the trimming and linearization algorithms used in the simulation are described. Appendices to the manual identify reference material for estimating the needed coefficients for the input data and provide example simulation results.
Multi-scale gyrokinetic simulations of an Alcator C-Mod, ELM-y H-mode plasma
NASA Astrophysics Data System (ADS)
Howard, N. T.; Holland, C.; White, A. E.; Greenwald, M.; Rodriguez-Fernandez, P.; Candy, J.; Creely, A. J.
2018-01-01
High fidelity, multi-scale gyrokinetic simulations capable of capturing both ion-scale (k_θρ_s ∼ O(1)) and electron-scale (k_θρ_e ∼ O(1)) turbulence were performed in the core of an Alcator C-Mod ELM-y H-mode discharge which exhibits reactor-relevant characteristics. These simulations, performed with all experimental inputs and a realistic ion-to-electron mass ratio ((m_i/m_e)^(1/2) = 60.0), provide insight into the physics fidelity that may be needed for accurate simulation of the core of fusion reactor discharges. Three multi-scale simulations and a series of separate ion- and electron-scale simulations performed using the GYRO code (Candy and Waltz 2003 J. Comput. Phys. 186 545) are presented. As with earlier multi-scale results in L-mode conditions (Howard et al 2016 Nucl. Fusion 56 014004), both ion-scale and multi-scale simulation results are compared with experimentally inferred ion and electron heat fluxes, as well as with the measured values of electron incremental thermal diffusivities, which are indicative of the experimental electron temperature profile stiffness. Consistent with the L-mode results, cross-scale coupling is found to play an important role in the simulation of these H-mode conditions. Extremely stiff ion-scale transport is observed in these high-performance conditions, which is shown to likely play an important role in the reproduction of measurements of perturbative transport. These results provide important insight into the role of multi-scale plasma turbulence in the core of reactor-relevant plasmas and establish important constraints on the fidelity of models needed for predictive simulations.
Determination of elastomeric foam parameters for simulations of complex loading.
Petre, M T; Erdemir, A; Cavanagh, P R
2006-08-01
Finite element (FE) analysis has shown promise for the evaluation of elastomeric foam personal protection devices. Although appropriate representation of foam materials is necessary in order to obtain realistic simulation results, material definitions used in the literature vary widely and often fail to account for the multi-mode loading experienced by these devices. This study aims to provide a library of elastomeric foam material parameters that can be used in FE simulations of complex loading scenarios. Twelve foam materials used in footwear were tested in uni-axial compression, simple shear and volumetric compression. For each material, parameters for a common compressible hyperelastic material model used in FE analysis were determined using: (a) compression; (b) compression and shear data; and (c) data from all three tests. Material parameters and Drucker stability limits for the best fits are provided with their associated errors. The material model was able to reproduce deformation modes for which data was provided during parameter determination but was unable to predict behavior in other deformation modes. Simulation results were found to be highly dependent on the extent of the test data used to determine the parameters in the material definition. This finding calls into question the many published results of simulations of complex loading that use foam material parameters obtained from a single mode of testing. The library of foam parameters developed here presents associated errors in three deformation modes that should provide for a more informed selection of material parameters.
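The central point above, that parameters fitted to a single deformation mode fail to predict other modes, is easiest to see by fitting one parameter set to all modes at once. The sketch below shows a generic combined least-squares fit across compression, shear, and volumetric data; the one-term stress models and synthetic "test data" are placeholders, not the compressible hyperelastic (hyperfoam) law or the measurements used in this study.

```python
import numpy as np
from scipy.optimize import least_squares

# Placeholder one-term stress models for each deformation mode (NOT the compressible
# hyperelastic law used in the paper): a shared stiffness-like parameter mu plus
# mode-specific exponents, purely to illustrate a combined multi-mode fit.
def model(params, strain, mode):
    mu, n_c, n_s = params
    if mode == "compression":
        return mu * strain ** n_c
    if mode == "shear":
        return 0.5 * mu * strain ** n_s
    return 2.0 * mu * strain                      # "volumetric": linear placeholder

# Synthetic noisy data standing in for the three experiments (illustrative only).
rng = np.random.default_rng(1)
data = {}
for mode in ("compression", "shear", "volumetric"):
    strain = np.linspace(0.01, 0.5, 25)
    data[mode] = (strain, model((1.2, 1.4, 1.1), strain, mode)
                  + rng.normal(0.0, 0.01, strain.size))

def residuals(params):
    # Stack residuals from all modes so one parameter set must fit them jointly.
    return np.concatenate([model(params, s, mode) - obs
                           for mode, (s, obs) in data.items()])

fit = least_squares(residuals, x0=(1.0, 1.0, 1.0))
print("fitted parameters:", np.round(fit.x, 3))   # close to the "true" (1.2, 1.4, 1.1)
```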
Hydrodynamic Simulations of Protoplanetary Disks with GIZMO
NASA Astrophysics Data System (ADS)
Rice, Malena; Laughlin, Greg
2018-01-01
Over the past several decades, the field of computational fluid dynamics has rapidly advanced as the range of available numerical algorithms and computationally feasible physical problems has expanded. The development of modern numerical solvers has provided a compelling opportunity to reconsider previously obtained results in search for yet undiscovered effects that may be revealed through longer integration times and more precise numerical approaches. In this study, we compare the results of past hydrodynamic disk simulations with those obtained from modern analytical resources. We focus our study on the GIZMO code (Hopkins 2015), which uses meshless methods to solve the homogeneous Euler equations of hydrodynamics while eliminating problems arising as a result of advection between grid cells. By comparing modern simulations with prior results, we hope to provide an improved understanding of the impact of fluid mechanics upon the evolution of protoplanetary disks.
Facial recognition using enhanced pixelized image for simulated visual prosthesis.
Li, Ruonan; Zhhang, Xudong; Zhang, Hui; Hu, Guanshu
2005-01-01
A simulated face recognition experiment using enhanced pixelized images is designed and performed for the artificial visual prosthesis. The results of the simulation reveal new characteristics of visual performance in an enhanced pixelization condition, and then new suggestions on the future design of visual prosthesis are provided.
Scientific Benefits of Space Science Models Archiving at Community Coordinated Modeling Center
NASA Technical Reports Server (NTRS)
Kuznetsova, Maria M.; Berrios, David; Chulaki, Anna; Hesse, Michael; MacNeice, Peter J.; Maddox, Marlo M.; Pulkkinen, Antti; Rastaetter, Lutz; Taktakishvili, Aleksandre
2009-01-01
The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. CCMC provides a web-based Run-on-Request system, by which an interested scientist can request simulations for a broad range of space science problems. To allow the models to be driven by data relevant to particular events, CCMC developed a tool that automatically downloads data from data archives and transforms them into the required formats. CCMC also provides a tailored web-based visualization interface for the model output, as well as the capability to download the simulation output in a portable format. CCMC offers a variety of visualization and output analysis tools to aid scientists in the interpretation of simulation results. In the eight years since the Run-on-Request system became available, the CCMC has archived the results of almost 3000 runs covering significant space weather events and time intervals of interest identified by the community. The simulation results archived at CCMC also include a library of general-purpose runs with modeled conditions that are used for education and research. Archiving results of simulations performed in support of several Modeling Challenges helps to evaluate the progress in space weather modeling over time. We will highlight the scientific benefits of the CCMC space science model archive and discuss plans for further development of advanced methods to interact with simulation results.
Lyke, Stephen D; Voelz, David G; Roggemann, Michael C
2009-11-20
The probability density function (PDF) of aperture-averaged irradiance fluctuations is calculated from wave-optics simulations of a laser after propagating through atmospheric turbulence to investigate the evolution of the distribution as the aperture diameter is increased. The simulation data distribution is compared to theoretical gamma-gamma and lognormal PDF models under a variety of scintillation regimes from weak to strong. Results show that under weak scintillation conditions both the gamma-gamma and lognormal PDF models provide a good fit to the simulation data for all aperture sizes studied. Our results indicate that in moderate scintillation the gamma-gamma PDF provides a better fit to the simulation data than the lognormal PDF for all aperture sizes studied. In the strong scintillation regime, the simulation data distribution is gamma-gamma for aperture sizes much smaller than the coherence radius rho0 and lognormal for aperture sizes on the order of rho0 and larger. Examples of how these results affect the bit-error rate of an on-off keyed free space optical communication link are presented.
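For reference, the two candidate models named above can be written down directly. The sketch below evaluates a unit-mean gamma-gamma PDF and a unit-mean lognormal PDF with a matched scintillation index; the alpha and beta values are illustrative, not the parameters fitted in the paper.

```python
import numpy as np
from scipy.special import gamma as G, kv

def gamma_gamma_pdf(I, alpha, beta):
    """Gamma-gamma PDF for normalized (unit-mean) irradiance."""
    return (2.0 * (alpha * beta) ** ((alpha + beta) / 2) / (G(alpha) * G(beta))
            * I ** ((alpha + beta) / 2 - 1)
            * kv(alpha - beta, 2.0 * np.sqrt(alpha * beta * I)))

def lognormal_pdf(I, sigma2):
    """Lognormal PDF with unit mean (log-mean = -sigma2/2)."""
    return (np.exp(-(np.log(I) + sigma2 / 2) ** 2 / (2 * sigma2))
            / (I * np.sqrt(2 * np.pi * sigma2)))

I = np.linspace(0.01, 8.0, 2000)
alpha, beta = 4.0, 2.0                                    # illustrative turbulence parameters
sigma2 = np.log(1 + 1/alpha + 1/beta + 1/(alpha * beta))  # lognormal matched to the same
                                                          # scintillation index
dI = I[1] - I[0]
print(round(gamma_gamma_pdf(I, alpha, beta).sum() * dI, 3),   # both integrate to ~1
      round(lognormal_pdf(I, sigma2).sum() * dI, 3))
```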
DOT National Transportation Integrated Search
1991-04-01
Results from vehicle computer simulations usually take the form of numeric data or graphs. While these graphs provide the investigator with insight into vehicle behavior, it may be difficult to use these graphs to assess complex vehicle motion. C...
ERIC Educational Resources Information Center
Hitchen, Trevor; Metcalfe, Judith
1987-01-01
Describes a simulation of the results of real experiments which use different strains of Escherichia coli. Provides an inexpensive practical problem-solving exercise to aid the teaching and understanding of the Jacob and Monod model of gene regulation. (Author/CW)
Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis, Phase 2 Results
NASA Technical Reports Server (NTRS)
Murri, Daniel G.
2011-01-01
The NASA Engineering and Safety Center (NESC) was requested to establish the Simulation Framework for Rapid Entry, Descent, and Landing (EDL) Analysis assessment, which involved development of an enhanced simulation architecture using the Program to Optimize Simulated Trajectories II simulation tool. The assessment was requested to enhance the capability of the Agency to provide rapid evaluation of EDL characteristics in systems analysis studies, preliminary design, mission development and execution, and time-critical assessments. Many of the new simulation framework capabilities were developed to support the Agency EDL-Systems Analysis (SA) team that is conducting studies of the technologies and architectures that are required to enable human and higher mass robotic missions to Mars. The findings, observations, and recommendations from the NESC are provided in this report.
DNS of Flows over Periodic Hills using a Discontinuous-Galerkin Spectral-Element Method
NASA Technical Reports Server (NTRS)
Diosady, Laslo T.; Murman, Scott M.
2014-01-01
Direct numerical simulation (DNS) of turbulent compressible flows is performed using a higher-order space-time discontinuous-Galerkin finite-element method. The numerical scheme is validated by performing DNS of the evolution of the Taylor-Green vortex and turbulent flow in a channel. The higher-order method is shown to provide increased accuracy relative to low-order methods at a given number of degrees of freedom. The turbulent flow over a periodic array of hills in a channel is simulated at Reynolds number 10,595 using an 8th-order scheme in space and a 4th-order scheme in time. These results are validated against previous large eddy simulation (LES) results. A preliminary analysis provides insight into how these detailed simulations can be used to improve Reynolds-averaged Navier-Stokes (RANS) modeling.
NASA Astrophysics Data System (ADS)
Morozov, A.; Heindl, T.; Skrobol, C.; Wieser, J.; Krücken, R.; Ulrich, A.
2008-07-01
Electron beams with particle energy of ~10 keV were sent through 300 nm thick ceramic (Si3N4 + SiO2) foils and the resulting electron energy distribution functions were recorded using a retarding grid technique. The results are compared with Monte Carlo simulations performed with two publicly available packages, Geant4 and Casino v2.42. It is demonstrated that Geant4, unlike Casino, provides electron energy distribution functions very similar to the experimental distributions. Both simulation packages provide a quite precise average energy of transmitted electrons: we demonstrate that the maximum uncertainty of the calculated values of the average energy is 6% for Geant4 and 8% for Casino, taking into account all systematic uncertainties and the discrepancies in the experimental and simulated data.
NaCl nucleation from brine in seeded simulations: Sources of uncertainty in rate estimates.
Zimmermann, Nils E R; Vorselaars, Bart; Espinosa, Jorge R; Quigley, David; Smith, William R; Sanz, Eduardo; Vega, Carlos; Peters, Baron
2018-06-14
This work reexamines seeded simulation results for NaCl nucleation from a supersaturated aqueous solution at 298.15 K and 1 bar pressure. We present a linear regression approach for analyzing seeded simulation data that provides both nucleation rates and uncertainty estimates. Our results show that rates obtained from seeded simulations rely critically on a precise driving force for the model system. The driving force vs. solute concentration curve need not exactly reproduce that of the real system, but it should accurately describe the thermodynamic properties of the model system. We also show that rate estimates depend strongly on the nucleus size metric. We show that the rate estimates systematically increase as more stringent local order parameters are used to count members of a cluster and provide tentative suggestions for appropriate clustering criteria.
Fully dynamical simulation of central nuclear collisions.
van der Schee, Wilke; Romatschke, Paul; Pratt, Scott
2013-11-27
We present a fully dynamical simulation of central nuclear collisions around midrapidity at LHC energies. Unlike previous treatments, we simulate all phases of the collision, including the equilibration of the system. For the simulation, we use numerical relativity solutions to anti-de Sitter space/conformal field theory for the preequilibrium stage, viscous hydrodynamics for the plasma equilibrium stage, and kinetic theory for the low-density hadronic stage. Our preequilibrium stage provides initial conditions for hydrodynamics, resulting in sizable radial flow. The resulting light particle spectra reproduce the measurements from the ALICE experiment at all transverse momenta.
Notional Scoring for Technical Review Weighting As Applied to Simulation Credibility Assessment
NASA Technical Reports Server (NTRS)
Hale, Joseph Peter; Hartway, Bobby; Thomas, Danny
2008-01-01
NASA's Modeling and Simulation Standard requires a credibility assessment for critical engineering data produced by models and simulations. Credibility assessment is thus a "qualifying factor" in reporting results from simulation-based analysis. The degree to which assessors should be independent of the simulation developers, users, and decision makers is a recurring question. This paper provides alternative "weighting algorithms" for calculating the value added by independence for the levels of technical review defined for the NASA Modeling and Simulation Standard.
2013-09-01
which utilizes FTA and then loads it into a DES engine to generate simulation results. (Figure 21: This simulation architecture is ...) While Discrete Event Simulation (DES) can provide accurate time estimation and fast simulation speed, models utilizing it often suffer ... C4ISR progress in MDW is developed in this research to demonstrate the feasibility of AEMF-DES and explore its potential. The simulation (MDSIM ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Ronald W.; Collins, Benjamin S.; Godfrey, Andrew T.
2016-12-09
In order to support engineering analysis of Virtual Environment for Reactor Analysis (VERA) model results, the Consortium for Advanced Simulation of Light Water Reactors (CASL) needs a tool that provides visualizations of HDF5 files that adhere to the VERAOUT specification. VERAView provides an interactive graphical interface for the visualization and engineering analyses of output data from VERA. The Python-based software provides instantaneous 2D and 3D images, 1D plots, and alphanumeric data from VERA multi-physics simulations.
Simulation test results for lift/cruise fan research and technology aircraft
NASA Technical Reports Server (NTRS)
Bland, M. P.; Konsewicz, R. K.
1976-01-01
A flight simulation program was conducted on the flight simulator for advanced aircraft (FSAA). The flight simulation was a part of a contracted effort to provide a lift/cruise fan V/STOL aircraft mathematical model for flight simulation. The simulated aircraft is a configuration of the Lift/Cruise Fan V/STOL research technology aircraft (RTA). The aircraft was powered by three gas generators driving three fans. One lift fan was installed in the nose of the aircraft, and two lift/cruise fans at the wing root. The thrust of these fans was modulated to provide pitch and roll control, and vectored to provide yaw, side force control, and longitudinal translation. Two versions of the RTA were defined. One was powered by the GE J97/LF460 propulsion system which was gas-coupled for power transfer between fans for control. The other version was powered by DDA XT701 gas generators driving 62 inch variable pitch fans. The flight control system in both versions of the RTA was the same.
NASA Technical Reports Server (NTRS)
Early, Derrick A.; Haile, William B.; Turczyn, Mark T.; Griffin, Thomas J. (Technical Monitor)
2001-01-01
NASA Goddard Space Flight Center and the European Space Agency (ESA) conducted a disturbance verification test on a flight Solar Array 3 (SA3) for the Hubble Space Telescope using the ESA Large Space Simulator (LSS) in Noordwijk, the Netherlands. The LSS cyclically illuminated the SA3 to simulate orbital temperature changes in a vacuum environment. Data acquisition systems measured signals from force transducers and accelerometers resulting from thermally induced vibrations of the SA3. The LSS with its seismic mass boundary provided an excellent background environment for this test. This paper discusses the analysis performed on the measured transient SA3 responses and provides a summary of the results.
NASA Technical Reports Server (NTRS)
Scaffidi, C. A.; Stocklin, F. J.; Feldman, M. B.
1971-01-01
An L-band telemetry system designed to provide the capability of near-real-time processing of calibration data is described. The system also provides the capability of performing computerized spacecraft simulations, with the aircraft as a data source, and evaluating the network response. The salient characteristics of a telemetry analysis and simulation program (TASP) are discussed, together with the results of TASP testing. The results of the L-band system testing have successfully demonstrated the capability of near-real-time processing of telemetry test data, the control of the ground-received signal to within ±0.5 dB, and the computer generation of test signals.
Ensemble Simulation of the Atmospheric Radionuclides Discharged by the Fukushima Nuclear Accident
NASA Astrophysics Data System (ADS)
Sekiyama, Thomas; Kajino, Mizuo; Kunii, Masaru
2013-04-01
Enormous amounts of radionuclides were discharged into the atmosphere by a nuclear accident at the Fukushima Daiichi nuclear power plant (FDNPP) after the earthquake and tsunami on 11 March 2011. The radionuclides were dispersed from the power plant and deposited mainly over eastern Japan and the North Pacific Ocean. Many numerical simulations of the radionuclide dispersion and deposition have been attempted since the nuclear accident. However, none of them were able to perfectly simulate the distribution of dose rates observed after the accident over eastern Japan. This was partly due to errors in the wind vectors and precipitation fields used in the numerical simulations; unfortunately, these deterministic simulations could not deal with the probability distribution of the simulation results and errors. Therefore, an ensemble simulation of the atmospheric radionuclides was performed using the ensemble Kalman filter (EnKF) data assimilation system coupled with the Japan Meteorological Agency (JMA) non-hydrostatic mesoscale model (NHM); this mesoscale model has been used operationally for daily weather forecasts by JMA. Meteorological observations were provided to the EnKF data assimilation system from the JMA operational-weather-forecast dataset. Through this ensemble data assimilation, twenty members of the meteorological analysis over eastern Japan from 11 to 31 March 2011 were successfully obtained. Using these meteorological ensemble analysis members, the radionuclide behavior in the atmosphere, such as advection, convection, diffusion, dry deposition, and wet deposition, was simulated. This ensemble simulation provided multiple results for the radionuclide dispersion and distribution. Because a large ensemble deviation indicates low accuracy of the numerical simulation, probabilistic information is obtainable from the ensemble simulation results. For example, the uncertainty of precipitation triggered the uncertainty of wet deposition, and the uncertainty of wet deposition triggered the uncertainty of atmospheric radionuclide amounts. The remaining radionuclides were then transported downwind; consequently, the uncertainty signal of the radionuclide amounts propagated downwind. The signal propagation was seen in the ensemble simulation by tracking the areas of large deviation in radionuclide concentration and deposition. These statistics can provide information useful for the probabilistic prediction of radionuclides.
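As a hedged illustration of how ensemble deviation can be turned into probabilistic information, the sketch below computes the member-to-member spread of a synthetic deposition field with NumPy. The array shapes, the lognormal placeholder fields, and the use of the standard deviation as an uncertainty proxy are assumptions, not the authors' exact procedure.

    # Ensemble mean and spread as a simple uncertainty proxy (synthetic data).
    import numpy as np

    n_members, ny, nx = 20, 120, 100
    rng = np.random.default_rng(0)
    deposition = rng.lognormal(mean=0.0, sigma=1.0, size=(n_members, ny, nx))  # placeholder fields

    ens_mean = deposition.mean(axis=0)
    ens_spread = deposition.std(axis=0)              # large spread = low confidence
    relative_uncertainty = ens_spread / (ens_mean + 1e-12)
    print("max relative uncertainty:", relative_uncertainty.max())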
Temporal Evolution of the Plasma Sheath Surrounding Solar Cells in Low Earth Orbit
NASA Technical Reports Server (NTRS)
Willis, Emily M.; Pour, Maria Z. A.
2017-01-01
Initial results from the PIC simulation and the LEM simulation have been presented. The PIC simulation results show that more detailed study is required to refine the ISS solar array current collection model and to understand the development of the current collection in time. The initial results from the LEM demonstrate that it is possible that the transients are caused by solar array interaction with the environment, but there are presently too many assumptions in the model to be certain. Continued work on the PIC simulation will provide valuable information on the development of the barrier potential, which will allow refinement of the LEM simulation and a better understanding of the causes and effects of the transients.
Rover Attitude and Pointing System Simulation Testbed
NASA Technical Reports Server (NTRS)
Vanelli, Charles A.; Grinblat, Jonathan F.; Sirlin, Samuel W.; Pfister, Sam
2009-01-01
The MER (Mars Exploration Rover) Attitude and Pointing System Simulation Testbed Environment (RAPSSTER) provides a simulation platform used for the development and test of GNC (guidance, navigation, and control) flight algorithm designs for the Mars rovers. It was specifically tailored to the MERs, but has since been used in the development of rover algorithms for the Mars Science Laboratory (MSL) as well. The software provides an integrated simulation and software testbed environment for the development of Mars rover attitude and pointing flight software. It provides an environment that is able to run the MER GNC flight software directly (as opposed to running an algorithmic model of the MER GNC flight code). This improves simulation fidelity and confidence in the results. Furthermore, the simulation environment allows the user to single step through its execution, pausing and restarting at will. The system also provides for the introduction of simulated faults specific to Mars rover environments that cannot be replicated in other testbed platforms, to stress test the GNC flight algorithms under examination. The software provides facilities to do these stress tests in ways that cannot be done in the real-time flight system testbeds, such as time-jumping (both forwards and backwards) and introduction of simulated actuator faults that would be difficult, expensive, and/or destructive to implement in the real-time testbeds. Actual flight-quality codes can be incorporated back into the development-test suite of GNC developers, closing the loop between the GNC developers and the flight software developers. The software provides fully automated scripting, allowing multiple tests to be run with varying parameters, without human supervision.
Assessment of simulation fidelity using measurements of piloting technique in flight. II
NASA Technical Reports Server (NTRS)
Ferguson, S. W.; Clement, W. F.; Hoh, R. H.; Cleveland, W. B.
1985-01-01
Two components of the Vertical Motion Simulator (presently being used to assess the fidelity of UH-60A simulation) are evaluated: (1) the dash/quickstop nap-of-the-earth (NOE) piloting task, and (2) the bob-up task. Data from these two flight test experiments are presented which provide information on the effect of reduced visual field of view, variation in scene content and texture, and the effect of pure time delay in the closed-loop pilot response. In comparison with task performance results obtained in flight tests, the results from the simulation indicate that the pilot's NOE task performance in the simulator is significantly degraded.
Brownfield Action: An Integrated Environmental Science Simulation Experience for Undergraduates.
ERIC Educational Resources Information Center
Kelsey, Ryan
This paper presents the results of three years of development and evaluation of a CD-ROM/Web hybrid simulation known as Brownfield Action for an introductory environmental science course at an independent college for women in the northeastern United States. Brownfield Action is a simulation that provides a learning environment for developing the…
Development of NASA's Models and Simulations Standard
NASA Technical Reports Server (NTRS)
Bertch, William J.; Zang, Thomas A.; Steele, Martin J.
2008-01-01
From the Space Shuttle Columbia Accident Investigation, several NASA-wide actions were initiated. One of these actions was to develop a standard for the development, documentation, and operation of models and simulations. Over the course of two-and-a-half years, a team of NASA engineers, representing nine of the ten NASA Centers, developed a Models and Simulations Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with eight key factors. This paper provides an historical account of the challenges faced by and the processes used in this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.
ERIC Educational Resources Information Center
Stefani, R. T.
This document describes the design of an automatic guidance and control system for a passenger car. A simulation of that system is presented. Analog outputs are provided which compare human operator control to automatic control. One human control possibility is to provide the operator with sufficient feedback information that resulting performance…
Regan, R.S.; Schaffranek, R.W.; Baltzer, R.A.
1996-01-01
A system of functional utilities and computer routines, collectively identified as the Time-Dependent Data System (TDDS), has been developed and documented by the U.S. Geological Survey. The TDDS is designed for processing time sequences of discrete, fixed-interval, time-varying geophysical data--in particular, hydrologic data. Such data include various dependent variables and related parameters typically needed as input for execution of one-, two-, and three-dimensional hydrodynamic/transport and associated water-quality simulation models. Such data can also include time sequences of results generated by numerical simulation models. Specifically, the TDDS provides the functional capabilities to process, store, retrieve, and compile data in a Time-Dependent Data Base (TDDB) in response to interactive user commands or pre-programmed directives. Thus, the TDDS, in conjunction with a companion TDDB, provides a ready means for processing, preparation, and assembly of time sequences of data for input to models; collection, categorization, and storage of simulation results from models; and intercomparison of field data and simulation results. The TDDS can be used to edit and verify prototype, time-dependent data to affirm that selected sequences of data are accurate, contiguous, and appropriate for numerical simulation modeling. It can be used to prepare time-varying data in a variety of formats, such as tabular lists, sequential files, arrays, and graphical displays, as well as line-printer plots of single or multiparameter data sets. The TDDB is organized and maintained as a direct-access data base by the TDDS, thus providing simple, yet efficient, data management and access. A single, easily used program interface that provides all access to and from a particular TDDB is available for use directly within models, other user-provided programs, and other data systems. This interface, together with each major functional utility of the TDDS, is described and documented in this report.
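As a rough, hedged analogue of the fixed-interval processing the TDDS provides, the sketch below stores and retrieves a 15-minute stage record with pandas. The variable name, interval, and dates are assumptions, and pandas is only a convenient stand-in; the abstract does not describe the actual TDDB format.

    # Fixed-interval time-series storage and retrieval, illustrated with pandas.
    import numpy as np
    import pandas as pd

    index = pd.date_range("1996-01-01", periods=96, freq="15min")    # fixed 15-minute interval
    stage = pd.Series(np.sin(np.linspace(0, 4 * np.pi, 96)), index=index, name="stage_m")

    # "Retrieve" a contiguous sub-sequence, e.g. for assembly as model input
    window = stage["1996-01-01 06:00":"1996-01-01 12:00"]
    print(window.describe())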
Simulation of Lunar Surface Communications Network Exploration Scenarios
NASA Technical Reports Server (NTRS)
Linsky, Thomas W.; Bhasin, Kul B.; White, Alex; Palangala, Srihari
2006-01-01
Simulations and modeling of surface-based communications networks provide a rapid and cost-effective means of requirement analysis, protocol assessments, and tradeoff studies. Robust testing is especially important for exploration systems, where the cost of deployment is high and systems cannot be easily replaced or repaired. However, simulation of the envisioned exploration networks cannot be achieved using commercial off-the-shelf network simulation software. Models for the nonstandard, non-COTS protocols used aboard space systems are not readily available. This paper will address the simulation of realistic scenarios representative of the activities which will take place on the surface of the Moon, including selection of candidate network architectures and the development of an integrated simulation tool using OPNET Modeler capable of faithfully modeling those communications scenarios in the variable-delay, dynamic surface environments. Scenarios for exploration missions, OPNET development, limitations, and simulation results will be provided and discussed.
Simulation of the XV-15 tilt rotor research aircraft
NASA Technical Reports Server (NTRS)
Churchill, G. B.; Dugan, D. C.
1982-01-01
The effective use of simulation from issuance of the request for proposal through conduct of a flight test program for the XV-15 Tilt Rotor Research Aircraft is discussed. From program inception, simulation complemented all phases of XV-15 development. The initial simulation evaluations during the source evaluation board proceedings contributed significantly to performance and stability and control evaluations. Eight subsequent simulation periods provided major contributions in the areas of control concepts; cockpit configuration; handling qualities; pilot workload; failure effects and recovery procedures; and flight boundary problems and recovery procedures. The fidelity of the simulation also made it a valuable pilot training aid, as well as a suitable tool for military and civil mission evaluations. Simulation also provided valuable design data for refinement of automatic flight control systems. Throughout the program, fidelity was a prime issue and resulted in unique data and methods for fidelity evaluation which are presented and discussed.
NASA Astrophysics Data System (ADS)
Mateo, Cherry May R.; Yamazaki, Dai; Kim, Hyungjun; Champathong, Adisorn; Vaze, Jai; Oki, Taikan
2017-10-01
Global-scale river models (GRMs) are core tools for providing consistent estimates of global flood hazard, especially in data-scarce regions. Due to former limitations in computational power and input datasets, most GRMs have been developed to use simplified representations of flow physics and run at coarse spatial resolutions. With increasing computational power and improved datasets, the application of GRMs to finer resolutions is becoming a reality. To support development in this direction, the suitability of GRMs for application to finer resolutions needs to be assessed. This study investigates the impacts of spatial resolution and flow connectivity representation on the predictive capability of a GRM, CaMa-Flood, in simulating the 2011 extreme flood in Thailand. Analyses show that when single downstream connectivity (SDC) is assumed, simulation results deteriorate with finer spatial resolution; Nash-Sutcliffe efficiency coefficients decreased by more than 50 % between simulation results at 10 km resolution and 1 km resolution. When multiple downstream connectivity (MDC) is represented, simulation results slightly improve with finer spatial resolution. The SDC simulations result in excessive backflows on very flat floodplains due to the restrictive flow directions at finer resolutions. MDC channels attenuated these effects by maintaining flow connectivity and flow capacity between floodplains in varying spatial resolutions. While a regional-scale flood was chosen as a test case, these findings should be universal and may have significant impacts on large- to global-scale simulations, especially in regions where mega deltas exist. These results demonstrate that a GRM can be used for higher resolution simulations of large-scale floods, provided that MDC in rivers and floodplains is adequately represented in the model structure.
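A minimal sketch of the Nash-Sutcliffe efficiency (NSE) used above to compare simulated and observed discharge is given below; the discharge values are placeholders, not CaMa-Flood or gauge output.

    # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the observed mean.
    import numpy as np

    def nse(sim, obs):
        obs = np.asarray(obs, dtype=float)
        sim = np.asarray(sim, dtype=float)
        return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

    obs = [120.0, 340.0, 560.0, 480.0, 300.0]        # placeholder observed discharge (m^3/s)
    sim_10km = [130.0, 330.0, 540.0, 470.0, 310.0]   # placeholder simulated discharge (m^3/s)
    print("NSE:", nse(sim_10km, obs))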
Prediction of helicopter simulator sickness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Horn, R.D.; Birdwell, J.D.; Allgood, G.O.
1990-01-01
Machine learning methods from artificial intelligence are used to identify information in sampled accelerometer signals and associative behavioral patterns which correlate pilot simulator sickness with helicopter simulator dynamics. These simulators are used to train pilots in fundamental procedures, tactics, and response to emergency conditions. Simulator sickness induced by these systems represents a risk factor to both the pilot and manufacturer. Simulator sickness symptoms are closely aligned with those of motion sickness. Previous studies have been performed by behavioral psychologists using information gathered with surveys and motor skills performance measures; however, the results are constrained by the limited information which is accessible in this manner. In this work, accelerometers were installed in the simulator cab, enabling a complete record of flight dynamics and the pilot's control response as a function of time. Given the results of performance measures administered to detect simulator sickness symptoms, the problem was then to find functions of the recorded data which could be used to help predict the simulator sickness level and susceptibility. Methods based upon inductive inference were used, which yield decision trees whose leaves indicate the degree of simulator-induced sickness. The long-term goal is to develop a "gauge" which can provide an on-line prediction of simulator sickness level, given a pilot's associative behavioral patterns (learned expectations). This will allow informed decisions to be made on when to terminate a hop and provide an effective basis for determining training and flight restrictions placed upon the pilot after simulator use. 6 refs., 6 figs.
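A hedged sketch of decision-tree induction in the spirit of the approach above is shown below, using scikit-learn on synthetic data. The feature names, the labeling rule, and the use of scikit-learn are assumptions for illustration; the paper's inductive-inference method and accelerometer-derived features are not specified here.

    # Fit and print a small decision tree on invented accelerometer-style features.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(1)
    X = rng.normal(size=(40, 3))                        # e.g. RMS accel, dominant freq, control activity
    y = (X[:, 0] + 0.5 * X[:, 1] > 0.3).astype(int)     # 1 = sickness symptoms reported (synthetic rule)

    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
    print(export_text(tree, feature_names=["rms_accel", "dominant_freq", "control_activity"]))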
Simulating Snow in Canadian Boreal Environments with CLASS for ESM-SnowMIP
NASA Astrophysics Data System (ADS)
Wang, L.; Bartlett, P. A.; Derksen, C.; Ireson, A. M.; Essery, R.
2017-12-01
The ability of land surface schemes to provide realistic simulations of snow cover is necessary for accurate representation of energy and water balances in climate models. Historically, this has been particularly challenging in boreal forests, where poor treatment of both snow masking by forests and vegetation-snow interaction has resulted in biases in simulated albedo and snowpack properties, with subsequent effects on both regional temperatures and the snow albedo feedback in coupled simulations. The SnowMIP (Snow Model Intercomparison Project) series of experiments or `MIPs' was initiated in order to provide assessments of the performance of various snow- and land-surface-models at selected locations, in order to understand the primary factors affecting model performance. Here we present preliminary results of simulations conducted for the third such MIP, ESM-SnowMIP (Earth System Model - Snow Model Intercomparison Project), using the Canadian Land Surface Scheme (CLASS) at boreal forest sites in central Saskatchewan. We assess the ability of our latest model version (CLASS 3.6.2) to simulate observed snowpack properties (snow water equivalent, density and depth) and above-canopy albedo over 13 winters. We also examine the sensitivity of these simulations to climate forcing at local and regional scales.
High Fidelity Thermal Simulators for Non-Nuclear Testing: Analysis and Initial Results
NASA Technical Reports Server (NTRS)
Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David
2007-01-01
Non-nuclear testing can be a valuable tool in the development of a space nuclear power system, providing system characterization data and allowing one to work through various fabrication, assembly, and integration issues without the cost and time associated with a full ground nuclear test. In a non-nuclear test bed, electric heaters are used to simulate the heat from nuclear fuel. Testing with non-optimized heater elements allows one to assess thermal, heat transfer, and stress related attributes of a given system, but fails to demonstrate the dynamic response that would be present in an integrated, fueled reactor system. High fidelity thermal simulators that match both the static and the dynamic fuel pin performance that would be observed in an operating, fueled nuclear reactor can vastly increase the value of non-nuclear test results. With optimized simulators, the integration of thermal hydraulic hardware tests with simulated neutronic response provides a bridge between electrically heated testing and fueled nuclear testing, providing a better assessment of system integration issues, characterization of integrated system response times and response characteristics, and assessment of potential design improvements at a relatively small fiscal investment. Initial conceptual thermal simulator designs are determined by simple one-dimensional analysis at a single axial location and at steady state conditions; feasible concepts are then input into a detailed three-dimensional model for comparison to expected fuel pin performance. Static and dynamic fuel pin performance for a proposed reactor design is determined using SINDA/FLUINT thermal analysis software, and comparison is made between the expected nuclear performance and the performance of conceptual thermal simulator designs. Through a series of iterative analyses, a conceptual high fidelity design can be developed. Test results presented in this paper correspond to a "first cut" simulator design for a potential liquid metal (NaK) cooled reactor design that could be applied for Lunar surface power. Proposed refinements to this simulator design are also presented.
NASA Technical Reports Server (NTRS)
Jani, Yashvant
1992-01-01
As part of the RICIS activity, the reinforcement learning techniques developed at Ames Research Center are being applied to proximity and docking operations using the Shuttle and Solar Max satellite simulation. This activity is carried out in the software technology laboratory utilizing the Orbital Operations Simulator (OOS). This report is deliverable D2, Attitude Control Results, and provides the status of the project after four months of activities and outlines the future plans. In Section 2 we describe the Fuzzy-Learner system for the attitude control functions. In Section 3, we provide the description of test cases and results in chronological order. In Section 4, we summarize our results and conclusions. Our future plans and recommendations are provided in Section 5.
Validation of a DICE Simulation Against a Discrete Event Simulation Implemented Entirely in Code.
Möller, Jörgen; Davis, Sarah; Stevenson, Matt; Caro, J Jaime
2017-10-01
Modeling is an essential tool for health technology assessment, and various techniques for conceptualizing and implementing such models have been described. Recently, a new method, the discretely integrated condition event (DICE) simulation, has been proposed that enables frequently employed approaches to be specified using a common, simple structure that can be entirely contained and executed within widely available spreadsheet software. To assess whether a DICE simulation provides equivalent results to an existing discrete event simulation, a comparison was undertaken. A model of osteoporosis and its management programmed entirely in Visual Basic for Applications and made public by the National Institute for Health and Care Excellence (NICE) Decision Support Unit was downloaded and used to guide construction of its DICE version in Microsoft Excel®. The DICE model was then run using the same inputs and settings, and the results were compared. The DICE version produced results that are nearly identical to the original ones, with differences that would not affect the decision direction of the incremental cost-effectiveness ratios (<1% discrepancy), despite the stochastic nature of the models. The main limitation of the simple DICE version is its slow execution speed. DICE simulation did not alter the results and, thus, should provide a valid way to design and implement decision-analytic models without requiring specialized software or custom programming. Additional efforts need to be made to speed up execution.
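A hedged sketch of the incremental cost-effectiveness ratio (ICER) comparison implied above is shown below, checking that two model implementations agree to within 1%. The cost and QALY numbers are placeholders, not outputs of the NICE model or either simulation.

    # Compare ICERs from two model implementations (placeholder numbers).
    def icer(cost_new, cost_old, qaly_new, qaly_old):
        return (cost_new - cost_old) / (qaly_new - qaly_old)

    des_icer = icer(14500.0, 9800.0, 7.42, 7.11)    # reference discrete event simulation
    dice_icer = icer(14480.0, 9805.0, 7.42, 7.11)   # DICE re-implementation
    discrepancy = abs(dice_icer - des_icer) / des_icer
    print(f"DES ICER={des_icer:.0f}, DICE ICER={dice_icer:.0f}, discrepancy={discrepancy:.2%}")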
An inter-institutional collaboration: transforming education through interprofessional simulations.
King, Sharla; Drummond, Jane; Hughes, Ellen; Bookhalter, Sharon; Huffman, Dan; Ansell, Dawn
2013-09-01
An inter-institutional partnership of four post-secondary institutions and a health provider formed a learning community with the goal of developing, implementing and evaluating interprofessional learning experiences in simulation-based environments. The organization, education and educational research activities of the learning community align with the institutional and instructional reforms recommended by the Lancet Commission on Health Professional Education for the 21st century. This article provides an overview of the inter-institutional collaboration, including the interprofessional simulation learning experiences, instructor development activities and preliminary results from the evaluation.
Application of CFE/POST2 for Simulation of Launch Vehicle Stage Separation
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Tartabini, Paul V.; Toniolo, Matthew D.; Roithmayr, Carlos M.; Karlgaard, Christopher D.; Samareh, Jamshid A.
2009-01-01
The constraint force equation (CFE) methodology provides a framework for modeling constraint forces and moments acting at joints that connect multiple vehicles. With implementation in Program to Optimize Simulated Trajectories II (POST 2), the CFE provides a capability to simulate end-to-end trajectories of launch vehicles, including stage separation. In this paper, the CFE/POST2 methodology is applied to the Shuttle-SRB separation problem as a test and validation case. The CFE/POST2 results are compared with STS-1 flight test data.
Hyper-X Stage Separation: Simulation Development and Results
NASA Technical Reports Server (NTRS)
Reubush, David E.; Martin, John G.; Robinson, Jeffrey S.; Bose, David M.; Strovers, Brian K.
2001-01-01
This paper provides an overview of stage separation simulation development and results for NASA's Hyper-X program, a focused hypersonic technology effort designed to move hypersonic, airbreathing vehicle technology from the laboratory environment to the flight environment. This paper presents an account of the development of the current 14-degree-of-freedom stage separation simulation tool (SepSim) and results from use of the tool in a Monte Carlo analysis to evaluate the risk of failure for the separation event. Results from use of the tool show that there is only a very small risk of failure in the separation event.
NASA Astrophysics Data System (ADS)
van Poppel, Bret; Owkes, Mark; Nelson, Thomas; Lee, Zachary; Sowell, Tyler; Benson, Michael; Vasquez Guzman, Pablo; Fahrig, Rebecca; Eaton, John; Kurman, Matthew; Kweon, Chol-Bum; Bravo, Luis
2014-11-01
In this work, we present high-fidelity Computational Fluid Dynamics (CFD) results of liquid fuel injection from a pressure-swirl atomizer and compare the simulations to experimental results obtained using both shadowgraphy and phase-averaged X-ray computed tomography (CT) scans. The CFD and experimental results focus on the dense near-nozzle region to identify the dominant mechanisms of breakup during primary atomization. Simulations are performed using the NGA code of Desjardins et al (JCP 227 (2008)) and employ the volume of fluid (VOF) method proposed by Owkes and Desjardins (JCP 270 (2013)), a second order accurate, un-split, conservative, three-dimensional VOF scheme providing second order density fluxes and capable of robust and accurate high density ratio simulations. Qualitative features and quantitative statistics are assessed and compared for the simulation and experimental results, including the onset of atomization, spray cone angle, and drop size and distribution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robertson, Sean; Dewan, Leslie; Massie, Mark
This report presents results from a collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear (GAIN) Nuclear Energy Voucher program. The TAP concept is a molten salt reactor using configurable zirconium hydride moderator rod assemblies to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. The implementation of continuous-energy Monte Carlo transport and depletion tools in ChemTriton provides for full-core three-dimensional modeling and simulation. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this concept. Additional analyses of mass feed rates and enrichments, isotopic removals, tritium generation, core power distribution, core vessel helium generation, moderator rod heat deposition, and reactivity coefficients provide additional information to make informed design decisions. This work demonstrates capabilities of ORNL modeling and simulation tools for neutronic and fuel cycle analysis of molten salt reactor concepts.
NASA Technical Reports Server (NTRS)
Baurle, R. A.
2015-01-01
Steady-state and scale-resolving simulations have been performed for flow in and around a model scramjet combustor flameholder. The cases simulated corresponded to those used to examine this flowfield experimentally using particle image velocimetry. A variety of turbulence models were used for the steady-state Reynolds-averaged simulations which included both linear and non-linear eddy viscosity models. The scale-resolving simulations used a hybrid Reynolds-averaged / large eddy simulation strategy that is designed to be a large eddy simulation everywhere except in the inner portion (log layer and below) of the boundary layer. Hence, this formulation can be regarded as a wall-modeled large eddy simulation. This effort was undertaken to formally assess the performance of the hybrid Reynolds-averaged / large eddy simulation modeling approach in a flowfield of interest to the scramjet research community. The numerical errors were quantified for both the steady-state and scale-resolving simulations prior to making any claims of predictive accuracy relative to the measurements. The steady-state Reynolds-averaged results showed a high degree of variability when comparing the predictions obtained from each turbulence model, with the non-linear eddy viscosity model (an explicit algebraic stress model) providing the most accurate prediction of the measured values. The hybrid Reynolds-averaged/large eddy simulation results were carefully scrutinized to ensure that even the coarsest grid had an acceptable level of resolution for large eddy simulation, and that the time-averaged statistics were acceptably accurate. The autocorrelation and its Fourier transform were the primary tools used for this assessment. The statistics extracted from the hybrid simulation strategy proved to be more accurate than the Reynolds-averaged results obtained using the linear eddy viscosity models. However, there was no predictive improvement noted over the results obtained from the explicit Reynolds stress model. Fortunately, the numerical error assessment at most of the axial stations used to compare with measurements clearly indicated that the scale-resolving simulations were improving (i.e. approaching the measured values) as the grid was refined. Hence, unlike a Reynolds-averaged simulation, the hybrid approach provides a mechanism to the end-user for reducing model-form errors.
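A hedged sketch of the two diagnostics named above, the autocorrelation of a resolved signal and its Fourier transform, is given below. The synthetic velocity signal stands in for probe data from the hybrid RANS/LES run; the sampling interval and frequencies are assumptions.

    # Normalized autocorrelation of a velocity signal and its spectrum (synthetic data).
    import numpy as np

    dt = 1e-5
    t = np.arange(0, 0.05, dt)
    u = np.sin(2 * np.pi * 2000 * t) + 0.3 * np.random.default_rng(2).normal(size=t.size)

    u_fluct = u - u.mean()
    acf = np.correlate(u_fluct, u_fluct, mode="full")[u_fluct.size - 1:]
    acf /= acf[0]                                    # normalized autocorrelation (lag >= 0)
    spectrum = np.abs(np.fft.rfft(acf))              # its Fourier transform
    freqs = np.fft.rfftfreq(acf.size, d=dt)
    print("peak frequency ~", freqs[np.argmax(spectrum[1:]) + 1], "Hz")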
Patel, Archita D.; Meurer, David A.; Shuster, Jonathan J.
2016-01-01
Introduction. Limited evidence is available on simulation training of prehospital care providers, specifically the use of tourniquets and needle decompression. This study focused on whether the confidence level of prehospital personnel performing these skills improved through simulation training. Methods. Prehospital personnel from Alachua County Fire Rescue were enrolled in the study over a 2- to 3-week period based on their availability. Two scenarios were presented to them: a motorcycle crash resulting in a leg amputation requiring a tourniquet and an intoxicated patient with a stab wound, who experienced tension pneumothorax requiring needle decompression. Crews were asked to rate their confidence levels before and after exposure to the scenarios. Timing of the simulation interventions was compared with actual scene times to determine applicability of simulation in measuring the efficiency of prehospital personnel. Results. Results were collected from 129 participants. Pre- and postexposure scores increased by a mean of 1.15 (SD 1.32; 95% CI, 0.88–1.42; P < 0.001). Comparison of actual scene times with simulated scene times yielded a 1.39-fold difference (95% CI, 1.25–1.55) for Scenario 1 and 1.59 times longer for Scenario 2 (95% CI, 1.43–1.77). Conclusion. Simulation training improved prehospital care providers' confidence level in performing two life-saving procedures. PMID:27563467
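A hedged sketch of the paired pre/post confidence-score analysis reported above is shown below on toy data; the scores and sample size are invented and the reported study statistics are not reproduced here.

    # Mean pre/post difference with a 95% confidence interval (toy data).
    import numpy as np
    from scipy import stats

    pre = np.array([3, 2, 4, 3, 2, 3, 4, 2, 3, 3], dtype=float)
    post = np.array([4, 4, 5, 4, 3, 4, 5, 3, 4, 5], dtype=float)
    diff = post - pre

    mean_diff = diff.mean()
    sem = stats.sem(diff)
    ci_low, ci_high = stats.t.interval(0.95, df=diff.size - 1, loc=mean_diff, scale=sem)
    print(f"mean increase {mean_diff:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")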
A new equilibrium torus solution and GRMHD initial conditions
NASA Astrophysics Data System (ADS)
Penna, Robert F.; Kulkarni, Akshay; Narayan, Ramesh
2013-11-01
Context. General relativistic magnetohydrodynamic (GRMHD) simulations are providing influential models for black hole spin measurements, gamma ray bursts, and supermassive black hole feedback. Many of these simulations use the same initial condition: a rotating torus of fluid in hydrostatic equilibrium. A persistent concern is that simulation results sometimes depend on arbitrary features of the initial torus. For example, the Bernoulli parameter (which is related to outflows), appears to be controlled by the Bernoulli parameter of the initial torus. Aims: In this paper, we give a new equilibrium torus solution and describe two applications for the future. First, it can be used as a more physical initial condition for GRMHD simulations than earlier torus solutions. Second, it can be used in conjunction with earlier torus solutions to isolate the simulation results that depend on initial conditions. Methods: We assume axisymmetry, an ideal gas equation of state, constant entropy, and ignore self-gravity. We fix an angular momentum distribution and solve the relativistic Euler equations in the Kerr metric. Results: The Bernoulli parameter, rotation rate, and geometrical thickness of the torus can be adjusted independently. Our torus tends to be more bound and have a larger radial extent than earlier torus solutions. Conclusions: While this paper was in preparation, several GRMHD simulations appeared based on our equilibrium torus. We believe it will continue to provide a more realistic starting point for future simulations.
Computer Simulation Shows the Effect of Communication on Day of Surgery Patient Flow.
Taaffe, Kevin; Fredendall, Lawrence; Huynh, Nathan; Franklin, Jennifer
2015-07-01
To improve patient flow in a surgical environment, practitioners and academicians often use process mapping and simulation as tools to evaluate and recommend changes. We used simulations to help staff visualize the effect of communication and coordination delays that occur on the day of surgery. Perioperative services staff participated in tabletop exercises in which they chose the delays that were most important to eliminate. Using a day-of-surgery computer simulation model, the elimination of delays was tested and the results were shared with the group. This exercise, repeated for multiple groups of staff, provided an understanding of not only the dynamic events taking place, but also how small communication delays can contribute to a significant loss in efficiency and the ability to provide timely care. Survey results confirmed these understandings.
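A hedged, minimal discrete-event sketch of the kind of delay effect visualized above is shown below, using SimPy as a stand-in; the case structure and durations are invented and do not represent the authors' day-of-surgery model.

    # A communication delay postpones every later step of a surgical case.
    import simpy

    def surgical_case(env, comm_delay_min):
        yield env.timeout(30)                 # patient preparation
        yield env.timeout(comm_delay_min)     # waiting on a notification (the studied delay)
        yield env.timeout(90)                 # procedure
        yield env.timeout(20)                 # room turnover
        print(f"comm delay {comm_delay_min:>2} min -> case finished at t={env.now} min")

    for delay in (0, 5, 15):
        env = simpy.Environment()
        env.process(surgical_case(env, delay))
        env.run()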
Analysis and Comparison on the Flood Simulation in Typical Hilly & Semi-mountainous Region
NASA Astrophysics Data System (ADS)
Luan, Qinghua; Wang, Dong; Zhang, Xiang; Liu, Jiahong; Fu, Xiaoran; Zhang, Kun; Ma, Jun
2017-12-01
Waterlogging and flooding are both serious problems in hilly and semi-mountainous cities of China, but related research is rare. Lincheng Economic Development Zone (EDZ) in Hebei Province was selected as a typical case, and the storm water management model (SWMM) was applied for flood simulation in this study. The regional model was constructed by calibrating and verifying the runoff coefficients of different flood processes. Designed runoff processes for five-year, ten-year, and twenty-year return periods were simulated and compared for the baseline scenario and the low impact development (LID) scenario. The results show that LID measures reduce the flood peak in the study area, but the effect is not significant, and their effectiveness in delaying the peak time is poor. These simulation results provide decision support for the rational construction of LID in the study area and provide references for regional rain flood management.
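A hedged sketch of the peak-reduction and peak-lag comparison between the baseline and LID hydrographs discussed above is shown below; the hydrograph values are placeholders rather than SWMM output.

    # Peak reduction (%) and peak lag (min) between two hydrographs (placeholder data).
    import numpy as np

    t_min = np.arange(0, 180, 10)
    q_base = np.array([0, 2, 6, 14, 25, 32, 28, 20, 14, 9, 6, 4, 3, 2, 1, 1, 0, 0], float)
    q_lid = np.array([0, 1, 4, 10, 19, 27, 26, 21, 16, 11, 8, 5, 4, 3, 2, 1, 1, 0], float)

    peak_reduction = (q_base.max() - q_lid.max()) / q_base.max() * 100.0
    peak_lag = t_min[np.argmax(q_lid)] - t_min[np.argmax(q_base)]
    print(f"peak reduction {peak_reduction:.1f} %, peak lag {peak_lag} min")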
Simulation verification techniques study. Subsystem simulation validation techniques
NASA Technical Reports Server (NTRS)
Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.
1974-01-01
Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters are presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystems simulation modules is included. A list of references, conclusions and recommendations are also given.
Modeling, simulation, and control of an extraterrestrial oxygen production plant
NASA Technical Reports Server (NTRS)
Schooley, L.; Cellier, F.; Zeigler, B.; Doser, A.; Farrenkopf, G.
1991-01-01
The immediate objective is the development of a new methodology for simulation of process plants used to produce oxygen and/or other useful materials from local planetary resources. Computer communication, artificial intelligence, smart sensors, and distributed control algorithms are being developed and implemented so that the simulation or an actual plant can be controlled from a remote location. The ultimate result of this research will provide the capability for teleoperation of such process plants which may be located on Mars, Luna, an asteroid, or other objects in space. A very useful near-term result will be the creation of an interactive design tool, which can be used to create and optimize the process/plant design and the control strategy. This will also provide a vivid, graphic demonstration mechanism to convey the results of other researchers to the sponsor.
A novel, highly efficient cavity backshort design for far-infrared TES detectors
NASA Astrophysics Data System (ADS)
Bracken, C.; de Lange, G.; Audley, M. D.; Trappe, N.; Murphy, J. A.; Gradziel, M.; Vreeling, W.-J.; Watson, D.
2018-03-01
In this paper we present a new cavity backshort design for TES (transition edge sensor) detectors which will provide increased coupling of the incoming astronomical signal to the detectors. The increased coupling results from the improved geometry of the cavities, where the geometry is a consequence of the proposed chemical etching manufacturing technique. Using a number of modelling techniques, predicted results of the performance of the cavities for frequencies of 4.3-10 THz are presented and compared to more standard cavity designs. Excellent optical efficiency is demonstrated, with improved response flatness across the band. In order to verify the simulated results, a scaled model cavity was built for testing at the lower W-band frequencies (75-100 GHz) with a VNA system. Further testing of the scale model at THz frequencies was carried out using a globar and bolometer via an FTS measurement set-up. The experimental results are presented and compared to the simulations. Although the agreement between simulation and measurement is relatively poor at some frequencies, the discrepancies are explained by higher-mode excitation in the measured cavity which is not accounted for in the single-mode simulations. To verify this assumption, a better behaved cylindrical cavity was simulated and measured, and excellent agreement is demonstrated in those results. It can be concluded that both the simulations and the supporting measurements give confidence that this novel cavity design will indeed provide much-improved optical coupling for TES detectors in the far-infrared/THz band.
Accelerating 3D Hall MHD Magnetosphere Simulations with Graphics Processing Units
NASA Astrophysics Data System (ADS)
Bard, C.; Dorelli, J.
2017-12-01
The resolution required to simulate planetary magnetospheres with Hall magnetohydrodynamics results in program sizes approaching several hundred million grid cells. These would take years to run on a single computational core and require hundreds or thousands of computational cores to complete in a reasonable time. However, this requires access to the largest supercomputers. Graphics processing units (GPUs) provide a viable alternative: one GPU can do the work of roughly 100 cores, bringing Hall MHD simulations of Ganymede within reach of modest GPU clusters (8 GPUs). We report our progress in developing a GPU-accelerated, three-dimensional Hall magnetohydrodynamic code and present Hall MHD simulation results for both Ganymede (run on 8 GPUs) and Mercury (56 GPUs). We benchmark our Ganymede simulation with previous results for the Galileo G8 flyby, namely that adding the Hall term to ideal MHD simulations changes the global convection pattern within the magnetosphere. Additionally, we present new results for the G1 flyby as well as initial results from Hall MHD simulations of Mercury and compare them with the corresponding ideal MHD runs.
Mechanical Analysis of W78/88-1 Life Extension Program Warhead Design Options
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spencer, Nathan
2014-09-01
Life Extension Program (LEP) is a program to repair/replace components of nuclear weapons to ensure the ability to meet military requirements. The W78/88-1 LEP encompasses the modernization of two major nuclear weapon reentry systems into an interoperable warhead. Several design concepts exist to provide different options for robust safety and security themes, maximum non-nuclear commonality, and cost. Simulation is one capability used to evaluate the mechanical performance of the designs in various operational environments, plan for system and component qualification efforts, and provide insight into the survivability of the warhead in environments that are not currently testable. The simulation efforts use several Sandia-developed tools through the Advanced Simulation and Computing program, including Cubit for mesh generation, the DART Model Manager, SIERRA codes running on the HPC TLCC2 platforms, DAKOTA, and ParaView. Several programmatic objectives were met using the simulation capability, including: (1) providing early environmental specification estimates that may be used by component designers to understand the severity of the loads their components will need to survive, (2) providing guidance for load levels and configurations for subassembly tests intended to represent operational environments, and (3) recommending design options including modified geometry and material properties. These objectives were accomplished through regular interactions with component, system, and test engineers while using the laboratory's computational infrastructure to effectively perform ensembles of simulations. Because NNSA has decided to defer the LEP program, simulation results are being documented and models are being archived for future reference. However, some advanced and exploratory efforts will continue to mature key technologies, using the results from these and ongoing simulations for design insights, test planning, and model validation.
WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPAS-nBio
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNamara, A; Held, K; Paganetti, H
2016-06-15
Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle, and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g., cells and organelles) to complex nano-scale geometries (e.g., DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprised of nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g., histones or RNA. TOPAS-nBio has been validated by comparing results to other track structure simulation software and published experimental measurements. Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex simulations.
Operating system for a real-time multiprocessor propulsion system simulator
NASA Technical Reports Server (NTRS)
Cole, G. L.
1984-01-01
The Real Time Multiprocessor Operating System (RTMPOS) was evaluated for its success in the development and evaluation of experimental hardware and software systems for real-time interactive simulation of air-breathing propulsion systems. The RTMPOS provides the user with a versatile, interactive means for loading, running, debugging, and obtaining results from a multiprocessor-based simulator. A front-end processor (FEP) serves as the simulator controller and interface between the user and the simulator. These functions are facilitated by the RTMPOS, which resides on the FEP. The RTMPOS acts in conjunction with the FEP's manufacturer-supplied disk operating system, which provides typical utilities such as an assembler, linkage editor, text editor, and file handling services. Once a simulation is formulated, the RTMPOS provides for engineering-level, run-time operations such as loading, modifying and specifying computation flow of programs, simulator mode control, data handling, and run-time monitoring. Run-time monitoring is a powerful feature of RTMPOS that allows the user to record all actions taken during a simulation session and to receive advisories from the simulator via the FEP. The RTMPOS is programmed mainly in PASCAL along with some assembly language routines. The RTMPOS software is easily modified to be applicable to hardware from different manufacturers.
Developing a Theory-Based Simulation Educator Resource.
Thomas, Christine M; Sievers, Lisa D; Kellgren, Molly; Manning, Sara J; Rojas, Deborah E; Gamblian, Vivian C
2015-01-01
The NLN Leadership Development Program for Simulation Educators 2014 faculty development group identified a lack of a common language/terminology to outline the progression of expertise of simulation educators. The group analyzed Benner's novice-to-expert model and applied its levels of experience to simulation educator growth. It established common operational categories of faculty development and used them to organize resources that support progression toward expertise. The resulting theory-based Simulator Educator Toolkit outlines levels of ability and provides quality resources to meet the diverse needs of simulation educators and team members.
Azarnoush, Hamed; Siar, Samaneh; Sawaya, Robin; Zhrani, Gmaan Al; Winkler-Schwartz, Alexander; Alotaibi, Fahad Eid; Bugdadi, Abdulgadir; Bajunaid, Khalid; Marwa, Ibrahim; Sabbagh, Abdulrahman Jafar; Del Maestro, Rolando F
2017-07-01
OBJECTIVE Virtual reality simulators allow development of novel methods to analyze neurosurgical performance. The concept of a force pyramid is introduced as a Tier 3 metric with the ability to provide visual and spatial analysis of 3D force application by any instrument used during simulated tumor resection. This study was designed to answer 3 questions: 1) Do study groups have distinct force pyramids? 2) Do handedness and ergonomics influence force pyramid structure? 3) Are force pyramids dependent on the visual and haptic characteristics of simulated tumors? METHODS Using a virtual reality simulator, NeuroVR (formerly NeuroTouch), ultrasonic aspirator force application was continually assessed during resection of simulated brain tumors by neurosurgeons, residents, and medical students. The participants performed simulated resections of 18 simulated brain tumors with different visual and haptic characteristics. The raw data, namely, coordinates of the instrument tip as well as contact force values, were collected by the simulator. To provide a visual and qualitative spatial analysis of forces, the authors created a graph, called a force pyramid, representing force sum along the z-coordinate for different xy coordinates of the tool tip. RESULTS Sixteen neurosurgeons, 15 residents, and 84 medical students participated in the study. Neurosurgeon, resident and medical student groups displayed easily distinguishable 3D "force pyramid fingerprints." Neurosurgeons had the lowest force pyramids, indicating application of the lowest forces, followed by resident and medical student groups. Handedness, ergonomics, and visual and haptic tumor characteristics resulted in distinct well-defined 3D force pyramid patterns. CONCLUSIONS Force pyramid fingerprints provide 3D spatial assessment displays of instrument force application during simulated tumor resection. Neurosurgeon force utilization and ergonomic data form a basis for understanding and modulating resident force application and improving patient safety during tumor resection.
Proposal of a micromagnetic standard problem for ferromagnetic resonance simulations
NASA Astrophysics Data System (ADS)
Baker, Alexander; Beg, Marijan; Ashton, Gregory; Albert, Maximilian; Chernyshenko, Dmitri; Wang, Weiwei; Zhang, Shilei; Bisotti, Marc-Antonio; Franchin, Matteo; Hu, Chun Lian; Stamps, Robert; Hesjedal, Thorsten; Fangohr, Hans
2017-01-01
Nowadays, micromagnetic simulations are a common tool for studying a wide range of different magnetic phenomena, including the ferromagnetic resonance. A technique for evaluating reliability and validity of different micromagnetic simulation tools is the simulation of proposed standard problems. We propose a new standard problem by providing a detailed specification and analysis of a sufficiently simple problem. By analyzing the magnetization dynamics in a thin permalloy square sample, triggered by a well defined excitation, we obtain the ferromagnetic resonance spectrum and identify the resonance modes via Fourier transform. Simulations are performed using both finite difference and finite element numerical methods, with OOMMF and Nmag simulators, respectively. We report the effects of initial conditions and simulation parameters on the character of the observed resonance modes for this standard problem. We provide detailed instructions and code to assist in using the results for evaluation of new simulator tools, and to help with numerical calculation of ferromagnetic resonance spectra and modes in general.
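A hedged sketch of recovering a resonance spectrum from a ringdown by Fourier transform, the analysis step described above, is shown below. The damped sinusoid is a stand-in for the spatially averaged magnetization produced by OOMMF or Nmag, and the sampling interval, damping time, and mode frequency are assumptions.

    # FFT of a simulated magnetization ringdown to locate the resonance peak.
    import numpy as np

    dt = 5e-12                                   # 5 ps sampling interval (assumed)
    t = np.arange(0, 20e-9, dt)
    m_y = np.exp(-t / 4e-9) * np.sin(2 * np.pi * 8.25e9 * t)   # synthetic ~8.25 GHz mode

    spectrum = np.abs(np.fft.rfft(m_y)) ** 2
    freqs = np.fft.rfftfreq(m_y.size, d=dt)
    print("resonance at ~%.2f GHz" % (freqs[np.argmax(spectrum)] / 1e9))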
A Lunar Surface Operations Simulator
NASA Technical Reports Server (NTRS)
Nayar, H.; Balaram, J.; Cameron, J.; Jain, A.; Lim, C.; Mukherjee, R.; Peters, S.; Pomerantz, M.; Reder, L.; Shakkottai, P.;
2008-01-01
The Lunar Surface Operations Simulator (LSOS) is being developed to support planning and design of space missions to return astronauts to the moon. Vehicles, habitats, dynamic and physical processes and related environment systems are modeled and simulated in LSOS to assist in the visualization and design optimization of systems for lunar surface operations. A parametric analysis tool and a data browser were also implemented to provide an intuitive interface to run multiple simulations and review their results. The simulator and parametric analysis capability are described in this paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keller, J.; Lacava, W.; Austin, J.
2015-02-01
This work investigates the minimum level of fidelity required to accurately simulate wind turbine gearboxes using state-of-the-art design tools. Excessive model fidelity, including drivetrain complexity, gearbox complexity, excitation sources, and imperfections, significantly increases computational time but may not provide a commensurate increase in the value of the results. Essential design parameters are evaluated, including the planetary load-sharing factor, gear tooth load distribution, and sun orbit motion. Based on the sensitivity study results, recommendations for the minimum model fidelities are provided.
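As a hedged illustration of one of the design parameters named above, the sketch below computes a planetary load-sharing factor as the maximum planet load relative to the mean planet load; the loads are invented and this simple definition may differ from the one used in the report.

    # Planetary load-sharing factor from per-planet loads (invented values).
    loads_kN = [112.0, 118.0, 105.0]                         # three planets, assumed
    k_gamma = max(loads_kN) * len(loads_kN) / sum(loads_kN)  # max load / mean load
    print(f"load-sharing factor K = {k_gamma:.3f}")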
The implementation of sea ice model on a regional high-resolution scale
NASA Astrophysics Data System (ADS)
Prasad, Siva; Zakharov, Igor; Bobby, Pradeep; McGuire, Peter
2015-09-01
The availability of high-resolution atmospheric/ocean forecast models, satellite data and access to high-performance computing clusters have provided capability to build high-resolution models for regional ice condition simulation. The paper describes the implementation of the Los Alamos sea ice model (CICE) on a regional scale at high resolution. The advantage of the model is its ability to include oceanographic parameters (e.g., currents) to provide accurate results. The sea ice simulation was performed over Baffin Bay and the Labrador Sea to retrieve important parameters such as ice concentration, thickness, ridging, and drift. Two different forcing models, one with low resolution and another with a high resolution, were used for the estimation of sensitivity of model results. Sea ice behavior over 7 years was simulated to analyze ice formation, melting, and conditions in the region. Validation was based on comparing model results with remote sensing data. The simulated ice concentration correlated well with Advanced Microwave Scanning Radiometer for EOS (AMSR-E) and Ocean and Sea Ice Satellite Application Facility (OSI-SAF) data. Visual comparison of ice thickness trends estimated from the Soil Moisture and Ocean Salinity satellite (SMOS) agreed with the simulation for year 2010-2011.
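A hedged sketch of the correlation check between simulated and satellite-derived ice concentration mentioned above is given below; both series here are synthetic stand-ins for CICE output and AMSR-E/OSI-SAF retrievals.

    # Pearson correlation between simulated and observed ice concentration (synthetic data).
    import numpy as np

    rng = np.random.default_rng(3)
    obs_conc = np.clip(np.linspace(0.1, 0.95, 60) + rng.normal(0, 0.05, 60), 0, 1)
    sim_conc = np.clip(obs_conc + rng.normal(0, 0.08, 60), 0, 1)

    r = np.corrcoef(sim_conc, obs_conc)[0, 1]
    print(f"Pearson correlation: {r:.2f}")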
NASA Astrophysics Data System (ADS)
Cai, Congbo; Dong, Jiyang; Cai, Shuhui; Cheng, En; Chen, Zhong
2006-11-01
Intermolecular multiple quantum coherences (iMQCs) have many potential applications since they can provide interaction information between different molecules within the range of the dipolar correlation distance, and can provide new contrast in magnetic resonance imaging (MRI). Because of the non-localized nature of the dipolar field and the non-linearity of the Bloch equations incorporating the dipolar field term, the evolution behavior of iMQCs is difficult to deduce rigorously in many cases. In such cases, simulation studies are very important. Simulation results can not only give a guide to optimize experimental conditions, but also help analyze unexpected experimental results. Based on our product operator matrix and the K-space method for dipolar field calculation, MRI simulation software was constructed, running on the Windows operating system. The non-linear Bloch equations are calculated by a fifth-order Cash-Karp Runge-Kutta formalism. Computational time can be efficiently reduced by separating the effects of chemical shifts and the strong gradient field. Using this software, simulation of different kinds of complex MRI sequences can be done conveniently and quickly on general personal computers. Some examples are given and the results discussed.
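A hedged sketch of integrating the Bloch equations with an adaptive Runge-Kutta solver is shown below, without the dipolar field term. The software described above uses a fifth-order Cash-Karp scheme; SciPy's "RK45" (Dormand-Prince) is used here only as a readily available stand-in, and the field, relaxation times, and initial magnetization are assumptions.

    # Integrate the (linear, dipolar-field-free) Bloch equations with relaxation.
    import numpy as np
    from scipy.integrate import solve_ivp

    gamma = 2 * np.pi * 42.58e6        # proton gyromagnetic ratio, rad s^-1 T^-1
    B = np.array([0.0, 0.0, 1e-6])     # small offset field in the rotating frame (assumed)
    T1, T2, M0 = 1.0, 0.1, 1.0

    def bloch(t, M):
        dM = gamma * np.cross(M, B)    # precession
        dM[0] -= M[0] / T2             # transverse relaxation
        dM[1] -= M[1] / T2
        dM[2] -= (M[2] - M0) / T1      # longitudinal relaxation
        return dM

    sol = solve_ivp(bloch, (0.0, 0.5), [1.0, 0.0, 0.0], method="RK45", max_step=1e-3)
    print("Mxy at t=0.5 s:", np.hypot(sol.y[0, -1], sol.y[1, -1]))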
Wang, Dongwen
2017-01-01
We analyzed four interactive case simulation tools (ICSTs) from a statewide online clinical education program. Results have shown that ICSTs are increasingly used by HIV healthcare providers. Smartphones have become the primary usage platform for specific ICSTs. Usage patterns depend on particular ICST modules, usage stages, and use contexts. Future design of ICSTs should consider these usage patterns for more effective dissemination of clinical evidence to healthcare providers.
NASA Astrophysics Data System (ADS)
Creech, Angus; Früh, Wolf-Gerrit; Maguire, A. Eoghan
2015-05-01
We present here a computational fluid dynamics (CFD) simulation of Lillgrund offshore wind farm, which is located in the Øresund Strait between Sweden and Denmark. The simulation combines a dynamic representation of wind turbines embedded within a large-eddy simulation CFD solver and uses hr-adaptive meshing to increase or decrease mesh resolution where required. This allows the resolution of both large-scale flow structures around the wind farm, and the local flow conditions at individual turbines; consequently, the response of each turbine to local conditions can be modelled, as well as the resulting evolution of the turbine wakes. This paper provides a detailed description of the turbine model which simulates the interaction between the wind, the turbine rotors, and the turbine generators by calculating the forces on the rotor, the body forces on the air, and instantaneous power output. This model was used to investigate a selection of key wind speeds and directions, investigating cases where a row of turbines would be fully aligned with the wind or at specific angles to the wind. Results shown here include presentations of the spin-up of turbines, the observation of eddies moving through the turbine array, meandering turbine wakes, and an extensive wind farm wake several kilometres in length. The key measurement available for cross-validation with operational wind farm data is the power output from the individual turbines, where the effect of unsteady turbine wakes on the performance of downstream turbines was a main point of interest. The results from the simulations were compared to the performance measurements from the real wind farm to provide a firm quantitative validation of this methodology. Having achieved good agreement between the model results and actual wind farm measurements, the potential of the methodology to provide a tool for further investigations of engineering and atmospheric science problems is outlined.
Virtual Reality Cerebral Aneurysm Clipping Simulation With Real-time Haptic Feedback
Alaraj, Ali; Luciano, Cristian J.; Bailey, Daniel P.; Elsenousi, Abdussalam; Roitberg, Ben Z.; Bernardo, Antonio; Banerjee, P. Pat; Charbel, Fady T.
2014-01-01
Background With the decrease in the number of cerebral aneurysms treated surgically and the increase of complexity of those treated surgically, there is a need for simulation-based tools to teach future neurosurgeons the operative techniques of aneurysm clipping. Objective To develop and evaluate the usefulness of a new haptic-based virtual reality (VR) simulator in the training of neurosurgical residents. Methods A real-time sensory haptic feedback virtual reality aneurysm clipping simulator was developed using the Immersive Touch platform. A prototype middle cerebral artery aneurysm simulation was created from a computed tomography angiogram. Aneurysm and vessel volume deformation and haptic feedback are provided in a 3-D immersive VR environment. Intraoperative aneurysm rupture was also simulated. Seventeen neurosurgery residents from three residency programs tested the simulator and provided feedback on its usefulness and resemblance to real aneurysm clipping surgery. Results Residents felt that the simulation would be useful in preparing for real-life surgery. About two thirds of the residents felt that the 3-D immersive anatomical details provided a very close resemblance to real operative anatomy and accurate guidance for deciding surgical approaches. They believed the simulation is useful for preoperative surgical rehearsal and neurosurgical training. One third of the residents felt that the technology in its current form provided very realistic haptic feedback for aneurysm surgery. Conclusion Neurosurgical residents felt that the novel immersive VR simulator is helpful in their training especially since they do not get a chance to perform aneurysm clippings until very late in their residency programs. PMID:25599200
Granato, Gregory E.; Jones, Susan C.
2015-01-01
Results of this study indicate the potential benefits of the multi-decade simulations that SELDM provides because these simulations quantify risks and uncertainties that affect decisions made with available data and statistics. Results of the SELDM simulations indicate that the WQABI criteria concentrations may be too stringent for evaluating the stormwater quality in receiving streams, highway runoff, and BMP discharges; especially with the substantial uncertainties inherent in selecting representative data.
Topological analysis of nuclear pasta phases
NASA Astrophysics Data System (ADS)
Kycia, Radosław A.; Kubis, Sebastian; Wójcik, Włodzimierz
2017-08-01
In this article, an analysis of the results of numerical simulations of pasta phases using algebraic topology methods is presented. These considerations suggest that some phases can be further split into subphases and therefore should be resolved more finely in numerical simulations. The results presented in this article can also be used to relate the Euler characteristic from numerical simulations to the geometry of the phases. The Betti numbers are used because they provide a finer characterization of the phases. It is also shown that different boundary conditions give different outcomes.
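For reference, the standard relation between the Euler characteristic and the Betti numbers of a three-dimensional structure, which underlies the kind of refinement described above (a generic statement, not a result specific to this paper):

```latex
% Standard relation (generic; not specific to this paper) between the Euler
% characteristic and the Betti numbers of a three-dimensional structure:
\[
  \chi \;=\; b_0 - b_1 + b_2 ,
\]
% where b_0 counts connected components, b_1 independent tunnels (handles), and
% b_2 enclosed voids. Two configurations with equal \chi can have different
% (b_0, b_1, b_2), which is why the Betti numbers give a finer characterization
% of the phases.
```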
APS undulator and wiggler sources: Monte-Carlo simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, S.L.; Lai, B.; Viccaro, P.J.
1992-02-01
Standard insertion devices will be provided to each sector by the Advanced Photon Source. It is important to define the radiation characteristics of these general-purpose devices. In this document, results of Monte Carlo simulations are presented. These results, based on the SHADOW program, include the APS Undulator A (UA), Wiggler A (WA), and Wiggler B (WB).
Mass imbalances in EPANET water-quality simulations
NASA Astrophysics Data System (ADS)
Davis, Michael J.; Janke, Robert; Taxon, Thomas N.
2018-04-01
EPANET is widely employed to simulate water quality in water distribution systems. However, in general, the time-driven simulation approach used to determine concentrations of water-quality constituents provides accurate results only for short water-quality time steps. Overly long time steps can yield errors in concentration estimates and can result in situations in which constituent mass is not conserved. The use of a time step that is sufficiently short to avoid these problems may not always be feasible. The absence of EPANET errors or warnings does not ensure conservation of mass. This paper provides examples illustrating mass imbalances and explains how such imbalances can occur because of fundamental limitations in the water-quality routing algorithm used in EPANET. In general, these limitations cannot be overcome by the use of improved water-quality modeling practices. This paper also presents a preliminary event-driven approach that conserves mass with a water-quality time step that is as long as the hydraulic time step. Results obtained using the current approach converge, or tend to converge, toward those obtained using the preliminary event-driven approach as the water-quality time step decreases. Improving the water-quality routing algorithm used in EPANET could eliminate mass imbalances and related errors in estimated concentrations. The results presented in this paper should be of value to those who perform water-quality simulations using EPANET or use the results of such simulations, including utility managers and engineers.
NASA Technical Reports Server (NTRS)
Queen, Eric M.; Omara, Thomas M.
1990-01-01
A realization of a stochastic atmosphere model for use in simulations is presented. The model provides pressure, density, temperature, and wind velocity as a function of latitude, longitude, and altitude, and is implemented in a three degree of freedom simulation package. This implementation is used in the Monte Carlo simulation of an aeroassisted orbital transfer maneuver and results are compared to those of a more traditional approach.
Use of Simulation to Gauge Preparedness for Ebola at a Free-Standing Children's Hospital.
Biddell, Elizabeth A; Vandersall, Brian L; Bailes, Stephanie A; Estephan, Stephanie A; Ferrara, Lori A; Nagy, Kristine M; O'Connell, Joyce L; Patterson, Mary D
2016-04-01
On October 10, 2014, a health care worker exposed to Ebola traveled to Akron, OH, where she became symptomatic. The resulting response by local public health agencies and health care organizations was unequalled in our region. The day this information was announced, the emergency disaster response was activated at our hospital. The simulation center had 12 hours to prepare simulations to evaluate hospital preparedness should a patient screen positive for Ebola exposure. The team developed hybrid simulation scenarios using standardized patients, mannequin simulators, and task trainers to assess hospital preparedness in the emergency department, transport team, pediatric intensive care unit, and for interdepartmental transfers. These simulations were multidisciplinary and demonstrated gaps in the system that could expose staff to Ebola. The results of these simulations were provided rapidly to the administration. Further simulation cycles were used during the next 2 weeks to identify additional gaps and to evaluate possible solutions.
Longitudinal train dynamics model for a rail transit simulation system
Wang, Jinghui; Rakha, Hesham A.
2018-01-01
The paper develops a longitudinal train dynamics model in support of microscopic railway transportation simulation. The model can be calibrated without any mechanical data, making it ideal for implementation in transportation simulators. The calibration and validation work is based on data collected from the Portland light rail train fleet. The calibration procedure is mathematically formulated as a constrained non-linear optimization problem. The validity of the model is assessed by comparing instantaneous model predictions against field observations, and also evaluated in the domains of acceleration/deceleration versus speed and acceleration/deceleration versus distance. A test is conducted to investigate the adequacy of the model in simulation implementation. The results demonstrate that the proposed model can adequately capture instantaneous train dynamics, and provides good performance in the simulation test. Thus, the model provides a simple theoretical foundation for microscopic simulators and will significantly support the planning, management and control of railway transportation systems.
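As an illustration of formulating calibration as a constrained non-linear optimization problem, the sketch below fits a simple point-mass acceleration model to synthetic (speed, acceleration) observations with a bound-constrained optimizer. The model form, parameter names, and data are assumptions for demonstration only, not the calibrated Portland light-rail model.

```python
# Illustrative sketch only: fitting a simple point-mass train dynamics model
# a(v) = min(a_max, P/(m*v)) - (c0 + c1*v + c2*v^2)/m to observed (speed,
# acceleration) pairs as a bound-constrained nonlinear least-squares problem.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
v_obs = rng.uniform(1.0, 25.0, 200)                      # observed speeds [m/s]
mass = 100e3                                             # assumed train mass [kg]

def model_accel(v, p):
    a_max, power, c0, c1, c2 = p
    tractive = np.minimum(a_max, power / (mass * v))     # power-limited traction
    resistance = (c0 + c1 * v + c2 * v**2) / mass        # Davis-type resistance
    return tractive - resistance

true_p = np.array([1.1, 800e3, 2e3, 40.0, 6.0])          # "unknown" parameters
a_obs = model_accel(v_obs, true_p) + rng.normal(0.0, 0.02, v_obs.size)

def sse(p):
    return np.sum((model_accel(v_obs, p) - a_obs) ** 2)

bounds = [(0.1, 2.0), (1e5, 2e6), (0.0, 1e4), (0.0, 200.0), (0.0, 20.0)]
fit = minimize(sse, x0=[0.8, 5e5, 1e3, 10.0, 1.0], bounds=bounds, method="L-BFGS-B")
print(fit.x)   # fitted parameters (approximate recovery of true_p)
```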
Stone, John E.; Hynninen, Antti-Pekka; Phillips, James C.; Schulten, Klaus
2017-01-01
All-atom molecular dynamics simulations of biomolecules provide a powerful tool for exploring the structure and dynamics of large protein complexes within realistic cellular environments. Unfortunately, such simulations are extremely demanding in terms of their computational requirements, and they present many challenges in terms of preparation, simulation methodology, and analysis and visualization of results. We describe our early experiences porting the popular molecular dynamics simulation program NAMD and the simulation preparation, analysis, and visualization tool VMD to GPU-accelerated OpenPOWER hardware platforms. We report our experiences with compiler-provided autovectorization and compare with hand-coded vector intrinsics for the POWER8 CPU. We explore the performance benefits obtained from unique POWER8 architectural features such as 8-way SMT and its value for particular molecular modeling tasks. Finally, we evaluate the performance of several GPU-accelerated molecular modeling kernels and relate them to other hardware platforms. PMID:29202130
Multibody Modeling and Simulation for the Mars Phoenix Lander Entry, Descent and Landing
NASA Technical Reports Server (NTRS)
Queen, Eric M.; Prince, Jill L.; Desai, Prasun N.
2008-01-01
A multi-body flight simulation for the Phoenix Mars Lander has been developed that includes high fidelity six degree-of-freedom rigid-body models for the parachute and lander system. The simulation provides attitude and rate history predictions of all bodies throughout the flight, as well as loads on each of the connecting lines. In so doing, a realistic behavior of the descending parachute/lander system dynamics can be simulated that allows assessment of the Phoenix descent performance and identification of potential sensitivities for landing. This simulation provides a complete end-to-end capability of modeling the entire entry, descent, and landing sequence for the mission. Time histories of the parachute and lander aerodynamic angles are presented. The response of the lander system to various wind models and wind shears is shown to be acceptable. Monte Carlo simulation results are also presented.
CVT/PCS phase 1 integrated testing
NASA Technical Reports Server (NTRS)
Mcbrayer, R. O.; Steadman, J. D.
1973-01-01
Five breadboard experiments representing three Sortie Lab experiment disciplines were installed in a payload carrier simulator. A description of the experiments and the payload carrier simulator was provided. An assessment of the experiment interface with the simulator and an assessment of the simulator experiment support systems were presented. The results indicate that a hardware integrator for each experiment is essential; a crew chief, or mission specialist, for systems management and experimenter liaison is a vital function; a payload specialist is a practical concept for experiment integration and operation; an integration fixture for a complex experiment is required to efficiently integrate the experiment and carrier; simultaneous experiment utilization of simulator systems caused unexpected problems in meeting individual experiment requirements; experimenter traffic inside the dual-floor simulator did not hamper experiment operations; and the requirement for zero-g operation will provide a significant design challenge for some experiments.
SIM_EXPLORE: Software for Directed Exploration of Complex Systems
NASA Technical Reports Server (NTRS)
Burl, Michael; Wang, Esther; Enke, Brian; Merline, William J.
2013-01-01
Physics-based numerical simulation codes are widely used in science and engineering to model complex systems that would be infeasible to study otherwise. While such codes may provide the highest- fidelity representation of system behavior, they are often so slow to run that insight into the system is limited. Trying to understand the effects of inputs on outputs by conducting an exhaustive grid-based sweep over the input parameter space is simply too time-consuming. An alternative approach called "directed exploration" has been developed to harvest information from numerical simulators more efficiently. The basic idea is to employ active learning and supervised machine learning to choose cleverly at each step which simulation trials to run next based on the results of previous trials. SIM_EXPLORE is a new computer program that uses directed exploration to explore efficiently complex systems represented by numerical simulations. The software sequentially identifies and runs simulation trials that it believes will be most informative given the results of previous trials. The results of new trials are incorporated into the software's model of the system behavior. The updated model is then used to pick the next round of new trials. This process, implemented as a closed-loop system wrapped around existing simulation code, provides a means to improve the speed and efficiency with which a set of simulations can yield scientifically useful results. The software focuses on the case in which the feedback from the simulation trials is binary-valued, i.e., the learner is only informed of the success or failure of the simulation trial to produce a desired output. The software offers a number of choices for the supervised learning algorithm (the method used to model the system behavior given the results so far) and a number of choices for the active learning strategy (the method used to choose which new simulation trials to run given the current behavior model). The software also makes use of the LEGION distributed computing framework to leverage the power of a set of compute nodes. The approach has been demonstrated on a planetary science application in which numerical simulations are used to study the formation of asteroid families.
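A generic sketch of the directed-exploration loop described above follows: a surrogate classifier is fit to the binary success/failure outcomes of past trials, and the next trial is placed where the surrogate is most uncertain. The toy simulator, candidate grid, and classifier choice are illustrative assumptions; this is not SIM_EXPLORE's code or its LEGION integration.

```python
# Generic sketch of the directed-exploration idea (not SIM_EXPLORE itself):
# a surrogate classifier is fit to the binary success/failure outcomes of past
# simulation trials, and the next trial is placed where the surrogate is most
# uncertain. The toy "simulator" below is a stand-in for a real code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def run_simulation(x):
    """Stand-in simulator: 'success' if the inputs fall inside a hidden disc."""
    return int(np.sum((x - 0.6) ** 2) < 0.1)

candidates = rng.uniform(0.0, 1.0, size=(2000, 2))           # candidate input settings
X_tried = [np.array([0.6, 0.6]), np.array([0.05, 0.05])]     # seed trials (one success, one failure)
y_tried = [run_simulation(x) for x in X_tried]

for _ in range(40):                                          # directed-exploration loop
    surrogate = RandomForestClassifier(n_estimators=100, random_state=0)
    surrogate.fit(np.vstack(X_tried), y_tried)
    p_success = surrogate.predict_proba(candidates)[:, 1]
    nxt = candidates[np.argmax(-np.abs(p_success - 0.5))]    # most uncertain candidate
    X_tried.append(nxt)
    y_tried.append(run_simulation(nxt))

print(f"{sum(y_tried)} successes observed in {len(y_tried)} trials")
```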
Flash Infrared Thermography Contrast Data Analysis Technique
NASA Technical Reports Server (NTRS)
Koshti, Ajay
2014-01-01
This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
NASA Technical Reports Server (NTRS)
Hailperin, M.
1993-01-01
This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the authors' techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The authors' method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.
VERA Core Simulator methodology for pressurized water reactor cycle depletion
Kochunas, Brendan; Collins, Benjamin; Stimpson, Shane; ...
2017-01-12
This paper describes the methodology developed and implemented in the Virtual Environment for Reactor Applications Core Simulator (VERA-CS) to perform high-fidelity, pressurized water reactor (PWR), multicycle, core physics calculations. Depletion of the core with pin-resolved power and nuclide detail is a significant advance in the state of the art for reactor analysis, providing the level of detail necessary to address the problems of the U.S. Department of Energy Nuclear Reactor Simulation Hub, the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS has three main components: the neutronics solver MPACT, the thermal-hydraulic (T-H) solver COBRA-TF (CTF), and the nuclide transmutation solver ORIGEN. This paper focuses on MPACT and provides an overview of the resonance self-shielding methods, macroscopic-cross-section calculation, two-dimensional/one-dimensional (2-D/1-D) transport, nuclide depletion, T-H feedback, and other supporting methods representing a minimal set of the capabilities needed to simulate high-fidelity models of a commercial nuclear reactor. Results are presented from the simulation of a model of the first cycle of Watts Bar Unit 1. The simulation is within 16 parts per million boron (ppmB) reactivity for all state points compared to cycle measurements, with an average reactivity bias of <5 ppmB for the entire cycle. Comparisons to cycle 1 flux map data are also provided, and the average 2-D root-mean-square (rms) error during cycle 1 is 1.07%. To demonstrate the multicycle capability, a state point at beginning of cycle (BOC) 2 was also simulated and compared to plant data. The comparison of the cycle 2 BOC state has a reactivity difference of +3 ppmB from measurement, and the 2-D rms of the comparison in the flux maps is 1.77%. Lastly, these results provide confidence in VERA-CS's capability to perform high-fidelity calculations for practical PWR reactor problems.
Monte Carlo method for photon heating using temperature-dependent optical properties.
Slade, Adam Broadbent; Aguilar, Guillermo
2015-02-01
The Monte Carlo method for photon transport is often used to predict the volumetric heating that an optical source will induce inside a tissue or material. This method relies on constant (with respect to temperature) optical properties, specifically the coefficients of scattering and absorption. In reality, optical coefficients are typically temperature-dependent, leading to error in simulation results. The purpose of this study is to develop a method that can incorporate variable properties and accurately simulate systems where the temperature will greatly vary, such as in the case of laser-thawing of frozen tissues. A numerical simulation was developed that utilizes the Monte Carlo method for photon transport to simulate the thermal response of a system that allows temperature-dependent optical and thermal properties. This was done by combining traditional Monte Carlo photon transport with a heat transfer simulation to provide a feedback loop that selects local properties based on current temperatures, for each moment in time. Additionally, photon steps are segmented to accurately obtain path lengths within a homogenous (but not isothermal) material. Validation of the simulation was done using comparisons to established Monte Carlo simulations using constant properties, and a comparison to the Beer-Lambert law for temperature-variable properties. The simulation is able to accurately predict the thermal response of a system whose properties can vary with temperature. The difference in results between variable-property and constant property methods for the representative system of laser-heated silicon can become larger than 100K. This simulation will return more accurate results of optical irradiation absorption in a material which undergoes a large change in temperature. This increased accuracy in simulated results leads to better thermal predictions in living tissues and can provide enhanced planning and improved experimental and procedural outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
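A heavily simplified sketch of the coupling idea is shown below: at each time step the absorption coefficient is re-evaluated from the current temperature field, energy is deposited, and a conduction step updates the temperatures. Beer-Lambert deposition stands in for full Monte Carlo photon transport, and all material properties and the mu_a(T) law are illustrative assumptions rather than values from the study.

```python
# Simplified sketch of the property-feedback idea (not the authors' Monte Carlo
# code): each step, the local absorption coefficient is re-evaluated from the
# current temperatures, laser energy is deposited (Beer-Lambert in place of
# photon transport), and a 1-D explicit conduction step updates the field.
import numpy as np

nz, dz, dt = 200, 5e-5, 1e-3            # grid cells, cell size [m], time step [s]
rho, cp, k = 1000.0, 3600.0, 0.5        # density, heat capacity, conductivity (tissue-like guesses)
irradiance = 5e4                        # incident irradiance [W/m^2]
T = np.full(nz, 270.0)                  # initial temperature field [K] (frozen)

def mu_a(temp):
    """Illustrative temperature-dependent absorption coefficient [1/m]."""
    return np.where(temp < 273.15, 300.0, 120.0)   # frozen vs. thawed

alpha = k / (rho * cp)
assert alpha * dt / dz**2 < 0.5         # explicit-scheme stability check

for _ in range(2000):                   # 2 s of heating
    mua = mu_a(T)
    transmitted = irradiance * np.exp(-np.cumsum(mua) * dz)   # Beer-Lambert attenuation
    absorbed = np.empty(nz)
    absorbed[0] = irradiance - transmitted[0]
    absorbed[1:] = transmitted[:-1] - transmitted[1:]          # W/m^2 absorbed per cell
    source = absorbed / dz                                     # volumetric heating [W/m^3]
    lap = np.zeros(nz)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dz**2
    T = T + dt * (alpha * lap + source / (rho * cp))

print(f"surface temperature after 2 s: {T[0]:.1f} K")
```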
SIMSAT: An object oriented architecture for real-time satellite simulation
NASA Technical Reports Server (NTRS)
Williams, Adam P.
1993-01-01
Real-time satellite simulators are vital tools in the support of satellite missions. They are used in the testing of ground control systems, the training of operators, the validation of operational procedures, and the development of contingency plans. The simulators must provide high-fidelity modeling of the satellite, which requires detailed system information, much of which is not available until relatively near launch. The short time-scales and resulting high productivity required of such simulator developments culminates in the need for a reusable infrastructure which can be used as a basis for each simulator. This paper describes a major new simulation infrastructure package, the Software Infrastructure for Modelling Satellites (SIMSAT). It outlines the object oriented design methodology used, describes the resulting design, and discusses the advantages and disadvantages experienced in applying the methodology.
NASA Technical Reports Server (NTRS)
Nayagam, Vedha; Berger, Gordon M.; Sacksteder, Kurt R.; Paz, Aaron
2012-01-01
Extraction of mission consumable resources such as water and oxygen from the planetary environment provides valuable reduction in launch-mass and potentially extends the mission duration. Processing of lunar regolith for resource extraction necessarily involves heating and chemical reaction of solid material with processing gases. Vibrofluidization is known to produce effective mixing and control of flow within granular media. In this study we present experimental results for vibrofluidized heat transfer in lunar regolith simulants (JSC-1 and JSC-1A) heated up to 900 C. The results show that the simulant bed height has a significant influence on the vibration induced flow field and heat transfer rates. A taller bed height leads to a two-cell circulation pattern whereas a single-cell circulation was observed for a shorter height. Lessons learned from these test results should provide insight into efficient design of future robotic missions involving In-Situ Resource Utilization.
ERIC Educational Resources Information Center
Njoo, Melanie; de Jong, Ton
This paper contains the results of a study on the importance of discovery learning using computer simulations. The purpose of the study was to identify what constitutes discovery learning and to assess the effects of instructional support measures. College students were observed working with an assignment and a computer simulation in the domain of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hojin; Strachan, Alejandro
2015-11-28
We use large-scale molecular dynamics (MD) to characterize fluid damping between a substrate and an approaching beam. We focus on the near contact regime where squeeze film (where the fluid gap is comparable to the mean free path of the gas molecules) and many-body effects in the fluid become dominant. The MD simulations provide explicit description of many-body and non-equilibrium processes in the fluid as well as the surface topography. We study how surface roughness and beam width increase the damping coefficient due to their effect on fluid mobility. We find that the explicit simulations are in good agreement with prior direct simulation Monte Carlo results except at near-contact conditions where many-body effects in the compressed fluid lead to increased damping and weaker dependence on beam width. We also show that velocity distributions near the beam edges and for short gaps deviate from the Boltzmann distribution indicating a degree of local non-equilibrium. These results will be useful to parameterize compact models used for microsystem device-level simulations and provide insight into mesoscale simulations of near-contact damping.
Mirrored continuum and molecular scale simulations of the ignition of gamma phase RDX
NASA Astrophysics Data System (ADS)
Stewart, D. Scott; Chaudhuri, Santanu; Joshi, Kaushik; Lee, Kiabek
2015-06-01
We consider the ignition of a high-pressure gamma phase of an explosive crystal of RDX which forms during overdriven shock initiation. Molecular dynamics (MD), with first-principles based or reactive force field based molecular potentials, provides a description of the chemistry as an extremely complex reaction network. The results of the molecular simulation are analyzed by sorting molecular product fragments into high and low molecular weight groups, to represent identifiable components that can be interpreted by a continuum model. A continuum model based on a Gibbs formulation, with a single temperature and stress state for the mixture, is used to represent the same RDX material and its chemistry. Each component in the continuum model has a corresponding Gibbs continuum potential, which is in turn inferred from equation of state libraries informed by molecular MD, such as CHEETAH, or directly simulated by Monte Carlo MD simulations. Information about transport, kinetic rates and diffusion is derived from the MD simulation, and the growth of a reactive hot spot in the RDX is studied with both simulations, which mirror each other's results to provide an essential continuum/atomistic link. Supported by N000014-12-1-0555, subaward-36561937 (ONR).
Simulating the x-ray image contrast to setup techniques with desired flaw detectability
NASA Astrophysics Data System (ADS)
Koshti, Ajay M.
2015-04-01
The paper provides simulation data of previous work by the author in developing a model for estimating detectability of crack-like flaws in radiography. The methodology is developed to help in implementation of NASA Special x-ray radiography qualification, but is generically applicable to radiography. The paper describes a method for characterizing the detector resolution. Applicability of ASTM E 2737 resolution requirements to the model are also discussed. The paper describes a model for simulating the detector resolution. A computer calculator application, discussed here, also performs predicted contrast and signal-to-noise ratio calculations. Results of various simulation runs in calculating x-ray flaw size parameter and image contrast for varying input parameters such as crack depth, crack width, part thickness, x-ray angle, part-to-detector distance, part-to-source distance, source sizes, and detector sensitivity and resolution are given as 3D surfaces. These results demonstrate effect of the input parameters on the flaw size parameter and the simulated image contrast of the crack. These simulations demonstrate utility of the flaw size parameter model in setting up x-ray techniques that provide desired flaw detectability in radiography. The method is applicable to film radiography, computed radiography, and digital radiography.
Prototyping and Simulation of Robot Group Intelligence using Kohonen Networks.
Wang, Zhijun; Mirdamadi, Reza; Wang, Qing
2016-01-01
Intelligent agents such as robots can form ad hoc networks and replace human beings in many dangerous scenarios such as a complicated disaster relief site. This project prototypes and builds a computer simulator to simulate robot kinetics, unsupervised learning using Kohonen networks, as well as group intelligence when an ad hoc network is formed. Each robot is modeled using an object with a simple set of attributes and methods that define its internal states and possible actions it may take under certain circumstances. As a result, simple, reliable, and affordable robots can be deployed to form the network. The simulator simulates a group of robots as an unsupervised learning unit and tests the learning results under scenarios with different complexities. The simulation results show that a group of robots could demonstrate highly collaborative behavior on a complex terrain. This study could potentially provide a software simulation platform for testing individual and group capability of robots before the design and manufacturing process. Therefore, results of the project have the potential to reduce the cost and improve the efficiency of robot design and building.
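A minimal Kohonen self-organizing map training loop is sketched below purely to illustrate the unsupervised-learning component the abstract refers to; the input data, map size, and decay schedules are arbitrary assumptions, and this is not the project's robot simulator.

```python
# Minimal Kohonen self-organizing map (SOM) training loop. Inputs stand in for
# 2-D sensor observations; map size and schedules are arbitrary examples.
import numpy as np

rng = np.random.default_rng(42)
data = rng.uniform(0.0, 1.0, size=(1000, 2))       # e.g., 2-D sensor readings
rows, cols = 8, 8
weights = rng.uniform(0.0, 1.0, size=(rows, cols, 2))
grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)

n_iter, lr0, sigma0 = 5000, 0.5, 3.0
for t in range(n_iter):
    x = data[rng.integers(len(data))]
    # best-matching unit (BMU): node whose weight vector is closest to x
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # learning rate and neighborhood radius decay over time
    frac = t / n_iter
    lr = lr0 * (1.0 - frac)
    sigma = sigma0 * np.exp(-3.0 * frac)
    # Gaussian neighborhood pulls nearby nodes toward the input
    grid_dist2 = np.sum((grid - np.array(bmu)) ** 2, axis=-1)
    h = np.exp(-grid_dist2 / (2.0 * sigma**2))
    weights += lr * h[..., None] * (x - weights)

print("trained SOM weight range:", weights.min(), weights.max())
```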
Exploring the content and quality of episodic future simulations in semantic dementia.
Irish, Muireann; Addis, Donna Rose; Hodges, John R; Piguet, Olivier
2012-12-01
Semantic dementia (SD) is a progressive neurodegenerative disorder characterised by the amodal loss of semantic knowledge in the context of relatively preserved recent episodic memory. Recent studies have demonstrated that despite relatively intact episodic memory the capacity for future simulation in SD is profoundly impaired, resulting in an asymmetric profile where past retrieval is significantly better than future simulation (referred to as a past>future effect). Here, we sought to identify the origins of this asymmetric profile by conducting a fine-grained analysis of the contextual details provided during past retrieval and future simulation in SD. Participants with SD (n=14), Alzheimer's disease (n=11), and healthy controls (n=14) had previously completed an experimental past-future interview in which they generated three past events from the previous year, and three future events in the next year, and provided subjective qualitative ratings of vividness, emotional valence, emotional intensity, task difficulty, and personal significance for each event described. Our results confirmed the striking impairment for future simulation in SD, despite a relative preservation of past episodic retrieval. Examination of the contextual details provided for past memories and future simulations revealed significant impairments irrespective of contextual detail type for future simulations in SD, and demonstrated that the future thinking deficit in this cohort was driven by a marked decline in the provision of internal (episodic) event details. In contrast with this past>future effect for internal event details, SD patients displayed a future>past effect for external (non-episodic) event details. Analyses of the qualitative ratings provided for past and future events indicated that SD patients' phenomenological experience did not differ between temporal conditions. Our findings underscore the fact that successful extraction of episodic elements from the past is not sufficient for the generation of novel future simulations in SD. The notable disconnect between objective task performance and patients' subjective experience during future simulation likely reflects the tendency of SD patients to recast entire past events into the future condition. Accordingly, the familiarity of the recapitulated details results in similar ratings of vividness and emotionality across temporal conditions, despite marked differences in the richness of contextual details as the patient moves from the past to the future. Copyright © 2012 Elsevier Ltd. All rights reserved.
Simulation of minimally invasive vascular interventions for training purposes.
Alderliesten, Tanja; Konings, Maurits K; Niessen, Wiro J
2004-01-01
To master the skills required to perform minimally invasive vascular interventions, proper training is essential. A computer simulation environment has been developed to provide such training. The simulation is based on an algorithm specifically developed to simulate the motion of a guide wire--the main instrument used during these interventions--in the human vasculature. In this paper, the design and model of the computer simulation environment is described and first results obtained with phantom and patient data are presented. To simulate minimally invasive vascular interventions, a discrete representation of a guide wire is used which allows modeling of guide wires with different physical properties. An algorithm for simulating the propagation of a guide wire within a vascular system, on the basis of the principle of minimization of energy, has been developed. Both longitudinal translation and rotation are incorporated as possibilities for manipulating the guide wire. The simulation is based on quasi-static mechanics. Two types of energy are introduced: internal energy related to the bending of the guide wire, and external energy resulting from the elastic deformation of the vessel wall. A series of experiments were performed on phantom and patient data. Simulation results are qualitatively compared with 3D rotational angiography data. The results indicate plausible behavior of the simulation.
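A toy sketch of the energy-minimization idea follows: the wire is a chain of points whose total energy combines a discrete bending term, a near-inextensibility term, and a quadratic penalty for leaving a straight cylindrical "vessel". The energy terms, stiffness values, and geometry are illustrative assumptions, and the relaxation uses a general-purpose optimizer rather than the authors' quasi-static propagation algorithm.

```python
# Toy sketch of the energy-minimization idea only (not the authors' algorithm):
# a guide wire is a chain of points; total energy is a discrete bending term,
# a near-inextensibility term, and a quadratic penalty for leaving a straight
# cylindrical "vessel" of radius r. All constants are illustrative.
import numpy as np
from scipy.optimize import minimize

n, r, seg = 20, 2.0, 1.0                 # wire nodes, vessel radius [mm], segment length [mm]
k_bend, k_wall, k_stretch = 1.0, 50.0, 100.0

def energy(flat):
    p = flat.reshape(n, 3)
    e = 0.0
    v = np.diff(p, axis=0)
    # bending: penalize the angle between consecutive segments
    cosang = np.sum(v[:-1] * v[1:], axis=1) / (
        np.linalg.norm(v[:-1], axis=1) * np.linalg.norm(v[1:], axis=1) + 1e-12)
    e += k_bend * np.sum(1.0 - cosang)
    # near-inextensibility: keep segment lengths close to `seg`
    e += k_stretch * np.sum((np.linalg.norm(v, axis=1) - seg) ** 2)
    # vessel wall: quadratic penalty outside a cylinder along the x-axis
    radial = np.sqrt(p[:, 1] ** 2 + p[:, 2] ** 2)
    e += k_wall * np.sum(np.maximum(radial - r, 0.0) ** 2)
    return e

# initial guess: straight wire pushed partly outside the vessel
p0 = np.column_stack([np.arange(n) * seg, np.linspace(0.0, 4.0, n), np.zeros(n)])
res = minimize(energy, p0.ravel(), method="L-BFGS-B")
wire = res.x.reshape(n, 3)
print("max radial excursion after relaxation:",
      np.sqrt(wire[:, 1]**2 + wire[:, 2]**2).max())
```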
NASA Technical Reports Server (NTRS)
Knox, James Clinton
2016-01-01
The 1-D axially dispersed plug flow model is a mathematical model widely used for the simulation of adsorption processes. Lumped mass transfer coefficients such as the Glueckauf linear driving force (LDF) term and the axial dispersion coefficient are generally obtained by fitting simulation results to the experimental breakthrough test data. An approach is introduced where these parameters, along with the only free parameter in the energy balance equations, are individually fit to specific test data that isolates the appropriate physics. It is shown that with this approach this model provides excellent simulation results for the CO2 on zeolite 5A sorbent/sorbate system; however, for the H2O on zeolite 5A system, non-physical deviations from constant pattern behavior occur when fitting dispersive experimental results with a large axial dispersion coefficient. A method has also been developed that determines a priori what values of the LDF and axial dispersion terms will result in non-physical simulation results for a specific sorbent/sorbate system when using the one-dimensional axially dispersed plug flow model. A relationship between the steepness of the adsorption equilibrium isotherm as indicated by the distribution factor, the magnitude of the axial dispersion and mass transfer coefficient, and the resulting non-physical behavior is derived. This relationship is intended to provide a guide for avoiding non-physical behavior by limiting the magnitude of the axial dispersion term on the basis of the mass transfer coefficient and distribution factor.
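For reference, a common textbook form of the governing equations (generic notation; the symbols are not necessarily those used in the work above):

```latex
% A common textbook form of the 1-D axially dispersed plug flow model with a
% Glueckauf linear driving force (LDF) rate law (generic notation):
\[
  \frac{\partial c}{\partial t}
    = D_L \frac{\partial^2 c}{\partial z^2}
    - v \frac{\partial c}{\partial z}
    - \frac{1-\varepsilon}{\varepsilon}\,\rho_p \frac{\partial \bar{q}}{\partial t},
  \qquad
  \frac{\partial \bar{q}}{\partial t}
    = k_{\mathrm{LDF}}\bigl(q^{*}(c,T) - \bar{q}\bigr),
\]
% where c is the fluid-phase concentration, \bar{q} the particle-averaged
% loading, q^{*} the equilibrium loading from the isotherm, D_L the axial
% dispersion coefficient, v the interstitial velocity, \varepsilon the bed void
% fraction, and \rho_p the particle density.
```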
Cabaraban, Maria Theresa I; Kroll, Charles N; Hirabayashi, Satoshi; Nowak, David J
2013-05-01
A distributed adaptation of i-Tree Eco was used to simulate dry deposition in an urban area. This investigation focused on the effects of varying temperature, LAI, and NO2 concentration inputs on estimated NO2 dry deposition to trees in Baltimore, MD. A coupled modeling system is described, wherein WRF provided temperature and LAI fields, and CMAQ provided NO2 concentrations. A base case simulation was conducted using built-in distributed i-Tree Eco tools, and simulations using different inputs were compared against this base case. Differences in land cover classification and tree cover between the distributed i-Tree Eco and WRF resulted in changes in estimated LAI, which in turn resulted in variations in simulated NO2 dry deposition. Estimated NO2 removal decreased when CMAQ-derived concentration was applied to the distributed i-Tree Eco simulation. Discrepancies in temperature inputs did little to affect estimates of NO2 removal by dry deposition to trees in Baltimore. Copyright © 2013 Elsevier Ltd. All rights reserved.
Implicit integration methods for dislocation dynamics
Gardner, D. J.; Woodward, C. S.; Reynolds, D. R.; ...
2015-01-20
In dislocation dynamics simulations, strain hardening simulations require integrating stiff systems of ordinary differential equations in time with expensive force calculations, discontinuous topological events, and rapidly changing problem size. Current solvers in use often result in small time steps and long simulation times. Faster solvers may help dislocation dynamics simulations accumulate plastic strains at strain rates comparable to experimental observations. This paper investigates the viability of high order implicit time integrators and robust nonlinear solvers to reduce simulation run times while maintaining the accuracy of the computed solution. In particular, implicit Runge-Kutta time integrators are explored as a way of providing greater accuracy over a larger time step than is typically done with the standard second-order trapezoidal method. In addition, both accelerated fixed point and Newton's method are investigated to provide fast and effective solves for the nonlinear systems that must be resolved within each time step. Results show that integrators of third order are the most effective, while accelerated fixed point and Newton's method both improve solver performance over the standard fixed point method used for the solution of the nonlinear systems.
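The sketch below illustrates only the "implicit stage plus nonlinear solve" structure discussed above, using backward Euler (the simplest implicit Runge-Kutta scheme) with Newton's method on a stiff linear test system; the paper's higher-order implicit Runge-Kutta methods and accelerated fixed-point solvers applied to dislocation dynamics are not reproduced here.

```python
# Minimal illustration of an implicit time step solved with Newton's method:
# backward Euler on a stiff linear test system y' = A y. This only shows the
# "implicit stage + nonlinear solve" structure, not the paper's integrators.
import numpy as np

A = np.array([[-1000.0, 999.0],
              [0.0, -0.5]])                  # stiff linear test system

def f(y):
    return A @ y

def jac(y):
    return A

def backward_euler_step(y, dt, tol=1e-10, max_newton=20):
    """Solve y_new = y + dt * f(y_new) with Newton's method."""
    y_new = y.copy()                         # initial guess: previous state
    for _ in range(max_newton):
        residual = y_new - y - dt * f(y_new)
        if np.linalg.norm(residual) < tol:
            break
        J = np.eye(len(y)) - dt * jac(y_new)
        y_new = y_new - np.linalg.solve(J, residual)
    return y_new

y, dt = np.array([1.0, 1.0]), 0.05           # step far larger than the explicit limit (~2e-3)
for _ in range(100):
    y = backward_euler_step(y, dt)
print("state after t = 5:", y)
```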
NASA Astrophysics Data System (ADS)
Zhu, Yawen; Cui, Xiaohong; Wang, Qianqian; Tong, Qiujie; Cui, Xutai; Li, Chenyu; Zhang, Le; Peng, Zhong
2016-11-01
The hardware-in-the-loop simulation system, which provides precise, controllable, and repeatable test conditions, is an important part of the development of semi-active laser (SAL) guided weapons. In this paper, laser energy chain characteristics were studied, which provides a theoretical foundation for SAL guidance technology and the hardware-in-the-loop simulation system. Firstly, a simplified equation was proposed to adjust the radar equation according to the principles of the hardware-in-the-loop simulation system. Secondly, a theoretical model and calculation method were given for the energy chain characteristics based on the hardware-in-the-loop simulation system. We then studied the reflection characteristics of the target and the dependence on the distance between the missile and the target, together with major factors such as weather. Finally, the accuracy of the model was verified experimentally, as the measured values generally follow the theoretical results from the model. The experimental results also revealed that the ratio of attenuation of the laser energy exhibited a non-linear change with pulse number, in accord with actual conditions.
WFIRST: Data/Instrument Simulation Support at IPAC
NASA Astrophysics Data System (ADS)
Laine, Seppo; Akeson, Rachel; Armus, Lee; Bennett, Lee; Colbert, James; Helou, George; Kirkpatrick, J. Davy; Meshkat, Tiffany; Paladini, Roberta; Ramirez, Solange; Wang, Yun; Xie, Joan; Yan, Lin
2018-01-01
As part of WFIRST Science Center preparations, the IPAC Science Operations Center (ISOC) maintains a repository of 1) WFIRST data and instrument simulations, 2) tools to facilitate scientific performance and feasibility studies using the WFIRST, and 3) parameters summarizing the current design and predicted performance of the WFIRST telescope and instruments. The simulation repository provides access for the science community to simulation code, tools, and resulting analyses. Examples of simulation code with ISOC-built web-based interfaces include EXOSIMS (for estimating exoplanet yields in CGI surveys) and the Galaxy Survey Exposure Time Calculator. In the future the repository will provide an interface for users to run custom simulations of a wide range of coronagraph instrument (CGI) observations and sophisticated tools for designing microlensing experiments. We encourage those who are generating simulations or writing tools for exoplanet observations with WFIRST to contact the ISOC team so we can work with you to bring these to the attention of the broader astronomical community as we prepare for the exciting science that will be enabled by WFIRST.
Simulation of DKIST solar adaptive optics system
NASA Astrophysics Data System (ADS)
Marino, Jose; Carlisle, Elizabeth; Schmidt, Dirk
2016-07-01
Solar adaptive optics (AO) simulations are a valuable tool to guide the design and optimization process of current and future solar AO and multi-conjugate AO (MCAO) systems. Solar AO and MCAO systems rely on extended object cross-correlating Shack-Hartmann wavefront sensors to measure the wavefront. Accurate solar AO simulations require computationally intensive operations, which have until recently presented a prohibitive computational cost. We present an update on the status of a solar AO and MCAO simulation tool being developed at the National Solar Observatory. The simulation tool is a multi-threaded application written in the C++ language that takes advantage of current large multi-core CPU computer systems and fast ethernet connections to provide accurate full simulation of solar AO and MCAO systems. It interfaces with KAOS, state-of-the-art solar AO control software developed by the Kiepenheuer-Institut fuer Sonnenphysik, which provides reliable AO control. We report on the latest results produced by the solar AO simulation tool.
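The measurement step of an extended-object cross-correlating Shack-Hartmann sensor can be illustrated with the small sketch below, in which a known shift between a reference subaperture image and a displaced copy is recovered from the peak of an FFT-based cross-correlation. The random scene stands in for solar granulation; this is not code from the NSO simulation tool or KAOS.

```python
# Sketch of the extended-object cross-correlation measurement used by solar
# Shack-Hartmann sensors (illustrative only). A reference subaperture image is
# shifted by a known number of pixels and the shift is recovered from the peak
# of the FFT-based circular cross-correlation.
import numpy as np

rng = np.random.default_rng(7)
n = 64
ref = rng.normal(size=(n, n))            # random scene standing in for granulation
true_shift = (3, -2)                     # (row, column) shift to recover
img = np.roll(ref, true_shift, axis=(0, 1))

# circular cross-correlation via the FFT; its peak sits at the applied shift
xcorr = np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))))
peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
estimate = tuple(((p + n // 2) % n) - n // 2 for p in peak)   # wrap into [-n/2, n/2)

print("true shift:", true_shift, "estimated shift:", estimate)
```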
Medication Waste Reduction in Pediatric Pharmacy Batch Processes
Veltri, Michael A.; Hamrock, Eric; Mollenkopf, Nicole L.; Holt, Kristen; Levin, Scott
2014-01-01
OBJECTIVES: To inform pediatric cart-fill batch scheduling for reductions in pharmaceutical waste using a case study and simulation analysis. METHODS: A pre and post intervention and simulation analysis was conducted during 3 months at a 205-bed children's center. An algorithm was developed to detect wasted medication based on time-stamped computerized provider order entry information. The algorithm was used to quantify pharmaceutical waste and associated costs for both preintervention (1 batch per day) and postintervention (3 batches per day) schedules. Further, simulation was used to systematically test 108 batch schedules outlining general characteristics that have an impact on the likelihood for waste. RESULTS: Switching from a 1-batch-per-day to a 3-batch-per-day schedule resulted in a 31.3% decrease in pharmaceutical waste (28.7% to 19.7%) and annual cost savings of $183,380. Simulation results demonstrate how increasing batch frequency facilitates a more just-in-time process that reduces waste. The most substantial gains are realized by shifting from a schedule of 1 batch per day to at least 2 batches per day. The simulation exhibits how waste reduction is also achievable by avoiding batch preparation during daily time periods where medication administration or medication discontinuations are frequent. Last, the simulation was used to show how reducing batch preparation time per batch provides some, albeit minimal, opportunity to decrease waste. CONCLUSIONS: The case study and simulation analysis demonstrate characteristics of batch scheduling that may support pediatric pharmacy managers in redesign toward minimizing pharmaceutical waste. PMID:25024671
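The batch-frequency effect can be illustrated with a toy simulation such as the one below, in which each order receives a dose every 6 hours, orders are discontinued at random times, and any dose prepared in an earlier batch but scheduled after discontinuation counts as waste. The dosing interval, discontinuation model, and batch schedules are illustrative assumptions, not the hospital's data or algorithm.

```python
# Toy simulation of the batch-frequency effect (illustrative assumptions only):
# each order is for one dose every 6 hours, orders stop at a random time, and a
# dose already prepared in an earlier batch but scheduled after the stop time
# is counted as waste.
import numpy as np

rng = np.random.default_rng(3)

def simulate_waste(batches_per_day, n_orders=5000, horizon_h=24, dose_interval_h=6):
    batch_times = np.arange(0, horizon_h, horizon_h / batches_per_day)
    dose_times = np.arange(0, horizon_h, dose_interval_h)
    wasted = prepared = 0
    for _ in range(n_orders):
        stop = rng.uniform(0, 2 * horizon_h)              # roughly half stop within the day
        for t in dose_times:
            batch = batch_times[batch_times <= t].max()   # batch in which dose t is prepared
            if batch < stop:                              # dose actually gets prepared
                prepared += 1
                if t > stop:                              # ...but the order stopped before use
                    wasted += 1
    return 100.0 * wasted / prepared

for n in (1, 2, 3, 6):
    print(f"{n} batch(es)/day -> {simulate_waste(n):.1f}% of prepared doses wasted")
```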
NASA Astrophysics Data System (ADS)
da Silva, Felipe das Neves Roque; Alves, José Luis Drummond; Cataldi, Marcio
2018-03-01
This paper aims to validate inflow simulations for the present-day climate at the Água Vermelha Hydroelectric Plant (AVHP), located on the Grande River Basin, based on the Soil Moisture Accounting Procedure (SMAP) hydrological model. In order to provide rainfall data to the SMAP model, the RegCM regional climate model was also used, driven by boundary conditions from the MIROC model. Initially, the present-day climate simulation performed by the RegCM model was analyzed. It was found that, in terms of rainfall, the model was able to simulate the main patterns observed over South America. A bias correction technique was also used, and it was essential to reduce errors in the rainfall simulation. Comparison between rainfall simulations from RegCM and MIROC showed improvements when the dynamical downscaling was performed. Then SMAP, a rainfall-runoff hydrological model, was used to simulate inflows at the Água Vermelha Hydroelectric Plant. After calibration with observed rainfall, SMAP simulations were evaluated in two periods different from the one used in calibration. During calibration, SMAP captured the inflow variability observed at AVHP. During the validation periods, the hydrological model obtained better results and statistics with observed rainfall. In spite of some discrepancies, the use of simulated rainfall without bias correction still captured the interannual flow variability; however, removing the bias in the rainfall simulated by RegCM brought significant improvements to the natural inflow simulation performed by SMAP. Not only did the curve of simulated inflow become more similar to the observed inflow, but the statistics also improved. Improvements were also noticed in the inflow simulation when the rainfall was provided by the regional climate model rather than the global model. In general, the results obtained so far show that there was added value in the rainfall when the regional climate model was compared to the global climate model, and that data from regional models must be bias-corrected in order to improve their results.
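Since the abstract does not detail the bias-correction technique, the sketch below shows one common option, empirical quantile mapping, applied to synthetic rainfall series; it is not necessarily the procedure used with RegCM and SMAP.

```python
# One common rainfall bias-correction technique, empirical quantile mapping,
# shown as a generic sketch on synthetic series (not necessarily the method
# used in the study above).
import numpy as np

rng = np.random.default_rng(11)
obs_hist = rng.gamma(shape=2.0, scale=4.0, size=3000)     # observed daily rainfall [mm]
sim_hist = rng.gamma(shape=2.0, scale=6.0, size=3000)     # biased model rainfall (too wet)
sim_new = rng.gamma(shape=2.0, scale=6.0, size=365)       # model rainfall to be corrected

def quantile_map(x, sim_ref, obs_ref):
    """Map each value through the simulated CDF into the observed distribution."""
    quantiles = np.linspace(0.0, 1.0, 501)
    sim_q = np.quantile(sim_ref, quantiles)
    obs_q = np.quantile(obs_ref, quantiles)
    return np.interp(x, sim_q, obs_q)

corrected = quantile_map(sim_new, sim_hist, obs_hist)
print(f"mean rainfall  obs: {obs_hist.mean():.1f}  raw model: {sim_new.mean():.1f}  "
      f"corrected: {corrected.mean():.1f} mm/day")
```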
2009-09-01
...simulation software results and similar results produced from the thesis work conducted by Ozdemir (2009). This study directly benefits decision makers interested in identifying and benefiting from a cost-effective, readily available aggregated learning tool, with the potential to provide tactical...
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Madden, Michael M.; Shelton, Robert; Jackson, A. A.; Castro, Manuel P.; Noble, Deleena M.; Zimmerman, Curtis J.; Shidner, Jeremy D.; White, Joseph P.; Dutta, Doumyo;
2015-01-01
This follow-on paper describes the principal methods of implementing, and documents the results of exercising, a set of six-degree-of-freedom rigid-body equations of motion and planetary geodetic, gravitation and atmospheric models for simple vehicles in a variety of endo- and exo-atmospheric conditions with various NASA, and one popular open-source, engineering simulation tools. This effort is intended to provide an additional means of verification of flight simulations. The models used in this comparison, as well as the resulting time-history trajectory data, are available electronically for persons and organizations wishing to compare their flight simulation implementations of the same models.
A microcontroller-based simulation of dural venous sinus injury for neurosurgical training.
Cleary, Daniel R; Siler, Dominic A; Whitney, Nathaniel; Selden, Nathan R
2018-05-01
OBJECTIVE Surgical simulation has the potential to supplement and enhance traditional resident training. However, the high cost of equipment and limited number of available scenarios have inhibited wider integration of simulation in neurosurgical education. In this study the authors provide initial validation of a novel, low-cost simulation platform that recreates the stress of surgery using a combination of hands-on, model-based, and computer elements. Trainee skill was quantified using multiple time and performance measures. The simulation was initially validated using trainees at the start of their intern year. METHODS The simulation recreates intraoperative superior sagittal sinus injury complicated by air embolism. The simulator model consists of 2 components: a reusable base and a disposable craniotomy pack. The simulator software is flexible and modular to allow adjustments in difficulty or the creation of entirely new clinical scenarios. The reusable simulator base incorporates a powerful microcomputer and multiple sensors and actuators to provide continuous feedback to the software controller, which in turn adjusts both the screen output and physical elements of the model. The disposable craniotomy pack incorporates 3D-printed sections of model skull and brain, as well as artificial dura that incorporates a model sagittal sinus. RESULTS Twelve participants at the 2015 Western Region Society of Neurological Surgeons postgraduate year 1 resident course ("boot camp") provided informed consent and enrolled in a study testing the prototype device. Each trainee was required to successfully create a bilateral parasagittal craniotomy, repair a dural sinus tear, and recognize and correct an air embolus. Participant stress was measured using a heart rate wrist monitor. After participation, each resident completed a 13-question categorical survey. CONCLUSIONS All trainee participants experienced tachycardia during the simulation, although the point in the simulation at which they experienced tachycardia varied. Survey results indicated that participants agreed the simulation was realistic, created stress, and was a useful tool in training neurosurgical residents. This simulator represents a novel, low-cost approach for hands-on training that effectively teaches and tests residents without risk of patient injury.
A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study.
Kaplan, David; Chen, Jianshen
2012-07-01
A two-step Bayesian propensity score approach is introduced that incorporates prior information in the propensity score equation and outcome equation without the problems associated with simultaneous Bayesian propensity score approaches. The corresponding variance estimators are also provided. The two-step Bayesian propensity score is provided for three methods of implementation: propensity score stratification, weighting, and optimal full matching. Three simulation studies and one case study are presented to elaborate the proposed two-step Bayesian propensity score approach. Results of the simulation studies reveal that greater precision in the propensity score equation yields better recovery of the frequentist-based treatment effect. A slight advantage is shown for the Bayesian approach in small samples. Results also reveal that greater precision around the wrong treatment effect can lead to seriously distorted results. However, greater precision around the correct treatment effect parameter yields quite good results, with slight improvement seen with greater precision in the propensity score equation. A comparison of coverage rates for the conventional frequentist approach and proposed Bayesian approach is also provided. The case study reveals that credible intervals are wider than frequentist confidence intervals when priors are non-informative.
A Method for Generating Reduced-Order Linear Models of Multidimensional Supersonic Inlets
NASA Technical Reports Server (NTRS)
Chicatelli, Amy; Hartley, Tom T.
1998-01-01
Simulation of high speed propulsion systems may be divided into two categories, nonlinear and linear. The nonlinear simulations are usually based on multidimensional computational fluid dynamics (CFD) methodologies and tend to provide high resolution results that show the fine detail of the flow. Consequently, these simulations are large, numerically intensive, and run much slower than real-time. The linear simulations are usually based on large lumping techniques that are linearized about a steady-state operating condition. These simplistic models often run at or near real-time but do not always capture the detailed dynamics of the plant. Under a grant sponsored by the NASA Lewis Research Center, Cleveland, Ohio, a new method has been developed that can be used to generate improved linear models for control design from multidimensional steady-state CFD results. This CFD-based linear modeling technique provides a small perturbation model that can be used for control applications and real-time simulations. It is important to note the utility of the modeling procedure; all that is needed to obtain a linear model of the propulsion system is the geometry and steady-state operating conditions from a multidimensional CFD simulation or experiment. This research represents a beginning step in establishing a bridge between the controls discipline and the CFD discipline so that the control engineer is able to effectively use multidimensional CFD results in control system design and analysis.
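The paper's procedure assembles the linear model directly from steady-state CFD fields; as a generic stand-in, the sketch below shows the underlying small-perturbation idea, numerically linearizing a toy nonlinear plant about a steady operating point to obtain state-space A and B matrices. The plant equations are invented for illustration only.

    import numpy as np

    def f(x, u):
        # illustrative nonlinear plant dx/dt = f(x, u); stands in for the inlet dynamics
        return np.array([-x[0] * abs(x[0]) + x[1],
                         -2.0 * x[1] + u[0]])

    def linearize(f, x0, u0, eps=1e-6):
        """Finite-difference Jacobians A = df/dx, B = df/du about a steady state."""
        nx, nu = len(x0), len(u0)
        A = np.zeros((nx, nx)); B = np.zeros((nx, nu))
        f0 = f(x0, u0)
        for i in range(nx):
            dx = np.zeros(nx); dx[i] = eps
            A[:, i] = (f(x0 + dx, u0) - f0) / eps
        for j in range(nu):
            du = np.zeros(nu); du[j] = eps
            B[:, j] = (f(x0, u0 + du) - f0) / eps
        return A, B

    # steady state satisfying f(x0, u0) = 0 for the toy plant
    u0 = np.array([1.0]); x0 = np.array([np.sqrt(0.5), 0.5])
    A, B = linearize(f, x0, u0)
    print("A =\n", A, "\nB =\n", B)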
Pereira, D; Gomes, P; Faria, S; Cruz-Correia, R; Coimbra, M
2016-08-01
Auscultation is currently both a powerful screening tool, providing a cheap and quick initial assessment of a patient's clinical condition, and a hard skill to master. The teaching of auscultation in universities is today reduced to an insufficient number of hours. Virtual patient simulators can potentially mitigate this problem by providing an interesting high-quality alternative to teaching with real patients or patient simulators. In this paper we evaluate the pedagogical impact of using a virtual patient simulation technology in a short workshop format for medical students, training them to detect cardiac pathologies. Results showed a significant improvement (+16%) in the differentiation between normal and pathological cases, although longer-duration formats seem to be needed to accurately identify specific pathologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Fei; Pratt, Annabelle; Bialek, Tom
2016-11-21
This paper reports on tools and methodologies developed to study the impact of adding rooftop photovoltaic (PV) systems, with and without the ability to provide voltage support, on the voltage profile of distribution feeders. Simulation results are provided from a study of a specific utility feeder. The simulation model of the utility distribution feeder was built in OpenDSS and verified by comparing the simulated voltages to field measurements. First, we set all PV systems to operate at unity power factor and analyzed the impact on feeder voltages. Then we conducted multiple simulations with voltage support activated for all the smart PV inverters. These included different constant power factor settings and volt/VAR controls.
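A typical volt/VAR control of the kind compared above maps measured voltage to a reactive-power command through a piecewise-linear droop with a deadband. The sketch below implements such a curve in Python; the breakpoints and reactive-power limit are illustrative assumptions, not the settings studied on this feeder or an OpenDSS model.

    import numpy as np

    def volt_var(v_pu, q_max_pu=0.44, v_points=(0.92, 0.98, 1.02, 1.08)):
        """Piecewise-linear volt/VAR droop: inject VARs at low voltage,
        absorb at high voltage, with a deadband in between (illustrative setpoints)."""
        v1, v2, v3, v4 = v_points
        v_curve = np.array([0.0, v1, v2, v3, v4, 2.0])
        q_curve = np.array([q_max_pu, q_max_pu, 0.0, 0.0, -q_max_pu, -q_max_pu])
        return np.interp(v_pu, v_curve, q_curve)

    for v in (0.90, 0.95, 1.00, 1.05, 1.10):
        print(f"V = {v:.2f} pu -> Q command = {volt_var(v):+.3f} pu")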
NASA Technical Reports Server (NTRS)
Mercer, Joey; Callantine, Todd; Martin, Lynne
2012-01-01
A recent human-in-the-loop simulation in the Airspace Operations Laboratory (AOL) at NASA's Ames Research Center investigated the robustness of Controller-Managed Spacing (CMS) operations. CMS refers to AOL-developed controller tools and procedures for enabling arrivals to conduct efficient Optimized Profile Descents with sustained high throughput. The simulation provided a rich data set for examining how a traffic management supervisor and terminal-area controller participants used the CMS tools and coordinated to respond to off-nominal events. This paper proposes quantitative measures for characterizing the participants' responses. Case studies of go-around events, replicated during the simulation, provide insights into the strategies employed and the role the CMS tools played in supporting them.
An application of sedimentation simulation in Tahe oilfield
NASA Astrophysics Data System (ADS)
Tingting, He; Lei, Zhao; Xin, Tan; Dongxu, He
2017-12-01
A braided river delta developed in the Triassic lower oil formation in block 9 of the Tahe oilfield, but its sedimentary evolution process is unclear. Using sedimentation simulation technology, the sedimentation process and distribution of the braided river delta are studied based on geological parameters including sequence stratigraphic division, the initial sedimentation environment, relative lake level change and accommodation change, source supply, and the sedimentary transport pattern. The simulation results show that the error between the simulated and actual strata thickness is small, and the single-well analysis results of the simulation are highly consistent with the actual analysis, which demonstrates that the model is reliable. The study area underwent a braided river delta retrogradational evolution process, which provides a favorable basis for fine reservoir description and prediction.
Electric Water Heater Modeling and Control Strategies for Demand Response
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diao, Ruisheng; Lu, Shuai; Elizondo, Marcelo A.
2012-07-22
Demand response (DR) has a great potential to provide balancing services at normal operating conditions and emergency support when a power system is subject to disturbances. Effective control strategies can significantly relieve the balancing burden of conventional generators and reduce investment in generation and transmission expansion. This paper is aimed at modeling electric water heaters (EWH) in households and testing their response to control strategies to implement DR. The open-loop response of EWH to a centralized signal is studied by adjusting temperature settings to provide regulation services, and two types of decentralized controllers are tested to provide frequency support following generator trips. EWH models are included in a simulation platform in DIgSILENT to perform electromechanical simulation, which contains 147 households in a distribution feeder. Simulation results show the dependence of EWH response on water heater usage. These results provide insights into the control strategies needed to achieve better performance for demand response implementation. Index Terms: centralized control, decentralized control, demand response, electrical water heater, smart grid
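A common starting point for EWH demand-response studies is a one-node tank model with a thermostat deadband, where a DR signal temporarily lowers the setpoint. The Python sketch below implements such a model with illustrative parameters; it is not the paper's EWH model or its DIgSILENT platform.

    import numpy as np

    # one-node electric water heater model with thermostat deadband (illustrative parameters)
    C = 4186.0 * 190.0          # thermal mass of ~190 L of water [J/K]
    UA = 3.0                    # tank loss coefficient [W/K]
    P_elem = 4500.0             # heating element power [W]
    T_amb, T_inlet = 20.0, 15.0 # ambient and inlet water temperature [C]
    T_set, deadband = 55.0, 2.0 # thermostat setpoint and deadband [C]

    def simulate(hours=24, dt=60.0, dr_offset=0.0, draw_lpm=None):
        """dr_offset lowers the setpoint (a simple DR signal); draw_lpm is hot-water draw [L/min]."""
        steps = int(hours * 3600 / dt)
        T, on = 50.0, False
        temps, power = [], []
        for k in range(steps):
            draw = draw_lpm[k] if draw_lpm is not None else 0.0
            mdot_c = draw / 60.0 * 4186.0                   # draw converted to W/K
            setpt = T_set - dr_offset
            if T < setpt - deadband: on = True              # thermostat with hysteresis
            if T > setpt + deadband: on = False
            dTdt = (P_elem * on - UA * (T - T_amb) - mdot_c * (T - T_inlet)) / C
            T += dTdt * dt
            temps.append(T); power.append(P_elem * on)
        return np.array(temps), np.array(power)

    T_base, P_base = simulate()
    T_dr, P_dr = simulate(dr_offset=5.0)
    print("mean power, baseline vs. DR event: %.0f W vs. %.0f W" % (P_base.mean(), P_dr.mean()))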
An epidemiological modeling and data integration framework.
Pfeifer, B; Wurz, M; Hanser, F; Seger, M; Netzer, M; Osl, M; Modre-Osprian, R; Schreier, G; Baumgartner, C
2010-01-01
In this work, a cellular automaton software package for simulating different infectious diseases, storing the simulation results in a data warehouse system, and analyzing the obtained results to generate prediction models as well as contingency plans is proposed. The Brisbane H3N2 flu virus, which spread during the 2009 winter season, was used for simulation in the federal state of Tyrol, Austria. The simulation-modeling framework consists of an underlying cellular automaton. The cellular automaton model is parameterized by known disease parameters, and geographical as well as demographic conditions are included for simulating the spreading. The data generated by simulation are stored in the back room of the data warehouse using the Talend Open Studio software package, and subsequent statistical and data mining tasks are performed using the tool termed Knowledge Discovery in Database Designer (KD3). The obtained simulation results were used for generating prediction models for all nine federal states of Austria. The proposed framework provides a powerful and easy-to-handle interface for parameterizing and simulating different infectious diseases in order to generate prediction models and improve contingency plans for future events.
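The core of such a framework can be illustrated with a minimal susceptible-infected-recovered cellular automaton on a grid, where infection spreads through the Moore neighborhood. The sketch below uses arbitrary transmission and recovery probabilities and ignores the geographic and demographic layers of the actual framework.

    import numpy as np

    rng = np.random.default_rng(1)
    S, I, R = 0, 1, 2
    n, steps = 100, 60
    beta, gamma = 0.3, 0.1          # per-neighbor infection and recovery probabilities (illustrative)

    grid = np.full((n, n), S)
    grid[n // 2, n // 2] = I         # seed one infected cell in the center

    def infected_neighbors(g):
        inf = (g == I).astype(int)
        # count infected cells in the Moore neighborhood via shifted copies
        return sum(np.roll(np.roll(inf, dx, 0), dy, 1)
                   for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0))

    history = []
    for t in range(steps):
        nb = infected_neighbors(grid)
        p_inf = 1.0 - (1.0 - beta) ** nb                    # at least one neighbor transmits
        new_inf = (grid == S) & (rng.random((n, n)) < p_inf)
        new_rec = (grid == I) & (rng.random((n, n)) < gamma)
        grid[new_inf] = I
        grid[new_rec] = R
        history.append([(grid == s).sum() for s in (S, I, R)])

    print("final S/I/R counts:", history[-1])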
Couto, Thomaz Bittencourt; Kerrey, Benjamin T; Taylor, Regina G; FitzGerald, Michael; Geis, Gary L
2015-04-01
Pediatric emergencies require effective teamwork. These skills are developed and demonstrated in actual emergencies and in simulated environments, including simulation centers (in center) and the real care environment (in situ). Our aims were to compare teamwork performance across these settings and to identify perceived educational strengths and weaknesses between simulated settings. We hypothesized that teamwork performance in actual emergencies and in situ simulations would be higher than for in-center simulations. A retrospective, video-based assessment of teamwork was performed in an academic, pediatric level 1 trauma center, using the Team Emergency Assessment Measure (TEAM) tool (range, 0-44) among emergency department providers (physicians, nurses, respiratory therapists, paramedics, patient care assistants, and pharmacists). A survey-based, cross-sectional assessment was conducted to determine provider perceptions regarding simulation training. One hundred thirty-two videos, 44 from each setting, were reviewed. Mean total TEAM scores were similar and high in all settings (31.2 actual, 31.1 in situ, and 32.3 in-center, P = 0.39). Of 236 providers, 154 (65%) responded to the survey. For teamwork training, in situ simulation was considered more realistic (59% vs. 10%) and more effective (45% vs. 15%) than in-center simulation. In a video-based study in an academic pediatric institution, ratings of teamwork were relatively high among actual resuscitations and 2 simulation settings, substantiating the influence of simulation-based training on instilling a culture of communication and teamwork. On the basis of survey results, providers favored the in situ setting for teamwork training and suggested an expansion of our existing in situ program.
Medical Simulation Practices 2010 Survey Results
NASA Technical Reports Server (NTRS)
McCrindle, Jeffrey J.
2011-01-01
Medical Simulation Centers are an essential component of our learning infrastructure to prepare doctors and nurses for their careers. Unlike the military and aerospace simulation industry, very little has been published regarding the best practices currently in use within medical simulation centers. This survey attempts to provide insight into the current simulation practices at medical schools, hospitals, university nursing programs and community college nursing programs. Students within the MBA program at Saint Joseph's University conducted a survey of medical simulation practices during the summer 2010 semester. A total of 115 institutions responded to the survey. The survey results discuss the overall effectiveness of current simulation centers as well as the tools and techniques used to conduct the simulation activity.
Eighteenth Space Simulation Conference: Space Mission Success Through Testing
NASA Technical Reports Server (NTRS)
Stecher, Joseph L., III (Compiler)
1994-01-01
The Institute of Environmental Sciences' Eighteenth Space Simulation Conference, 'Space Mission Success Through Testing' provided participants with a forum to acquire and exchange information on the state-of-the-art in space simulation, test technology, atomic oxygen, program/system testing, dynamics testing, contamination, and materials. The papers presented at this conference and the resulting discussions carried out the conference theme 'Space Mission Success Through Testing.'
An algorithm for the automatic synchronization of Omega receivers
NASA Technical Reports Server (NTRS)
Stonestreet, W. M.; Marzetta, T. L.
1977-01-01
The Omega navigation system and the requirement for receiver synchronization are discussed. A description of the synchronization algorithm is provided. The numerical simulation and its associated assumptions were examined and results of the simulation are presented. The suggested form of the synchronization algorithm and the suggested receiver design values were surveyed. A Fortran implementation of the synchronization algorithm used in the simulation was also included.
Heterojunction Solid-State Devices for Millimeter-Wave Sources.
1983-10-01
technology such as MBE and/or OK-CVD will be required. Our large-signal, numerical WATT device simulations are the first to predict from basic transport...results are due to an improved method for determining semiconductor material parameters. We use a theoretical Monte Carlo materials simulation ... simulations. These calculations have helped provide insight into velocity overshoot and ballistic transport phenomena. We find that ballistic or near
The Seventeenth Space Simulation Conference. Terrestrial Test for Space Success
NASA Technical Reports Server (NTRS)
Stecher, Joseph L., III (Compiler)
1992-01-01
The Institute of Environmental Sciences' Seventeenth Space Simulation Conference, 'Terrestrial Test for Space Success' provided participants with a forum to acquire and exchange information on the state of the art in space simulation, test technology, atomic oxygen, dynamics testing, contamination, and materials. The papers presented at this conference and the resulting discussions carried out the conference theme of 'terrestrial test for space success.'
20th Space Simulation Conference: The Changing Testing Paradigm
NASA Technical Reports Server (NTRS)
Stecher, Joseph L., III (Compiler)
1998-01-01
The Institute of Environmental Sciences' Twentieth Space Simulation Conference, "The Changing Testing Paradigm" provided participants with a forum to acquire and exchange information on the state-of-the-art in space simulation, test technology, atomic oxygen, program/system testing, dynamics testing, contamination, and materials. The papers presented at this conference and the resulting discussions carried out the conference theme "The Changing Testing Paradigm."
20th Space Simulation Conference: The Changing Testing Paradigm
NASA Technical Reports Server (NTRS)
Stecher, Joseph L., III (Compiler)
1999-01-01
The Institute of Environmental Sciences and Technology's Twentieth Space Simulation Conference, "The Changing Testing Paradigm" provided participants with a forum to acquire and exchange information on the state-of-the-art in space simulation, test technology, atomic oxygen, program/system testing, dynamics testing, contamination, and materials. The papers presented at this conference and the resulting discussions carried out the conference theme "The Changing Testing Paradigm."
NASA Astrophysics Data System (ADS)
Rankin, Drew J.; Jiang, Jin
2011-04-01
Verification and validation (V&V) of safety control system quality and performance is required prior to installing control system hardware within nuclear power plants (NPPs). Thus, the objective of the hardware-in-the-loop (HIL) platform introduced in this paper is to verify the functionality of these safety control systems. The developed platform provides a flexible simulated testing environment which enables synchronized coupling between the real and simulated world. Within the platform, National Instruments (NI) data acquisition (DAQ) hardware provides an interface between a programmable electronic system under test (SUT) and a simulation computer. Further, NI LabVIEW resides on this remote DAQ workstation for signal conversion and routing between Ethernet and standard industrial signals as well as for user interface. The platform is applied to the testing of a simplified implementation of Canadian Deuterium Uranium (CANDU) shutdown system no. 1 (SDS1) which monitors only the steam generator level of the simulated NPP. CANDU NPP simulation is performed on a Darlington NPP desktop training simulator provided by Ontario Power Generation (OPG). Simplified SDS1 logic is implemented on an Invensys Tricon v9 programmable logic controller (PLC) to test the performance of both the safety controller and the implemented logic. Prior to HIL simulation, platform availability of over 95% is achieved for the configuration used during the V&V of the PLC. Comparison of HIL simulation results to benchmark simulations shows good operational performance of the PLC following a postulated initiating event (PIE).
Mars Tumbleweed Simulation Using Singular Perturbation Theory
NASA Technical Reports Server (NTRS)
Raiszadeh, Behzad; Calhoun, Phillip
2005-01-01
The Mars Tumbleweed is a new surface rover concept that utilizes Martian winds as the primary source of mobility. Several designs have been proposed for the Mars Tumbleweed, all using aerodynamic drag to generate force for traveling about the surface. The Mars Tumbleweed, in its deployed configuration, must be large and lightweight to provide the ratio of drag force to rolling resistance necessary to initiate motion from the Martian surface. This paper discusses the dynamic simulation details of a candidate Tumbleweed design. The dynamic simulation model must properly evaluate and characterize the motion of the tumbleweed rover to support proper selection of system design parameters. Several factors, such as model flexibility, simulation run times, and model accuracy needed to be considered in modeling assumptions. The simulation was required to address the flexibility of the rover and its interaction with the ground, and properly evaluate its mobility. Proper assumptions needed to be made such that the simulated dynamic motion is accurate and realistic while not overly burdened by long simulation run times. This paper also shows results that provided reasonable correlation between the simulation and a drop/roll test of a tumbleweed prototype.
Computer Simulation for Emergency Incident Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D L
2004-12-03
This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.
Simulation Test Of Descent Advisor
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Green, Steven M.
1991-01-01
Report describes piloted-simulation test of Descent Advisor (DA), subsystem of larger automation system being developed to assist human air-traffic controllers and pilots. Focuses on results of piloted simulation, in which airline crews executed controller-issued descent advisories along standard curved-path arrival routes. Crews able to achieve arrival-time precision of plus or minus 20 seconds at metering fix. Analysis of errors generated in turns resulted in further enhancements of algorithm to increase accuracies of its predicted trajectories. Evaluations by pilots indicate general support for DA concept and provide specific recommendations for improvement.
System Dynamics Modeling for Supply Chain Information Sharing
NASA Astrophysics Data System (ADS)
Feng, Yang
In this paper, we use system dynamics to model supply chain information sharing. First, we determine the model boundaries, establish a system dynamics model of the supply chain before information sharing, analyze the model's simulation results under different parameter changes, and suggest improvements. Then, we establish a system dynamics model of the supply chain with information sharing and compare and analyze the two models' simulation results to show the importance of information sharing in supply chain management. We hope that these simulations provide scientific support for enterprise decision-making.
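A stylized version of this comparison can be run with a two-echelon inventory model in which the distributor forecasts either from the retailer's orders (no sharing) or from end-customer demand (sharing), and order-variance amplification is compared. The policy, lead times, and parameters in the Python sketch below are illustrative assumptions, not the paper's system dynamics model, and the size of the effect depends on them.

    import numpy as np

    rng = np.random.default_rng(2)
    T = 200
    demand = 100 + rng.normal(0, 10, T)      # customer demand seen by the retailer

    def simulate(share_info):
        """Two-echelon chain with a simple order-up-to policy; with information sharing
        the distributor forecasts from end-customer demand instead of retailer orders."""
        lead = 2
        r_inv, d_inv = 200.0, 200.0
        r_forecast = d_forecast = 100.0
        r_orders, d_orders = [], []
        for t in range(T):
            # retailer: exponential-smoothing forecast, order-up-to target covers lead time
            r_forecast += 0.3 * (demand[t] - r_forecast)
            r_order = max(0.0, r_forecast * (lead + 1) - r_inv)
            r_inv += -demand[t] + r_order            # simplification: orders arrive immediately
            r_orders.append(r_order)
            # distributor: forecast from shared demand or from retailer orders
            signal = demand[t] if share_info else r_order
            d_forecast += 0.3 * (signal - d_forecast)
            d_order = max(0.0, d_forecast * (lead + 1) - d_inv)
            d_inv += -r_order + d_order
            d_orders.append(d_order)
        return np.var(r_orders) / np.var(demand), np.var(d_orders) / np.var(demand)

    print("order-variance ratios (retailer, distributor), no sharing:", simulate(False))
    print("order-variance ratios (retailer, distributor), sharing   :", simulate(True))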
NASA Technical Reports Server (NTRS)
Lightsey, W. D.
1990-01-01
A digital computer simulation is used to determine if the Extreme Ultraviolet Explorer (EUVE) reaction wheels can provide sufficient torque and momentum storage capability to meet the Space Infrared Telescope Facility (SIRTF) maneuver requirements. A brief description of the pointing control system (PCS) and the sensor and actuator dynamic models used in the simulation is presented. A model to represent a disturbance such as fluid sloshing is developed. Results developed with the simulation, and a discussion of these results, are presented.
Ventilation of Animal Shelters in Wildland Fire Scenarios
NASA Astrophysics Data System (ADS)
Bova, A. S.; Bohrer, G.; Dickinson, M. B.
2009-12-01
The effects of wildland fires on cavity-nesting birds and bats, as well as fossorial mammals and burrow-using reptiles, are of considerable interest to the fire management community. However, relatively little is known about the degree of protection afforded by various animal shelters in wildland fire events. We present results from our ongoing investigation, utilizing NIST’s Fire Dynamics Simulator (FDS) and experimental data, of the effectiveness of common shelter configurations in protecting animals from combustion products. We compare two sets of simulations with observed experimental results. In the first set, wind tunnel experiments on single-entry room ventilation by Larsen and Heiselberg (2008) were simulated in a large domain resolved into 10 cm cubic cells. The set of 24 simulations comprised all combinations of incident wind speeds of 1, 3, and 5 m/s; angles of attack of 0, 45, 90, and 180 degrees from the horizontal normal to the entrance; and temperature differences of 0 and 10 degrees C between the building interior and exterior. Simulation results were in good agreement with experimental data, thus providing a validation of FDS code for further ventilation experiments. In the second set, a cubic simulation domain of ~1 m on edge, resolved into 1 cm cubic cells, was set up to represent the experiments by Ar et al. (2004) of wind-induced ventilation of woodpecker cavities. As in the experiments, we simulated wind parallel and perpendicular to the cavity entrance with different mean forcing velocities, and monitored the rates of evacuation of a neutral-buoyancy tracer from the cavity. Simulated ventilation rates in many, though not all, cases fell within the range of experimental data. Reasons for these differences, which include vagueness in the experimental setup, will be discussed. Our simulations provide a tool to estimate the viability of an animal in a shelter as a function of the shelter geometry and the fire intensity. In addition to the above, we explore the role of turbulence and its effect on ventilation rates, especially in single-entrance shelters. The goal of this work is to provide engineering formulas to estimate the probable levels of harmful or irritating combustion products in animal shelters during wildland fires.
Tepper, Ronnie
2017-01-01
Background Workplaces today demand graduates who are prepared with field-specific knowledge, advanced social skills, problem-solving skills, and integration capabilities. Meeting these goals with didactic learning (DL) is becoming increasingly difficult. Enhanced training methods that would better prepare tomorrow’s graduates must be more engaging and game-like, such as feedback-based e-learning or simulation-based training, while saving time. Empirical evidence regarding the effectiveness of advanced learning methods is lacking. Objective quantitative research comparing advanced training methods with DL is sparse. Objectives This quantitative study assessed the effectiveness of a computerized interactive simulator coupled with an instructor who monitored students’ progress and provided immediate Web-based feedback. Methods A low-cost, globally accessible, telemedicine simulator, developed at the Technion-Israel Institute of Technology, Haifa, Israel, was used. A previous study in the field of interventional cardiology, evaluating the efficacy of the simulator for enhancing learning via knowledge exams, presented promising results, with average scores of 54% before training and 94% after training (n=20, P<.001). Two independent experiments involving obstetrics and gynecology (Ob-Gyn) physicians and senior ultrasound sonographers, with 32 subjects, were conducted using a new interactive concept of the WOZ (Wizard of OZ) simulator platform. The contribution of an instructor to learning outcomes was evaluated by comparing students’ knowledge before and after each interactive instructor-led session as well as after fully automated e-learning in the field of Ob-Gyn. Results from objective knowledge tests were analyzed using hypothesis testing and model fitting. Results A significant advantage (P=.01) was found in favor of the WOZ training approach. Content type and training audience were not significant. Conclusions This study evaluated the contribution of an integrated teaching environment using a computerized interactive simulator, with an instructor providing immediate Web-based feedback to trainees. Involvement of an instructor in the simulation-based training process provided better learning outcomes; varied training content and trainee populations did not affect the overall learning gains. PMID:28432039
Large eddy simulations of time-dependent and buoyancy-driven channel flows
NASA Technical Reports Server (NTRS)
Cabot, William H.
1993-01-01
The primary goal of this work has been to assess the performance of the dynamic SGS model in the large eddy simulation (LES) of channel flows in a variety of situations, viz., in temporal development of channel flow turned by a transverse pressure gradient and especially in buoyancy-driven turbulent flows such as Rayleigh-Benard and internally heated channel convection. For buoyancy-driven flows, there are additional buoyant terms that are possible in the base models, and one objective has been to determine if the dynamic SGS model results are sensitive to such terms. The ultimate goal is to determine the minimal base model needed in the dynamic SGS model to provide accurate results in flows with more complicated physical features. In addition, a program of direct numerical simulation (DNS) of fully compressible channel convection has been undertaken to determine stratification and compressibility effects. These simulations are intended to provide a comparative base for performing the LES of compressible (or highly stratified, pseudo-compressible) convection at high Reynolds number in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stansfield, S.; Shawver, D.; Sobel, A.
This paper presents a prototype virtual reality (VR) system for training medical first responders. The initial application is to battlefield medicine and focuses on the training of medical corpsmen and other front-line personnel who might be called upon to provide emergency triage on the battlefield. The system is built upon Sandia's multi-user, distributed VR platform and provides an interactive, immersive simulation capability. The user is represented by an Avatar and is able to manipulate his virtual instruments and carry out medical procedures. A dynamic casualty simulation provides realistic cues to the patient's condition (e.g. changing blood pressure and pulse) and responds to the actions of the trainee (e.g. a change in the color of a patient's skin may result from a check of the capillary refill rate). The current casualty simulation is of an injury resulting in a tension pneumothorax. This casualty model was developed by the University of Pennsylvania and integrated into the Sandia MediSim system.
NASA Technical Reports Server (NTRS)
Myers, Jerry G.; Young, M.; Goodenow, Debra A.; Keenan, A.; Walton, M.; Boley, L.
2015-01-01
Model and simulation (MS) credibility is defined as the quality to elicit belief or trust in MS results. NASA-STD-7009 [1] delineates eight components (Verification, Validation, Input Pedigree, Results Uncertainty, Results Robustness, Use History, MS Management, People Qualifications) that address quantifying model credibility, and provides guidance to the model developers, analysts, and end users for assessing the MS credibility. Of the eight characteristics, input pedigree, or the quality of the data used to develop model input parameters, governing functions, or initial conditions, can vary significantly. These data quality differences have varying consequences across the range of MS application. NASA-STD-7009 requires that the lowest input data quality be used to represent the entire set of input data when scoring the input pedigree credibility of the model. This requirement provides a conservative assessment of model inputs, and maximizes the communication of the potential level of risk of using model outputs. Unfortunately, in practice, this may result in overly pessimistic communication of the MS output, undermining the credibility of simulation predictions to decision makers. This presentation proposes an alternative assessment mechanism, utilizing results parameter robustness, also known as model input sensitivity, to improve the credibility scoring process for specific simulations.
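The contrast between the current rule and the proposed direction can be shown with a toy calculation: the conservative score takes the minimum pedigree sub-score across inputs, while a sensitivity-weighted average lets low-quality but low-influence inputs matter less. The inputs, sub-scores, and weighting rule below are hypothetical and do not reproduce NASA-STD-7009 scoring or the authors' proposed mechanism.

    # Illustrative comparison of input-pedigree aggregation rules (not NASA-STD-7009 itself).
    # Each input parameter gets a pedigree sub-score (0-4) and a result-sensitivity weight.
    pedigree = {"density": 4, "viscosity": 3, "boundary_flux": 1}        # hypothetical inputs
    sensitivity = {"density": 0.7, "viscosity": 0.25, "boundary_flux": 0.05}

    # conservative rule described in the abstract: the lowest-quality input sets the score
    min_score = min(pedigree.values())

    # hypothetical alternative: weight each sub-score by how strongly it drives the results
    wsum = sum(sensitivity.values())
    weighted_score = sum(pedigree[k] * sensitivity[k] for k in pedigree) / wsum

    print(f"minimum-based input pedigree score : {min_score}")
    print(f"sensitivity-weighted score (sketch): {weighted_score:.2f}")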
Refinement of Objective Motion Cueing Criteria Investigation Based on Three Flight Tasks
NASA Technical Reports Server (NTRS)
Zaal, Petrus M. T.; Schroeder, Jeffery A.; Chung, William W.
2017-01-01
The objective of this paper is to refine objective motion cueing criteria for commercial transport simulators based on pilots' performance in three flying tasks. Actuator hardware and software algorithms determine motion cues. Today, during a simulator qualification, engineers objectively evaluate only the hardware. Pilot inspectors subjectively assess the overall motion cueing system (i.e., hardware plus software); however, it is acknowledged that pinpointing any deficiencies that might arise to either hardware or software is challenging. ICAO 9625 has an Objective Motion Cueing Test (OMCT), which is now a required test in the FAA's part 60 regulations for new devices, evaluating the software and hardware together; however, it lacks accompanying fidelity criteria. Hosman has documented OMCT results for a statistical sample of eight simulators which is useful, but having validated criteria would be an improvement. In a previous experiment, we developed initial objective motion cueing criteria that this paper is trying to refine. Sinacori suggested simple criteria which are in reasonable agreement with much of the literature. These criteria often necessitate motion displacements greater than most training simulators can provide. While some of the previous work has used transport aircraft in their studies, the majority used fighter aircraft or helicopters. Those that used transport aircraft considered degraded flight characteristics. As a result, earlier criteria lean more towards being sufficient, rather than necessary, criteria for typical transport aircraft training applications. Considering the prevalence of 60-inch, six-legged hexapod training simulators, a relevant question is "what are the necessary criteria that can be used with the ICAO 9625 diagnostic?" This study adds to the literature as follows. First, it examines well-behaved transport aircraft characteristics, but in three challenging tasks. The tasks are equivalent to the ones used in our previous experiment, allowing us to directly compare the results and add to the previous data. Second, it uses the Vertical Motion Simulator (VMS), the world's largest vertical displacement simulator. This allows inclusion of relatively large motion conditions, much larger than a typical training simulator can provide. Six new motion configurations were used that explore the motion responses between the initial objective motion cueing boundaries found in a previous experiment and what current hexapod simulators typically provide. Finally, a sufficiently large pilot pool added statistical reliability to the results.
NASA Astrophysics Data System (ADS)
Scudder, J. D.; Karimabadi, H.; Daughton, W. S.
2013-12-01
Interpretations of 2D simulations of magnetic reconnection are greatly simplified by using the flux function, usually the out-of-plane component of the vector potential. This theoretical device is no longer available when simulations are analyzed in 3D. We illustrate the results of determining the local rates of flux slippage in simulations by a technique based on Maxwell's equations. The technique recovers the usual results obtained for the flux function in 2D simulations, but remains viable in 3D simulations where there is no flux function. The method has also been successfully tested for full PIC simulations where reconnection is geometrically forbidden. While such layers possess measurable flux slippages (diffusion), their level is not as strong as recorded in known 2D PIC reconnection sites using the same methodology. This approach will be used to explore the spatial incidence and strength of flux slippages across a 3D, asymmetric, strong guide field run discussed previously in the literature. Regions of diffusive behavior are illustrated where LHDI has been previously identified out on the separatrices, while much stronger flux slippages, typical of the X-regions of 2D simulations, are shown to occur elsewhere throughout the simulation. These results suggest that reconnection requires sufficiently vigorous flux slippage to be self sustaining, while non-zero flux slippage can and does occur without being at the reconnection site. A cross check of this approach is provided by the mixing ratio of tagged simulation particles of known spatial origin discussed by Daughton et al., 2013 (this meeting); they provide an integral measure of flux slippage up to the present point in the simulation. We will discuss the correlations between our Maxwell-based flux slippage rates and the inferred rates of change of this mixing ratio (as recorded in the local fluid frame).
Additional confirmation of the validity of laboratory simulation of cloud radiances
NASA Technical Reports Server (NTRS)
Davis, J. M.; Cox, S. K.
1986-01-01
The results of a laboratory experiment are presented that provide additional verification of the methodology adopted for simulation of the radiances reflected from fields of optically thick clouds using the Cloud Field Optical Simulator (CFOS) at Colorado State University. The comparison of these data with their theoretically derived counterparts indicates that the crucial mechanism of cloud-to-cloud radiance field interaction is accurately simulated in the CFOS experiments and adds confidence to the manner in which the optical depth is scaled.
Spacecraft Guidance, Navigation, and Control Visualization Tool
NASA Technical Reports Server (NTRS)
Mandic, Milan; Acikmese, Behcet; Blackmore, Lars
2011-01-01
G-View is a 3D visualization tool for supporting spacecraft guidance, navigation, and control (GN&C) simulations relevant to small-body exploration and sampling (see figure). The tool is developed in MATLAB using Virtual Reality Toolbox and provides users with the ability to visualize the behavior of their simulations, regardless of which programming language (or machine) is used to generate simulation results. The only requirement is that multi-body simulation data is generated and placed in the proper format before applying G-View.
Cognitive Modeling for Agent-Based Simulation of Child Maltreatment
NASA Astrophysics Data System (ADS)
Hu, Xiaolin; Puddy, Richard
This paper extends previous work to develop cognitive modeling for agent-based simulation of child maltreatment (CM). The developed model is inspired by parental efficacy, parenting stress, and the theory of planned behavior. It provides an explanatory, process-oriented model of CM and incorporates causal relationships and feedback loops among different factors in the social ecology in order to simulate the dynamics of CM. We describe the model and present simulation results to demonstrate the features of this model.
21st Space Simulation Conference: The Future of Space Simulation Testing in the 21st Century
NASA Technical Reports Server (NTRS)
Stecher, Joseph L., III (Compiler)
2000-01-01
The Institute of Environmental Sciences and Technology's Twenty-first Space Simulation Conference, "The Future of Space Testing in the 21st Century" provided participants with a forum to acquire and exchange information on the state-of-the-art in space simulation, test technology, atomic oxygen, programs/system testing, dynamics testing, contamination, and materials. The papers presented at this conference and the resulting discussions carried out the conference theme "The Future of Space Testing in the 21st Century."
Isele-Holder, Rolf E; Mitchell, Wayne; Ismail, Ahmed E
2012-11-07
For inhomogeneous systems with interfaces, the inclusion of long-range dispersion interactions is necessary to achieve consistency between molecular simulation calculations and experimental results. For accurate and efficient incorporation of these contributions, we have implemented a particle-particle particle-mesh Ewald solver for dispersion (r^-6) interactions into the LAMMPS molecular dynamics package. We demonstrate that the solver's O(N log N) scaling behavior allows its application to large-scale simulations. We carefully determine a set of parameters for the solver that provides accurate results and efficient computation. We perform a series of simulations with Lennard-Jones particles, SPC/E water, and hexane to show that with our choice of parameters the dependence of physical results on the chosen cutoff radius is removed. Physical results and computation time of these simulations are compared to results obtained using either a plain cutoff or a traditional Ewald sum for dispersion.
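For context on why the long-range dispersion term matters, the sketch below evaluates the standard analytic Lennard-Jones tail correction (Allen and Tildesley), which quantifies the dispersion energy a plain cutoff discards but is valid only for homogeneous fluids; it is exactly this homogeneity assumption that fails at interfaces and motivates a mesh-based solver. This is not the PPPM dispersion algorithm itself, and the density and cutoffs are illustrative.

    import numpy as np

    # Standard homogeneous Lennard-Jones tail correction, per particle, in reduced units:
    # shows how much dispersion energy a plain cutoff discards in the bulk.
    def lj_tail_per_particle(rho, rc, epsilon=1.0, sigma=1.0):
        sr3 = (sigma / rc) ** 3
        return (8.0 / 3.0) * np.pi * rho * epsilon * sigma**3 * (sr3**3 / 3.0 - sr3)

    rho = 0.8                       # reduced density typical of a Lennard-Jones liquid
    for rc in (2.5, 3.0, 4.0, 6.0):
        print(f"rc = {rc:.1f} sigma -> tail energy per particle = {lj_tail_per_particle(rho, rc):+.4f}")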
Numerical simulation of a 100-ton ANFO detonation
NASA Astrophysics Data System (ADS)
Weber, P. W.; Millage, K. K.; Crepeau, J. E.; Happ, H. J.; Gitterman, Y.; Needham, C. E.
2015-03-01
This work describes the results from a US government-owned hydrocode (SHAMRC, Second-Order Hydrodynamic Automatic Mesh Refinement Code) that simulated an explosive detonation experiment with 100,000 kg of Ammonium Nitrate-Fuel Oil (ANFO) and 2,080 kg of Composition B (CompB). The explosive surface charge was nearly hemispherical and detonated in desert terrain. Two-dimensional axisymmetric (2D) and three-dimensional (3D) simulations were conducted, with the 3D model providing a more accurate representation of the experimental setup geometry. Both 2D and 3D simulations yielded overpressure and impulse waveforms that agreed qualitatively with experiment, including the capture of the secondary shock observed in the experiment. The 2D simulation predicted the primary shock arrival time correctly but secondary shock arrival time was early. The 2D-predicted impulse waveforms agreed very well with the experiment, especially at later calculation times, and prediction of the early part of the impulse waveform (associated with the initial peak) was better quantitatively for 2D compared to 3D. The 3D simulation also predicted the primary shock arrival time correctly, and secondary shock arrival times in 3D were closer to the experiment than in the 2D results. The 3D-predicted impulse waveform had better quantitative agreement than 2D for the later part of the impulse waveform. The results of this numerical study show that SHAMRC may be used reliably to predict phenomena associated with the 100-ton detonation. The ultimate fidelity of the simulations was limited by both computer time and memory. The results obtained provide good accuracy and indicate that the code is well suited to predicting the outcomes of explosive detonations.
Validating clustering of molecular dynamics simulations using polymer models
2011-01-01
Background Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. Results We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. Conclusions We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our knowledge, our framework is the first to utilize model polymers to rigorously test the utility of clustering algorithms for studying biopolymers. PMID:22082218
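A minimal version of this workflow, using scikit-learn's SpectralClustering on a Gaussian affinity built from pairwise RMSD, is sketched below. The synthetic two-state "trajectory", the kernel bandwidth, and the omission of rotational fitting are simplifying assumptions; the paper's polymer models and alignment handling are not reproduced here.

    import numpy as np
    from sklearn.cluster import SpectralClustering

    rng = np.random.default_rng(3)

    # synthetic stand-in for MD frames: two meta-stable "states", each a noisy copy
    # of a reference 20-atom structure (real use would load aligned trajectory frames)
    ref_a = rng.normal(size=(20, 3))
    ref_b = ref_a + np.array([2.0, 0.0, 0.0])      # shifted reference = second state
    frames = np.array([(ref_a if i % 2 == 0 else ref_b) + 0.1 * rng.normal(size=(20, 3))
                       for i in range(100)])

    # pairwise RMSD matrix (no rotational fitting here, purely illustrative)
    diff = frames[:, None, :, :] - frames[None, :, :, :]
    rmsd = np.sqrt((diff ** 2).sum(axis=(2, 3)) / frames.shape[1])

    # Gaussian affinity from RMSD, then spectral clustering on the precomputed affinity
    affinity = np.exp(-rmsd ** 2 / (2.0 * np.median(rmsd) ** 2))
    labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                                random_state=0).fit_predict(affinity)
    print("frames assigned to each cluster:", np.bincount(labels))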
2017-01-01
Background Despite clear evidence that antibiotics do not cure viral infections, the problem of unnecessary prescribing of antibiotics in ambulatory care persists, and in some cases, prescribing patterns have increased. The overuse of antibiotics for treating viral infections has created numerous economic and clinical consequences including increased medical costs due to unnecessary hospitalizations, antibiotic resistance, disruption of gut bacteria, and obesity. Recent research has underscored the importance of collaborative patient-provider communication as a means to reduce the high rates of unnecessary prescriptions for antibiotics. However, most patients and providers do not feel prepared to engage in such challenging conversations. Objectives The aim of this pilot study was to assess the ability of a brief 15-min simulated role-play conversation with virtual humans to serve as a preliminary step to help health care providers and patients practice, and learn how to engage in effective conversations about antibiotics overuse. Methods A total of 69 participants (35 providers and 34 patients) completed the simulation once in one sitting. A pre-post repeated measures design was used to assess changes in patients’ and providers’ self-reported communication behaviors, activation, and preparedness, intention, and confidence to effectively communicate in the patient-provider encounter. Changes in patients’ knowledge and beliefs regarding antibiotic use were also evaluated. Results Patients experienced a short-term positive improvement in beliefs about appropriate antibiotic use for infection (F1,30=14.10, P=.001). Knowledge scores regarding the correct uses of antibiotics improved immediately postsimulation, but decreased at the 1-month follow-up (F1,30=31.16, P<.001). There was no change in patient activation and shared decision-making (SDM) scores in the total sample of patients (P>.10) Patients with lower levels of activation exhibited positive, short-term benefits in increased intent and confidence to discuss their needs and ask questions in the clinic visit, positive attitudes regarding participation in SDM with their provider, and accurate beliefs about the use of antibiotics (P<.10). The results also suggest small immediate gains in providers’ attitudes about SDM (mean change 0.20; F1,33= 8.03, P=.01). Conclusions This pilot study provided preliminary evidence on the efficacy of the use of simulated conversations with virtual humans as a tool to improve patient-provider communication (ie, through increasing patient confidence to actively participate in the visit and physician attitudes about SDM) for engaging in conversations about antibiotic use. Future research should explore if repeated opportunities to use the 15-min simulation as well as providing users with several different conversations to practice with would result in sustained improvements in antibiotics beliefs and knowledge and communication behaviors over time. The results of this pilot study offered several opportunities to improve on the simulation in order to bolster communication skills and knowledge retention. PMID:28428160
A Parameter Tuning Scheme of Sea-ice Model Based on Automatic Differentiation Technique
NASA Astrophysics Data System (ADS)
Kim, J. G.; Hovland, P. D.
2001-05-01
An automatic differentiation (AD) technique was used to illustrate a new approach to a parameter tuning scheme for an uncoupled sea-ice model. The atmospheric forcing field for 1992, obtained from NCEP data, was used as the forcing variables in the study. The simulation results were compared with the observed ice movement provided by the International Arctic Buoy Programme (IABP). All of the numerical experiments were based on a widely used dynamic and thermodynamic model for simulating the seasonal sea-ice change of the main Arctic Ocean. We selected five dynamic and thermodynamic parameters for the tuning process, in which the cost function, defined by the norm of the difference between observed and simulated ice drift locations, was minimized. The selected parameters are the air and ocean drag coefficients, the ice strength constant, the turning angle at the ice-air/ocean interface, and the bulk sensible heat transfer coefficient. The drag coefficients were the major parameters controlling sea-ice movement and extent. The results of the study show that more realistic simulations of ice thickness distribution were produced by tuning the simulated ice drift trajectories. In the tuning process, the L-BFGS-B minimization algorithm, a quasi-Newton method, was used. The derivative information required in the minimization iterations was provided by the AD-processed Fortran code. Compared with a conventional approach, the AD-generated derivative code provided fast and robust computations of derivative information.
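The structure of such a tuning loop can be sketched with SciPy's L-BFGS-B interface: a misfit cost between simulated and observed drift is minimized subject to parameter bounds, with the gradient supplied to the optimizer. In the sketch below a toy drift model stands in for the sea-ice model and finite differences stand in for the AD-generated derivative code; all model equations and bounds are illustrative assumptions.

    import numpy as np
    from scipy.optimize import minimize

    # toy stand-in for the ice-drift model: position driven by wind with an air-drag
    # coefficient c_a and a turning angle theta (purely illustrative, not the sea-ice model)
    t = np.linspace(0.0, 10.0, 50)
    wind = np.column_stack([np.cos(0.3 * t), np.sin(0.3 * t)])

    def trajectory(params):
        c_a, theta = params
        rot = np.array([[np.cos(theta), -np.sin(theta)],
                        [np.sin(theta),  np.cos(theta)]])
        return np.cumsum(c_a * wind @ rot.T, axis=0)          # simple forced drift

    true = np.array([0.02, 0.4])
    obs = trajectory(true) + 0.001 * np.random.default_rng(4).normal(size=(50, 2))

    def cost_and_grad(params):
        """Cost = squared misfit of drift positions; gradient by finite differences here,
        standing in for the adjoint/AD-generated derivative code used in the study."""
        r = trajectory(params) - obs
        c = 0.5 * np.sum(r ** 2)
        g = np.zeros(2)
        for i in range(2):
            dp = np.zeros(2); dp[i] = 1e-7
            g[i] = (0.5 * np.sum((trajectory(params + dp) - obs) ** 2) - c) / 1e-7
        return c, g

    res = minimize(cost_and_grad, x0=[0.05, 0.0], jac=True, method="L-BFGS-B",
                   bounds=[(1e-4, 0.1), (-np.pi / 2, np.pi / 2)])
    print("tuned parameters:", res.x, " true:", true)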
Ortiz, Roderick F.; Miller, Lisa D.
2009-01-01
Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Southern Delivery System (SDS) project is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various Environmental Impact Statements (EIS) alternatives and plans by Pueblo West to discharge treated wastewater into the reservoir. Wastewater plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). Additionally, the results of an Existing Conditions scenario (year 2006 demand conditions) were compared to the No Action scenario (projected demands in 2046) to assess changes in water quality over time. All scenario modeling used an external nutrient-decay model to simulate degradation and assimilation of nutrients along the riverine reach upstream from Pueblo Reservoir. Reservoir modeling was conducted using the U.S. Army Corps of Engineers CE-QUAL-W2 two-dimensional water-quality model. Lake hydrodynamics, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, algal biomass, and total iron were simulated. Two reservoir site locations were selected for comparison. Results of simulations at site 3B were characteristic of a riverine environment in the reservoir, whereas results at site 7B (near the dam) were characteristic of the main body of the reservoir. Simulation results for the epilimnion and hypolimnion at these two sites also were evaluated and compared. The simulation results in the hypolimnion at site 7B were indicative of the water quality leaving the reservoir. Comparisons of the different scenario results were conducted to assess if substantial differences were observed between selected scenarios. Each of the scenarios was simulated for three contiguous years representing a wet, average, and dry annual hydrologic cycle (water years 2000 through 2002). Additionally, each selected simulation scenario was evaluated for differences in direct and cumulative effects on a particular scenario. Direct effects are intended to isolate the future effects of the scenarios. 
Cumulative effects are intended to evaluate the effects of the scenarios in conjunction with all reasonably foreseeable future activities in the study area. Comparisons between the direct- and cumulative-effects analyses indicated that there were not large differences in the results between most of the simulation scenarios, and, as such, the focus of this report was on results for the direct-effects analysis. Additionally, the differences between simulation results generally were
A data-driven dynamics simulation framework for railway vehicles
NASA Astrophysics Data System (ADS)
Nie, Yinyu; Tang, Zhao; Liu, Fengjia; Chang, Jian; Zhang, Jianjun
2018-03-01
The finite element (FE) method is essential for simulating vehicle dynamics with fine details, especially for train crash simulations. However, factors such as the complexity of meshes and the distortion involved in a large deformation undermine its calculation efficiency. An alternative method, multi-body (MB) dynamics simulation, provides satisfying time efficiency but limited accuracy when highly nonlinear dynamic processes are involved. To maintain the advantages of both methods, this paper proposes a data-driven simulation framework for dynamics simulation of railway vehicles. This framework uses machine learning techniques to extract nonlinear features from training data generated by FE simulations so that specific mesh structures can be formulated by a surrogate element (or surrogate elements) to replace the original mechanical elements, and the dynamics simulation can be implemented by co-simulation with the surrogate element(s) embedded into a MB model. This framework consists of a series of techniques including data collection, feature extraction, training data sampling, surrogate element building, and model evaluation and selection. To verify the feasibility of this framework, we present two case studies, a vertical dynamics simulation and a longitudinal dynamics simulation, based on co-simulation with MATLAB/Simulink and Simpack, and a further comparison with a popular data-driven model (the Kriging model) is provided. The simulation results show that using the Legendre polynomial regression model to build surrogate elements can largely cut down the simulation time without sacrificing accuracy.
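The surrogate-element step can be illustrated with NumPy's Legendre polynomial regression: fit coefficients to training samples produced by the expensive model, then evaluate the fitted polynomial inside the fast time loop. The force-deflection function, noise level, and polynomial degree below are illustrative assumptions, not the paper's training data or co-simulation setup.

    import numpy as np
    from numpy.polynomial import legendre as L

    rng = np.random.default_rng(5)

    # training data standing in for FE results: nonlinear force-deflection samples
    x = np.linspace(-1.0, 1.0, 40)                       # normalized deflection
    f_true = 5.0 * x + 2.0 * np.tanh(3.0 * x)            # "expensive" nonlinear response
    f_train = f_true + 0.05 * rng.normal(size=x.size)

    # surrogate element: Legendre polynomial regression fitted to the training data
    coeffs = L.legfit(x, f_train, deg=7)

    # evaluation inside a fast multi-body-style time loop
    x_query = np.linspace(-1.0, 1.0, 5)
    print("surrogate :", np.round(L.legval(x_query, coeffs), 3))
    print("reference :", np.round(5.0 * x_query + 2.0 * np.tanh(3.0 * x_query), 3))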
Microstructure simulation of rapidly solidified ASP30 high-speed steel particles by gas atomization
NASA Astrophysics Data System (ADS)
Ma, Jie; Wang, Bo; Yang, Zhi-liang; Wu, Guang-xin; Zhang, Jie-yu; Zhao, Shun-li
2016-03-01
In this study, the microstructure evolution of rapidly solidified ASP30 high-speed steel particles was predicted using a simulation method based on the cellular automaton-finite element (CAFE) model. The dendritic growth kinetics, in view of the characteristics of ASP30 steel, were calculated and combined with macro heat transfer calculations by user-defined functions (UDFs) to simulate the microstructure of gas-atomized particles. The relationship among particle diameter, undercooling, and the convection heat transfer coefficient was also investigated to provide cooling conditions for simulations. The simulated results indicated that a columnar grain microstructure was observed in small particles, whereas an equiaxed microstructure was observed in large particles. In addition, the morphologies and microstructures of gas-atomized ASP30 steel particles were also investigated experimentally using scanning electron microscopy (SEM). The experimental results showed that four major types of microstructures were formed: dendritic, equiaxed, mixed, and multi-droplet microstructures. The simulated results and the available experimental data are in good agreement.
Shot Peening Numerical Simulation of Aircraft Aluminum Alloy Structure
NASA Astrophysics Data System (ADS)
Liu, Yong; Lv, Sheng-Li; Zhang, Wei
2018-03-01
After shot peening, the 7050 aluminum alloy has good anti-fatigue and anti-stress-corrosion properties. In the shot peening process, the pellets collide with the target material randomly and generate a residual stress distribution on the target material surface, which is of great significance for improving material properties. In this paper, a simplified numerical simulation model of shot peening was established. The influence of pellet collision velocity, pellet collision position, and pellet collision time interval on the residual stress from shot peening was studied through simulations with the ANSYS/LS-DYNA software. The analysis results show that different velocities, positions, and time intervals have a great influence on the residual stress after shot peening. Comparison with numerical simulation results based on the Kriging model verified the accuracy of the simulation results in this paper. This study provides a reference for the optimization of the shot peening process and makes an effective exploration toward precise shot peening numerical simulation.
Validation of the SimSET simulation package for modeling the Siemens Biograph mCT PET scanner
NASA Astrophysics Data System (ADS)
Poon, Jonathan K.; Dahlbom, Magnus L.; Casey, Michael E.; Qi, Jinyi; Cherry, Simon R.; Badawi, Ramsey D.
2015-02-01
Monte Carlo simulation provides a valuable tool in performance assessment and optimization of system design parameters for PET scanners. SimSET is a popular Monte Carlo simulation toolkit that features fast simulation time, as well as variance reduction tools to further enhance computational efficiency. However, SimSET has lacked the ability to simulate block detectors until its most recent release. Our goal is to validate new features of SimSET by developing a simulation model of the Siemens Biograph mCT PET scanner and comparing the results to a simulation model developed in the GATE simulation suite and to experimental results. We used the NEMA NU-2 2007 scatter fraction, count rates, and spatial resolution protocols to validate the SimSET simulation model and its new features. The SimSET model overestimated the experimental results of the count rate tests by 11-23% and the spatial resolution test by 13-28%, which is comparable to previous validation studies of other PET scanners in the literature. The difference between the SimSET and GATE simulation was approximately 4-8% for the count rate test and approximately 3-11% for the spatial resolution test. In terms of computational time, SimSET performed simulations approximately 11 times faster than GATE simulations. The new block detector model in SimSET offers a fast and reasonably accurate simulation toolkit for PET imaging applications.
NASA Technical Reports Server (NTRS)
Donohue, Paul F.
1987-01-01
The results of an aerodynamic performance evaluation of the National Aeronautics and Space Administration (NASA)/Ames Research Center Advanced Concepts Flight Simulator (ACFS), conducted in association with the Navy-NASA Joint Institute of Aeronautics, are presented. The ACFS is a full-mission flight simulator which provides an excellent platform for the critical evaluation of emerging flight systems and aircrew performance. The propulsion and flight dynamics models were evaluated using classical flight test techniques. The aerodynamic performance model of the ACFS was found to realistically represent that of current day, medium range transport aircraft. Recommendations are provided to enhance the capabilities of the ACFS to a level forecast for 1995 transport aircraft. The graphical and tabular results of this study will establish a performance section of the ACFS Operation's Manual.
NASA Astrophysics Data System (ADS)
Koo, Cheol Hea; Lee, Hoon Hee; Moon, Sung Tae; Han, Sang Hyuck; Ju, Gwang Hyeok
2013-08-01
In aerospace research and practical development, the use of simulation in software development, component design, and system operation continues to grow, and the pace of adoption is accelerating. This trend follows from how easy simulations are to handle and how powerful their output is. Simulation offers several benefits: it is easy to handle (it is never broken or damaged by mistake), it never wears out, and it is cost effective (once built, it can be distributed to 100-1000 people). GenSim (Generic Simulator), which is being developed by KARI and is compatible with the ESA SMP standard, provides such a simulation platform to support flight software validation and mission operation verification. The user interface of GenSim is shown in Figure 1 [1,2]. As with most simulation platforms, GenSim provides a GRD (Graphical Display) and an AND (Alpha Numeric Display). However, more complex and powerful handling of the simulated data is frequently required during actual system validation, for example for mission operation. In Figure 2, a system simulation result for COMS (Communication, Ocean, and Meteorological Satellite, launched on June 28, 2008) is drawn by the Celestia 3D program. In this case, the data needed by Celestia is supplied by one of the simulation models resident in the system simulator through a UDP network connection. However, the required display format, data size, and communication rate vary, so the developer has to manage the connection protocol manually for each case. This complicates simulation model design and development and ultimately leads to performance issues. A performance issue arises when the required data volume exceeds the capacity of the simulation kernel to process it safely. The problem is that the data sent to a visualization tool such as Celestia is produced by a simulation model, not by the kernel. Because the simulation model has no way to know the kernel's load in processing simulation events, the model sends the data as frequently as it likes. This can create many potential problems, such as lack of response, missed deadlines, and data integrity issues with the model data during the simulation. SIMSAT and EuroSim issue a warning message if a user-requested event, such as printing a log, cannot be processed as planned or requested; the requested event is then delayed or not processed at all, which may violate the planned deadline. In most soft real-time simulations this can be neglected and merely inconveniences the user, but if user requests are not managed properly in critical situations, the simulation results may end in disorder. Tracing these disadvantages shows that the simulation model is not the appropriate place to service such user requests, and this kind of work in the model should be minimized as much as possible.
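One way to decouple a visualization feed from the simulation kernel's load, in the spirit of the problem described above, is to throttle and drop datagrams at the publishing side rather than letting the model send as often as it likes. The sketch below is a hypothetical rate-limited UDP publisher in Python; the host, port, payload layout, and rate are assumptions, and it is not the GenSim or Celestia interface.

```python
import json
import socket
import time

class ThrottledUdpPublisher:
    """Send simulation state to a visualization tool over UDP, at most
    max_hz times per second, dropping intermediate updates instead of
    queueing them (illustrative sketch, not the GenSim implementation)."""

    def __init__(self, host="127.0.0.1", port=5005, max_hz=10.0):
        self.addr = (host, port)
        self.min_interval = 1.0 / max_hz
        self.last_sent = 0.0
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.setblocking(False)

    def publish(self, state: dict) -> bool:
        """Return True if the datagram was sent, False if throttled or dropped."""
        now = time.monotonic()
        if now - self.last_sent < self.min_interval:
            return False                  # drop: protect the simulation kernel
        try:
            self.sock.sendto(json.dumps(state).encode("utf-8"), self.addr)
        except BlockingIOError:
            return False                  # socket buffer full: drop, don't stall
        self.last_sent = now
        return True

if __name__ == "__main__":
    pub = ThrottledUdpPublisher()
    for step in range(1000):
        pub.publish({"t": step * 0.1, "x": 7000.0, "y": 0.0, "z": 0.0})
```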
DOE Office of Scientific and Technical Information (OSTI.GOV)
Im, Piljae; Cho, Heejin; Kim, Dongsu
2016-08-01
This report provides second-year project simulation results for the multi-year project titled “Evaluation of Variable Refrigeration Flow (VRF) system on Oak Ridge National Laboratory (ORNL)’s Flexible Research Platform (FRP).”
A Global, Multi-Waveband Model for the Zodiacal Cloud
NASA Technical Reports Server (NTRS)
Grogan, Keith; Dermott, Stanley F.; Kehoe, Thomas J. J.
2003-01-01
This recently completed three-year project was undertaken by the PI at the University of Florida, NASA Goddard and JPL, and by the Co-I and Collaborator at the University of Florida. The funding was used to support a continuation of research conducted at the University of Florida over the last decade which focuses on the dynamics of dust particles in the interplanetary environment. The main objectives of this proposal were: To produce improved dynamical models of the zodiacal cloud by performing numerical simulations of the orbital evolution of asteroidal and cometary dust particles. To provide visualizations of the results using our visualization software package, SIMUL, simulating the viewing geometries of IRAS and COBE and comparing the model results with archived data. To use the results to provide a more accurate model of the brightness distribution of the zodiacal cloud than existing empirical models. In addition, our dynamical approach can provide insight into fundamental properties of the cloud, including but not limited to the total mass and surface area of dust, the size-frequency distribution of dust, and the relative contributions of asteroidal and cometary material. The model can also be used to provide constraints on trace signals from other sources, such as dust associated with the "Plutinos" , objects captured in the 2:3 resonance with Neptune.
Simulation based planning of surgical interventions in pediatric cardiology
NASA Astrophysics Data System (ADS)
Marsden, Alison L.
2013-10-01
Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. However, while medical imaging provides increasingly detailed anatomical information, clinicians often have limited access to hemodynamic data that may be crucial to patient risk assessment and treatment planning. Computational simulations can now provide detailed hemodynamic data to augment clinical knowledge in both adult and pediatric applications. There is a particular need for simulation tools in pediatric cardiology, due to the wide variation in anatomy and physiology in congenital heart disease patients, necessitating individualized treatment plans. Despite great strides in medical imaging, enabling extraction of flow information from magnetic resonance and ultrasound imaging, simulations offer predictive capabilities that imaging alone cannot provide. Patient specific simulations can be used for in silico testing of new surgical designs, treatment planning, device testing, and patient risk stratification. Furthermore, simulations can be performed at no direct risk to the patient. In this paper, we outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We then step through pressing challenges in the field, including multiscale modeling, boundary condition selection, optimization, and uncertainty quantification. Finally, we summarize simulation results of two representative examples from pediatric cardiology: single ventricle physiology, and coronary aneurysms caused by Kawasaki disease. These examples illustrate the potential impact of computational modeling tools in the clinical setting.
Leadership Development Through Peer-Facilitated Simulation in Nursing Education.
Brown, Karen M; Rode, Jennifer L
2018-01-01
Baccalaureate nursing graduates must possess leadership skills, yet few opportunities exist to cultivate leadership abilities in a clinical environment. Peer-facilitated learning may increase the leadership skills of competence, self-confidence, self-reflection, and role modeling. Facilitating human patient simulation provides opportunities to develop leadership skills. With faculty supervision, senior baccalaureate students led small-group simulation experiences with sophomore and junior peers and then conducted subsequent debriefings. Quantitative and qualitative descriptive data allowed evaluation of students' satisfaction with this teaching innovation and whether the experience affected students' desire to take on leadership roles. Students expressed satisfaction with the peer-facilitated simulation experience and confidence in mastering the content while developing necessary skills for practice. Peer-facilitated simulation provides an opportunity for leadership development and learning. Study results can inform the development of nursing curricula to best develop the leadership skills of nursing students. [J Nurs Educ. 2018;57(1):53-57.]. Copyright 2018, SLACK Incorporated.
Theory and Simulation of Multicomponent Osmotic Systems
Karunaweera, Sadish; Gee, Moon Bae; Weerasinghe, Samantha; Smith, Paul E.
2012-01-01
Most cellular processes occur in systems containing a variety of components many of which are open to material exchange. However, computer simulations of biological systems are almost exclusively performed in systems closed to material exchange. In principle, the behavior of biomolecules in open and closed systems will be different. Here, we provide a rigorous framework for the analysis of experimental and simulation data concerning open and closed multicomponent systems using the Kirkwood-Buff (KB) theory of solutions. The results are illustrated using computer simulations for various concentrations of the solutes Gly, Gly2 and Gly3 in both open and closed systems, and in the absence or presence of NaCl as a cosolvent. In addition, KB theory is used to help rationalize the aggregation properties of the solutes. Here one observes that the picture of solute association described by the KB integrals, which are directly related to the solution thermodynamics, and that provided by more physical clustering approaches are different. It is argued that the combination of KB theory and simulation data provides a simple and powerful tool for the analysis of complex multicomponent open and closed systems. PMID:23329894
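For readers new to the framework, a Kirkwood-Buff integral is obtained from a radial distribution function as G_ij = 4π ∫ (g_ij(r) − 1) r² dr; the sketch below computes a truncated (running) estimate of this quantity from tabulated g(r) data. Truncation at a finite radius is a common approximation when using closed-system simulation data and is itself one of the subtleties the KB literature discusses; the example g(r) is synthetic.

```python
import numpy as np

def kirkwood_buff_integral(r, g, r_max=None):
    """Running Kirkwood-Buff integral G_ij = 4*pi * int_0^R (g_ij(r) - 1) r^2 dr,
    evaluated by the trapezoidal rule from tabulated g_ij(r).

    Truncating at r_max approximates the open-system integral from
    closed-system simulation data, a common but imperfect practice."""
    r = np.asarray(r, float)
    g = np.asarray(g, float)
    if r_max is not None:
        keep = r <= r_max
        r, g = r[keep], g[keep]
    y = (g - 1.0) * r**2
    integral = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(r))
    return 4.0 * np.pi * integral

if __name__ == "__main__":
    # Ideal-gas-like g(r) = 1 everywhere gives G = 0 (sanity check).
    r = np.linspace(0.0, 2.0, 500)                      # nm
    print(kirkwood_buff_integral(r, np.ones_like(r)))   # ~0.0 nm^3
```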
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kochunas, Brendan; Collins, Benjamin; Stimpson, Shane
This paper describes the methodology developed and implemented in the Virtual Environment for Reactor Applications Core Simulator (VERA-CS) to perform high-fidelity, pressurized water reactor (PWR), multicycle, core physics calculations. Depletion of the core with pin-resolved power and nuclide detail is a significant advance in the state of the art for reactor analysis, providing the level of detail necessary to address the problems of the U.S. Department of Energy Nuclear Reactor Simulation Hub, the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS has three main components: the neutronics solver MPACT, the thermal-hydraulic (T-H) solver COBRA-TF (CTF), and the nuclide transmutation solver ORIGEN. This paper focuses on MPACT and provides an overview of the resonance self-shielding methods, macroscopic-cross-section calculation, two-dimensional/one-dimensional (2-D/1-D) transport, nuclide depletion, T-H feedback, and other supporting methods representing a minimal set of the capabilities needed to simulate high-fidelity models of a commercial nuclear reactor. Results are presented from the simulation of a model of the first cycle of Watts Bar Unit 1. The simulation is within 16 parts per million boron (ppmB) reactivity for all state points compared to cycle measurements, with an average reactivity bias of <5 ppmB for the entire cycle. Comparisons to cycle 1 flux map data are also provided, and the average 2-D root-mean-square (rms) error during cycle 1 is 1.07%. To demonstrate the multicycle capability, a state point at beginning of cycle (BOC) 2 was also simulated and compared to plant data. The comparison of the cycle 2 BOC state has a reactivity difference of +3 ppmB from measurement, and the 2-D rms of the comparison in the flux maps is 1.77%. Lastly, these results provide confidence in VERA-CS's capability to perform high-fidelity calculations for practical PWR reactor problems.
McDonald, Richard R.; Nelson, Jonathan M.; Fosness, Ryan L.; Nelson, Peter O.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan
2016-01-01
Two- and three-dimensional morphodynamic simulations are becoming common in studies of channel form and process. The performance of these simulations is often validated against measurements from laboratory studies. Collecting channel change information in natural settings for model validation is difficult because it can be expensive, and under most channel-forming flows the resulting channel change is generally small. Several channel restoration projects on the Kootenai River, ID, designed in part to armor large meanders with several large spurs constructed of wooden piles, have resulted in rapid bed elevation change following construction. Monitoring of these restoration projects includes post-restoration (as-built) Digital Elevation Models (DEMs) as well as additional channel surveys following high channel-forming flows post-construction. The resulting sequence of measured bathymetry provides excellent validation data for morphodynamic simulations at the reach scale of a real river. In this paper we test the performance of a quasi-three-dimensional morphodynamic simulation against the measured elevation change. The resulting simulations predict the pattern of channel change reasonably well, but many of the details, such as the maximum scour, are underpredicted.
NiftySim: A GPU-based nonlinear finite element package for simulation of soft tissue biomechanics.
Johnsen, Stian F; Taylor, Zeike A; Clarkson, Matthew J; Hipwell, John; Modat, Marc; Eiben, Bjoern; Han, Lianghao; Hu, Yipeng; Mertzanidou, Thomy; Hawkes, David J; Ourselin, Sebastien
2015-07-01
NiftySim, an open-source finite element toolkit, has been designed to allow incorporation of high-performance soft tissue simulation capabilities into biomedical applications. The toolkit provides the option of execution on fast graphics processing unit (GPU) hardware, numerous constitutive models and solid-element options, membrane and shell elements, and contact modelling facilities, in a simple-to-use library. The toolkit is founded on the total Lagrangian explicit dynamics (TLED) algorithm, which has been shown to be efficient and accurate for simulation of soft tissues. The base code is written in C++, and GPU execution is achieved using the nVidia CUDA framework. In most cases, interaction with the underlying solvers can be achieved through a single Simulator class, which may be embedded directly in third-party applications such as surgical guidance systems. Advanced capabilities such as contact modelling and nonlinear constitutive models are also provided, as are more experimental technologies like reduced order modelling. A consistent description of the underlying solution algorithm, its implementation with a focus on GPU execution, and examples of the toolkit's usage in biomedical applications are provided. Efficient mapping of the TLED algorithm to parallel hardware results in very high computational performance, far exceeding that available in commercial packages. The NiftySim toolkit provides high-performance soft tissue simulation capabilities using GPU technology for biomechanical simulation research applications in medical image computing, surgical simulation, and surgical guidance applications.
Calculations of a wideband metamaterial absorber using equivalent medium theory
NASA Astrophysics Data System (ADS)
Huang, Xiaojun; Yang, Helin; Wang, Danqi; Yu, Shengqing; Lou, Yanchao; Guo, Ling
2016-08-01
Metamaterial absorbers (MMAs) have drawn increasing attention in many areas because they can absorb electromagnetic (EM) waves with near-unity absorptivity. We demonstrate the design, simulation, experiment and calculation of a wideband MMA based on a double-square-loop (DSL) array loaded with chip resistors. For a normally incident EM wave, the simulated results show that the full width at half maximum of the absorption band is about 9.1 GHz, and the relative bandwidth is 87.1%. Experimental results are in agreement with the simulations. More importantly, equivalent medium theory (EMT) is utilized to calculate the absorption of the DSL MMA, and the calculated absorption based on EMT agrees with the simulated and measured results. The method based on EMT provides a new way to analyze the mechanism of MMAs.
Building occupancy simulation and data assimilation using a graph-based agent-oriented model
NASA Astrophysics Data System (ADS)
Rai, Sanish; Hu, Xiaolin
2018-07-01
Building occupancy simulation and estimation simulates the dynamics of occupants and estimates their real-time spatial distribution in a building. It requires a simulation model and a data assimilation algorithm that assimilates real-time sensor data into the simulation model. Existing building occupancy simulation models include agent-based models and graph-based models. Agent-based models suffer from high computation cost when simulating large numbers of occupants, and graph-based models overlook the heterogeneity and detailed behaviors of individuals. Recognizing the limitations of existing models, this paper presents a new graph-based agent-oriented model which can efficiently simulate large numbers of occupants in various kinds of building structures. To support real-time occupancy dynamics estimation, a data assimilation framework based on Sequential Monte Carlo methods is also developed and applied to the graph-based agent-oriented model to assimilate real-time sensor data. Experimental results show the effectiveness of the developed model and the data assimilation framework. The major contributions of this work are an efficient model for building occupancy simulation that can accommodate large numbers of occupants and an effective data assimilation framework that can provide real-time estimates of building occupancy from sensor data.
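The data assimilation framework is based on Sequential Monte Carlo methods; the sketch below is a minimal bootstrap particle filter that estimates a single zone's occupant count from noisy sensor readings. The random-walk "model", sensor noise level, and particle count are illustrative assumptions standing in for the paper's graph-based agent-oriented model.

```python
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_particle_filter(observations, n_particles=1000,
                              move_std=2.0, sensor_std=5.0, max_occ=200):
    """Minimal Sequential Monte Carlo (bootstrap) filter estimating the
    occupant count of one zone from noisy sensor readings."""
    particles = rng.integers(0, max_occ, size=n_particles).astype(float)
    estimates = []
    for z in observations:
        # Predict: propagate each particle through the (toy) occupancy model.
        particles = np.clip(particles + rng.normal(0.0, move_std, n_particles),
                            0.0, max_occ)
        # Update: weight particles by the Gaussian sensor likelihood p(z | x).
        weights = np.exp(-0.5 * ((z - particles) / sensor_std) ** 2)
        weights /= weights.sum()
        estimates.append(float(np.sum(weights * particles)))
        # Resample: draw a new particle set proportional to the weights.
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return estimates

if __name__ == "__main__":
    true_counts = [40, 45, 60, 80, 75, 70]
    sensor = [c + rng.normal(0, 5) for c in true_counts]
    print([round(e, 1) for e in bootstrap_particle_filter(sensor)])
```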
NASA Handbook for Models and Simulations: An Implementation Guide for NASA-STD-7009
NASA Technical Reports Server (NTRS)
Steele, Martin J.
2013-01-01
The purpose of this Handbook is to provide technical information, clarification, examples, processes, and techniques to help institute good modeling and simulation practices in the National Aeronautics and Space Administration (NASA). As a companion guide to NASA-STD-7009, Standard for Models and Simulations, this Handbook provides a broader scope of information than may be included in a Standard and promotes good practices in the production, use, and consumption of NASA modeling and simulation products. NASA-STD-7009 specifies what a modeling and simulation activity shall or should do (in the requirements) but does not prescribe how the requirements are to be met, which varies with the specific engineering discipline, or who is responsible for complying with the requirements, which depends on the size and type of project. A guidance document, which is not constrained by the requirements of a Standard, is better suited to address these additional aspects and provide necessary clarification. This Handbook stems from the Space Shuttle Columbia Accident Investigation (2003), which called for Agency-wide improvements in the "development, documentation, and operation of models and simulations" and subsequently elicited additional guidance from the NASA Office of the Chief Engineer to include "a standard method to assess the credibility of the models and simulations." General methods applicable across the broad spectrum of model and simulation (M&S) disciplines were sought to help guide the modeling and simulation processes within NASA and to provide for consistent reporting of M&S activities and analysis results. From this, the standardized process for the M&S activity was developed. The major contents of this Handbook are the implementation details of the general M&S requirements of NASA-STD-7009, including explanations, examples, and suggestions for improving the credibility assessment of an M&S-based analysis.
Yang, Kyeongra; Woomer, Gail Ratliff; Agbemenu, Kafuli; Williams, Lynne
2014-11-01
The study aim was to evaluate the effectiveness of a poverty simulation in increasing understanding of and attitudes toward poverty and resulting in changes in clinical practice among nursing seniors. A poverty simulation was conducted using a diverse group of nursing professors and staff from local community agencies assuming the role of community resource providers. Students were assigned roles as members of low-income families and were required to complete tasks during a simulated month. A debriefing was held after the simulation to explore students' experiences in a simulated poverty environment. Students' understanding of and attitude toward poverty pre- and post-simulation were examined. Changes in the students' clinical experiences following the simulation were summarized into identified categories and themes. The poverty simulation led to a greater empathy for the possible experiences of low income individuals and families, understanding of barriers to health care, change in attitudes towards poverty and to those living in poverty, and changes in the students' nursing practice. Use of poverty simulation is an effective means to teach nursing students about the experience of living in poverty. The simulation experience changed nursing students' clinical practice, with students providing community referrals and initiating inter-professional collaborations. Copyright © 2014 Elsevier Ltd. All rights reserved.
König, Gerhard; Miller, Benjamin T; Boresch, Stefan; Wu, Xiongwu; Brooks, Bernard R
2012-10-09
One of the key requirements for the accurate calculation of free energy differences is proper sampling of conformational space. Especially in biological applications, molecular dynamics simulations are often confronted with rugged energy surfaces and high energy barriers, leading to insufficient sampling and, in turn, poor convergence of the free energy results. In this work, we address this problem by employing enhanced sampling methods. We explore the possibility of using self-guided Langevin dynamics (SGLD) to speed up the exploration process in free energy simulations. To obtain improved free energy differences from such simulations, it is necessary to account for the effects of the bias due to the guiding forces. We demonstrate how this can be accomplished for the Bennett's acceptance ratio (BAR) and the enveloping distribution sampling (EDS) methods. While BAR is considered among the most efficient methods available for free energy calculations, the EDS method developed by Christ and van Gunsteren is a promising development that reduces the computational costs of free energy calculations by simulating a single reference state. To evaluate the accuracy of both approaches in connection with enhanced sampling, EDS was implemented in CHARMM. For testing, we employ benchmark systems with analytical reference results and the mutation of alanine to serine. We find that SGLD with reweighting can provide accurate results for BAR and EDS where conventional molecular dynamics simulations fail. In addition, we compare the performance of EDS with other free energy methods. We briefly discuss the implications of our results and provide practical guidelines for conducting free energy simulations with SGLD.
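As a point of reference for the BAR part of the discussion, the sketch below solves the standard reduced-unit Bennett self-consistency condition for ΔF from forward and reverse work samples. It does not include the reweighting needed to correct for SGLD guiding forces described in the paper, and the synthetic Gaussian test data are only a consistency check.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import expit

def bar_free_energy(w_f, w_r):
    """Bennett acceptance ratio estimate of a free energy difference (in kT).

    w_f: reduced forward work values beta*(U_1 - U_0) for samples from state 0.
    w_r: reduced reverse work values beta*(U_0 - U_1) for samples from state 1.
    Solves the standard self-consistency condition
        sum_i f(M + w_f_i - dF) = sum_j f(w_r_j + dF - M),  M = ln(N_F / N_R),
    with f the Fermi function 1 / (1 + exp(x))."""
    w_f = np.asarray(w_f, float)
    w_r = np.asarray(w_r, float)
    m = np.log(len(w_f) / len(w_r))
    fermi = lambda x: expit(-x)      # numerically stable 1 / (1 + exp(x))

    def implicit(df):
        return fermi(m + w_f - df).sum() - fermi(w_r + df - m).sum()

    # implicit(df) is monotonically increasing, so a wide bracket suffices.
    lo = min(w_f.min(), -w_r.max()) - 50.0
    hi = max(w_f.max(), -w_r.min()) + 50.0
    return brentq(implicit, lo, hi)

if __name__ == "__main__":
    # Synthetic Gaussian work values consistent (via the Crooks relation)
    # with dF = 2 kT and 1 kT of dissipation; BAR should recover ~2.0.
    rng = np.random.default_rng(0)
    w_f = rng.normal(3.0, np.sqrt(2.0), 5000)
    w_r = rng.normal(-1.0, np.sqrt(2.0), 5000)
    print(f"BAR estimate: {bar_free_energy(w_f, w_r):.2f} kT")
```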
Simulation of a G-tolerance curve using the pulsatile cardiovascular model
NASA Technical Reports Server (NTRS)
Solomon, M.; Srinivasan, R.
1985-01-01
A computer simulation study, performed to assess the ability of the cardiovascular model to reproduce the G tolerance curve (G level versus tolerance time), is reported. A composite strength-duration curve derived from experimental data obtained in human centrifugation studies was used for comparison. The effects of abolishing autonomic control and of blood volume loss on G tolerance were also simulated. The results provide additional validation of the model. The need for the presence of autonomic reflexes even at low levels of G is pointed out. The low margin of safety with a loss of blood volume indicated by the simulation results underscores the necessity for protective measures during Shuttle reentry.
Simulation of FRET dyes allows quantitative comparison against experimental data
NASA Astrophysics Data System (ADS)
Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander
2018-03-01
Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
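The quantity compared with experiment is the FRET efficiency; given an inter-dye distance trajectory, it follows from the Förster relation E = 1/(1 + (r/R0)^6). The sketch below computes per-frame and mean efficiencies from such a trace. The Förster radius, the synthetic trajectory, and the assumption that simple frame-averaging is appropriate are all illustrative and not taken from the paper.

```python
import numpy as np

def fret_efficiency(distances_nm, r0_nm=5.4):
    """Per-frame and mean FRET efficiency from an inter-dye distance trace,
    using the Foerster relation E = 1 / (1 + (r / R0)^6).

    r0_nm is a typical Foerster radius (the actual value depends on the dye
    pair); averaging E over frames assumes dye dynamics are fast relative to
    the observation time, which is itself an assumption."""
    r = np.asarray(distances_nm, float)
    e = 1.0 / (1.0 + (r / r0_nm) ** 6)
    return e, float(e.mean())

if __name__ == "__main__":
    # Hypothetical distance trajectory (nm), e.g. extracted from dye-center
    # coordinates of a coarse-grained simulation.
    traj = np.random.default_rng(3).normal(5.0, 0.6, 10_000)
    _, mean_e = fret_efficiency(traj)
    print(f"<E> = {mean_e:.3f}")
```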
Simulation-Based Analysis of Reentry Dynamics for the Sharp Atmospheric Entry Vehicle
NASA Technical Reports Server (NTRS)
Tillier, Clemens Emmanuel
1998-01-01
This thesis describes the analysis of the reentry dynamics of a high-performance lifting atmospheric entry vehicle through numerical simulation tools. The vehicle, named SHARP, is currently being developed by the Thermal Protection Materials and Systems branch of NASA Ames Research Center, Moffett Field, California. The goal of this project is to provide insight into trajectory tradeoffs and vehicle dynamics using simulation tools that are powerful, flexible, user-friendly and inexpensive. Implemented using MATLAB and SIMULINK, these tools are developed with an eye towards further use in the conceptual design of the SHARP vehicle's trajectory and flight control systems. A trajectory simulator is used to quantify the entry capabilities of the vehicle subject to various operational constraints. Using an aerodynamic database computed by NASA and a model of the earth, the simulator generates the vehicle trajectory in three-dimensional space based on aerodynamic angle inputs. Requirements for entry along the SHARP aerothermal performance constraint are evaluated for different control strategies. The effect of vehicle mass on entry parameters is investigated, and the cross range capability of the vehicle is evaluated. Trajectory results are presented and interpreted. A six-degree-of-freedom simulator builds on the trajectory simulator and provides attitude simulation for future entry controls development. A Newtonian aerodynamic model including control surfaces and a mass model are developed. A visualization tool for interpreting simulation results is described. Control surfaces are roughly sized. A simple controller is developed to fly the vehicle along its aerothermal performance constraint using aerodynamic flaps for control. This end-to-end demonstration proves the suitability of the 6-DOF simulator for future flight control system development. Finally, issues surrounding real-time simulation with hardware in the loop are discussed.
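The SHARP aerodynamic database and mass model are not public here, so the sketch below only integrates the standard planar point-mass entry equations over a non-rotating spherical Earth with an exponential atmosphere, the kind of calculation a trajectory simulator of this sort is built around. All vehicle constants (mass, reference area, lift and drag coefficients) and entry conditions are illustrative assumptions, not SHARP data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative constants (not SHARP vehicle data).
MU = 3.986e14                   # Earth's gravitational parameter, m^3/s^2
RE = 6.371e6                    # Earth radius, m
RHO0, HSCALE = 1.225, 7200.0    # exponential atmosphere: kg/m^3, scale height m
M, S, CL, CD = 2000.0, 10.0, 0.6, 0.8   # mass, ref. area, lift/drag coefficients

def entry_dynamics(t, y):
    """Planar entry equations over a non-rotating spherical Earth.
    State y = [V, gamma, h]: speed (m/s), flight-path angle (rad), altitude (m)."""
    v, gamma, h = y
    r = RE + h
    g = MU / r**2
    rho = RHO0 * np.exp(-h / HSCALE)
    q = 0.5 * rho * v**2
    drag, lift = q * S * CD, q * S * CL
    dv = -drag / M - g * np.sin(gamma)
    dgamma = lift / (M * v) - (g / v - v / r) * np.cos(gamma)
    dh = v * np.sin(gamma)
    return [dv, dgamma, dh]

def hit_ground(t, y):
    return y[2]
hit_ground.terminal = True

if __name__ == "__main__":
    y0 = [7500.0, np.radians(-1.5), 120e3]      # assumed entry interface state
    sol = solve_ivp(entry_dynamics, (0.0, 2000.0), y0,
                    max_step=1.0, events=hit_ground)
    print(f"final: V = {sol.y[0, -1]:.0f} m/s, h = {sol.y[2, -1] / 1e3:.1f} km")
```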
TH-E-18A-01: Developments in Monte Carlo Methods for Medical Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badal, A; Zbijewski, W; Bolch, W
Monte Carlo simulation methods are widely used in medical physics research and are starting to be implemented in clinical applications such as radiation therapy planning systems. Monte Carlo simulations offer the capability to accurately estimate quantities of interest that are challenging to measure experimentally while taking into account the realistic anatomy of an individual patient. Traditionally, practical application of Monte Carlo simulation codes in diagnostic imaging was limited by the need for large computational resources or long execution times. However, recent advancements in high-performance computing hardware, combined with a new generation of Monte Carlo simulation algorithms and novel postprocessing methods, are allowing for the computation of relevant imaging parameters of interest such as patient organ doses and scatter-to-primary ratios in radiographic projections in just a few seconds using affordable computational resources. Programmable Graphics Processing Units (GPUs), for example, provide a convenient, affordable platform for parallelized Monte Carlo executions that yield simulation times on the order of 10^7 x-rays/s. Even with GPU acceleration, however, Monte Carlo simulation times can be prohibitive for routine clinical practice. To reduce simulation times further, variance reduction techniques can be used to alter the probabilistic models underlying the x-ray tracking process, resulting in lower variance in the results without biasing the estimates. Other complementary strategies for further reductions in computation time are denoising of the Monte Carlo estimates and estimating (scoring) the quantity of interest at a sparse set of sampling locations (e.g. at a small number of detector pixels in a scatter simulation) followed by interpolation. Beyond reduction of the computational resources required for performing Monte Carlo simulations in medical imaging, the use of accurate representations of patient anatomy is crucial to the virtual generation of medical images and accurate estimation of radiation dose and other imaging parameters. For this, detailed computational phantoms of the patient anatomy must be utilized and implemented within the radiation transport code. Computational phantoms presently come in one of three format types, and in one of four morphometric categories. Format types include stylized (mathematical equation-based), voxel (segmented CT/MR images), and hybrid (NURBS and polygon mesh surfaces). Morphometric categories include reference (small library of phantoms by age at 50th height/weight percentile), patient-dependent (larger library of phantoms at various combinations of height/weight percentiles), patient-sculpted (phantoms altered to match the patient's unique outer body contour), and finally, patient-specific (an exact representation of the patient with respect to both body contour and internal anatomy). The existence and availability of these phantoms represents a very important advance for the simulation of realistic medical imaging applications using Monte Carlo methods. New Monte Carlo simulation codes need to be thoroughly validated before they can be used to perform novel research. Ideally, the validation process would involve comparison of results with those of an experimental measurement, but accurate replication of experimental conditions can be very challenging. It is very common to validate new Monte Carlo simulations by replicating previously published simulation results of similar experiments.
This process, however, is commonly problematic due to the lack of sufficient information in the published reports of previous work so as to be able to replicate the simulation in detail. To aid in this process, the AAPM Task Group 195 prepared a report in which six different imaging research experiments commonly performed using Monte Carlo simulations are described and their results provided. The simulation conditions of all six cases are provided in full detail, with all necessary data on material composition, source, geometry, scoring and other parameters provided. The results of these simulations when performed with the four most common publicly available Monte Carlo packages are also provided in tabular form. The Task Group 195 Report will be useful for researchers needing to validate their Monte Carlo work, and for trainees needing to learn Monte Carlo simulation methods. In this symposium we will review the recent advancements in high-performance computing hardware enabling the reduction in computational resources needed for Monte Carlo simulations in medical imaging. We will review variance reduction techniques commonly applied in Monte Carlo simulations of medical imaging systems and present implementation strategies for efficient combination of these techniques with GPU acceleration. Trade-offs involved in Monte Carlo acceleration by means of denoising and “sparse sampling” will be discussed. A method for rapid scatter correction in cone-beam CT (<5 min/scan) will be presented as an illustration of the simulation speeds achievable with optimized Monte Carlo simulations. We will also discuss the development, availability, and capability of the various combinations of computational phantoms for Monte Carlo simulation of medical imaging systems. Finally, we will review some examples of experimental validation of Monte Carlo simulations and will present the AAPM Task Group 195 Report. Learning Objectives: Describe the advances in hardware available for performing Monte Carlo simulations in high performance computing environments. Explain variance reduction, denoising and sparse sampling techniques available for reduction of computational time needed for Monte Carlo simulations of medical imaging. List and compare the computational anthropomorphic phantoms currently available for more accurate assessment of medical imaging parameters in Monte Carlo simulations. Describe experimental methods used for validation of Monte Carlo simulations in medical imaging. Describe the AAPM Task Group 195 Report and its use for validation and teaching of Monte Carlo simulations in medical imaging.
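As a concrete illustration of the core sampling step such transport codes rely on, the sketch below (not taken from any of the cited packages) samples photon free path lengths from the exponential attenuation law and checks the estimated uncollided transmission through a slab against the analytic Beer-Lambert value; the attenuation coefficient and thickness are arbitrary, and scatter, energy deposition, and variance reduction are omitted.

```python
import numpy as np

rng = np.random.default_rng(7)

def primary_transmission_mc(mu_cm, thickness_cm, n_photons=1_000_000):
    """Estimate the fraction of photons crossing a homogeneous slab without
    interacting, by sampling free path lengths s = -ln(xi) / mu from the
    exponential attenuation law, and compare with the analytic exp(-mu * t)."""
    s = -np.log(rng.random(n_photons)) / mu_cm   # free path to first interaction
    mc = np.mean(s > thickness_cm)               # photon escapes uncollided
    analytic = np.exp(-mu_cm * thickness_cm)
    return mc, analytic

if __name__ == "__main__":
    mc, ref = primary_transmission_mc(mu_cm=0.2, thickness_cm=10.0)
    print(f"Monte Carlo: {mc:.4f}  analytic: {ref:.4f}")
```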
Quantum Dot Detectors with Plasmonic Structures
2015-05-15
plasmon polariton mode and a guided Fabry-Perot mode. The simulation method accomplished in this paper provides a generalized approach to optimize the...plasmon polariton (SPP) mode and a guided Fabry-Perot mode, that enhance x or y (along the polarization direction used in simulation) and z (along the...resulting from surface plasmon polariton and guided Fabry-Perot modes) are shown in the inset to Fig. 3. This figure also shows the simulated
Integrating Telepresence Robots Into Nursing Simulation.
Rudolph, Alexandra; Vaughn, Jacqueline; Crego, Nancy; Hueckel, Remi; Kuszajewski, Michele; Molloy, Margory; Brisson, Raymond; Shaw, Ryan J
This article provides an overview of the use of telepresence robots in clinical practice and describes an evaluation of an educational project in which distance-based nurse practitioner students used telepresence robots in clinical simulations with on-campus Accelerated Bachelor of Science in Nursing students. The results of this project suggest that the incorporation of telepresence in simulation is an effective method to promote engagement, satisfaction, and self-confidence in learning.
ERIC Educational Resources Information Center
Budd, Mary-Jane; Hanley, J. Richard; Griffiths, Yvonne
2011-01-01
This study investigated whether Foygel and Dell's (2000) interactive two-step model of speech production could simulate the number and type of errors made in picture-naming by 68 children of elementary-school age. Results showed that the model provided a satisfactory simulation of the mean error profile of children aged five, six, seven, eight and…
Cloud-based simulations on Google Exacycle reveal ligand modulation of GPCR activation pathways
NASA Astrophysics Data System (ADS)
Kohlhoff, Kai J.; Shukla, Diwakar; Lawrenz, Morgan; Bowman, Gregory R.; Konerding, David E.; Belov, Dan; Altman, Russ B.; Pande, Vijay S.
2014-01-01
Simulations can provide tremendous insight into the atomistic details of biological mechanisms, but micro- to millisecond timescales are historically only accessible on dedicated supercomputers. We demonstrate that cloud computing is a viable alternative that brings long-timescale processes within reach of a broader community. We used Google's Exacycle cloud-computing platform to simulate two milliseconds of dynamics of a major drug target, the G-protein-coupled receptor β2AR. Markov state models aggregate independent simulations into a single statistical model that is validated by previous computational and experimental results. Moreover, our models provide an atomistic description of the activation of a G-protein-coupled receptor and reveal multiple activation pathways. Agonists and inverse agonists interact differentially with these pathways, with profound implications for drug design.
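A Markov state model of the kind used here is, at its simplest, a row-stochastic transition matrix estimated from discretised trajectories; the hedged sketch below shows only that counting step, on hypothetical state sequences, without the ergodic trimming or reversible maximum-likelihood estimation used in production MSM work.

```python
import numpy as np

def msm_transition_matrix(dtrajs, n_states, lag=1):
    """Row-stochastic Markov state model transition matrix estimated from
    discretised trajectories (sequences of state indices) at a given lag.

    This is the plain maximum-likelihood count estimate; real MSM pipelines
    add ergodic trimming, reversible estimation, and lag-time validation."""
    counts = np.zeros((n_states, n_states))
    for traj in dtrajs:
        traj = np.asarray(traj)
        for i, j in zip(traj[:-lag], traj[lag:]):
            counts[i, j] += 1.0
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0.0] = 1.0          # leave unvisited states as zero rows
    return counts / rows

if __name__ == "__main__":
    # Two toy discretised trajectories over 3 conformational states.
    dtrajs = [[0, 0, 1, 1, 2, 2, 1, 0], [2, 2, 1, 0, 0, 1, 2, 2]]
    T = msm_transition_matrix(dtrajs, n_states=3, lag=1)
    print(np.round(T, 2))
```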
Modeling and Simulation at NASA
NASA Technical Reports Server (NTRS)
Steele, Martin J.
2009-01-01
This slide presentation is composed of two topics. The first reviews the use of modeling and simulation (M&S), particularly as it relates to the Constellation program and discrete event simulation (DES). DES is defined as a process and system analysis, through time-based and resource-constrained probabilistic simulation models, that provides insight into operational system performance. The DES shows that the cycle for a launch, from manufacturing and assembly to launch and recovery, is about 45 days and that approximately 4 launches per year are practicable. The second topic reviews a NASA Standard for Modeling and Simulation. The Columbia Accident Investigation Board made some recommendations related to models and simulations. Some of the ideas inherent in the new standard are the documentation of M&S activities, an assessment of credibility, and reporting to decision makers, which should include the analysis of the results, a statement as to the uncertainty in the results, and the credibility of the results. There is also discussion of verification and validation (V&V) of models and of the different types of models and simulations.
Simulating the X-Ray Image Contrast to Set-Up Techniques with Desired Flaw Detectability
NASA Technical Reports Server (NTRS)
Koshti, Ajay M.
2015-01-01
The paper provides simulation data extending previous work by the author on developing a model for estimating the detectability of crack-like flaws in radiography. The methodology is being developed to help in implementation of NASA Special x-ray radiography qualification, but is generically applicable to radiography. The paper describes a method for characterizing X-ray detector resolution for crack detection. Applicability of ASTM E 2737 resolution requirements to the model is also discussed. The paper describes a model for simulating the detector resolution. A computer calculator application, discussed here, also performs predicted contrast and signal-to-noise ratio calculations. Results of various simulation runs calculating the x-ray flaw size parameter and image contrast for varying input parameters, such as crack depth, crack width, part thickness, x-ray angle, part-to-detector distance, part-to-source distance, source size, and detector sensitivity and resolution, are given as 3D surfaces. These results demonstrate the effect of the input parameters on the flaw size parameter and the simulated image contrast of the crack. These simulations demonstrate the utility of the flaw size parameter model in setting up x-ray techniques that provide desired flaw detectability in radiography. The method is applicable to film radiography, computed radiography, and digital radiography.
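The flaw size parameter model itself is not reproduced in the abstract; as a rough illustration of the underlying radiographic physics, the sketch below estimates the narrow-beam subject contrast of a crack-like void from the missing material path along the ray, with an optional scatter-to-primary reduction. The attenuation coefficient, crack depths, and scatter-to-primary ratio are illustrative assumptions, and detector unsharpness and noise (which the full model accounts for) are ignored.

```python
import numpy as np

def crack_subject_contrast(mu_per_mm, void_path_mm, scatter_to_primary=0.0):
    """Narrow-beam estimate of the radiographic subject contrast of a
    crack-like void traversed by the ray over a path length void_path_mm,
    in material with linear attenuation coefficient mu_per_mm.

    C = (exp(mu * d) - 1) / (1 + SPR); scatter is folded in only through the
    scatter-to-primary ratio, and detector blur/noise are not modelled."""
    c = np.expm1(mu_per_mm * void_path_mm)
    return c / (1.0 + scatter_to_primary)

if __name__ == "__main__":
    # Illustrative numbers only (not a NASA Special qualification case).
    for d in (0.05, 0.1, 0.2):   # missing material path along the beam, mm
        print(d, f"{crack_subject_contrast(0.11, d, scatter_to_primary=0.5):.4f}")
```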
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zirnstein, E. J.; Heerikhuisen, J.; McComas, D. J.
The Interstellar Boundary EXplorer (IBEX), launched in 2008 October, has improved our understanding of the solar wind-local interstellar medium interaction through its detection of neutral atoms, particularly that of hydrogen (H). IBEX is able to create full maps of the sky in six-month intervals as the Earth orbits the Sun, detecting H with energies between ∼0.01 and 6 keV. Due to the relative motion of IBEX to the solar inertial frame, measurements made in the spacecraft frame introduce a Compton-Getting (CG) effect, complicating measurements at the lowest energies. In this paper we provide results from a numerical simulation that calculates fluxes of H atoms at 1 AU in the inertial and spacecraft frames (both ram and anti-ram), at energies relevant to IBEX-Hi and -Lo. We show the theory behind the numerical simulations, applying a simple frame transformation to derived flux equations that provides a straightforward way to simulate fluxes in the spacecraft frame. We then show results of H energetic neutral atom fluxes simulated at IBEX-Hi energy passbands 2-6 in all frames, comparing with IBEX-Hi data along selected directions, and also show results simulated at energies relevant to IBEX-Lo. Although simulations at IBEX-Hi energies agree reasonably well with the CG correction method used for IBEX-Hi data, we demonstrate the importance of properly modeling low energy H fluxes due to inherent complexities involved with measurements made in moving frames, as well as dynamic radiation pressure effects close to the Sun.
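The paper's full 3-D simulation is not reproducible from the abstract; the sketch below only illustrates the non-relativistic Compton-Getting bookkeeping for atoms arriving along the ram or anti-ram axis, using invariance of the phase-space density so that the differential intensity scales as j'(E') = (E'/E) j(E). The spacecraft speed, the inertial-frame spectrum, and the 1-D geometry are assumptions, not the paper's model.

```python
import numpy as np

M_H = 1.6726e-27      # hydrogen (proton) mass, kg
EV = 1.602e-19        # joules per electron volt

def cg_shift(energy_ev, j_inertial, u_sc_kms, ram=True):
    """Non-relativistic Compton-Getting transform of an H ENA spectrum for
    atoms arriving along the ram (or anti-ram) axis.

    The spacecraft-frame speed is v' = v + u (ram) or v - u (anti-ram);
    phase-space density is frame invariant, so j'(E') = (E'/E) * j(E).
    j_inertial is a callable giving j(E) in the inertial frame."""
    e_j = np.asarray(energy_ev, float) * EV
    v = np.sqrt(2.0 * e_j / M_H)                       # inertial-frame speed
    u = (u_sc_kms * 1e3) * (1.0 if ram else -1.0)
    v_prime = v + u
    e_prime = 0.5 * M_H * v_prime**2 / EV              # spacecraft-frame energy, eV
    j_prime = (e_prime / np.asarray(energy_ev)) * j_inertial(np.asarray(energy_ev))
    return e_prime, j_prime

if __name__ == "__main__":
    power_law = lambda e_ev: 1.0e3 * (e_ev / 1.0e3) ** -1.5   # arbitrary units
    e = np.array([50.0, 500.0, 5000.0])                       # eV
    print(cg_shift(e, power_law, u_sc_kms=30.0, ram=True))
```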
Modeling flash floods in southern France for road management purposes
NASA Astrophysics Data System (ADS)
Vincendon, Béatrice; Édouard, Simon; Dewaele, Hélène; Ducrocq, Véronique; Lespinas, Franck; Delrieu, Guy; Anquetin, Sandrine
2016-10-01
Flash floods are among the most devastating hazards in the Mediterranean region. A major subset of the damage and casualties caused by flooding is related to road submersion. Distributed hydrological nowcasting can be used for road flooding monitoring, which requires rainfall-runoff simulations at high space and time resolution. Distributed hydrological models, such as the ISBA-TOP coupled system used in this study, are designed to simulate discharges for any cross-section of a river, but they are generally calibrated for certain outlets and give degraded results for sub-catchment outlets. The paper first analyses ISBA-TOP discharge simulations in the French Mediterranean region for target points different from the outlets used for calibration. The sensitivity of the model to its governing factors is examined to highlight the validity of results obtained for ungauged river sections compared with those obtained for the main gauged outlets. The use of improved model inputs is found to be beneficial for sub-catchment simulation. The calibration procedure, however, provides parameter values for the main outlets only, and these choices influence the simulations for ungauged catchments or sub-catchments. As a result, a new version of the ISBA-TOP system without any parameter to calibrate is used to produce diagnostics relevant for quantifying the risk of road submersion. A first diagnostic is the simulated runoff spatial distribution, which provides useful information about areas with a high risk of submersion. An indicator of flood severity is then given by simulated discharges presented with respect to return periods. The latter has to be used together with information about the vulnerability of road-river cross-sections.
Realistic natural atmospheric phenomena and weather effects for interactive virtual environments
NASA Astrophysics Data System (ADS)
McLoughlin, Leigh
Clouds and the weather are important aspects of any natural outdoor scene, but existing dynamic techniques within computer graphics offer only the simplest cloud representations. The problem this work addresses is how to provide a means of simulating clouds and weather features, such as precipitation, that is suitable for virtual environments. Techniques for cloud simulation are available within meteorology, but numerical weather prediction systems are computationally expensive, give more numerical accuracy than is required for graphics, and are restricted to the laws of physics. Within computer graphics, we often need to direct and adjust physical features or to bend reality to meet artistic goals, which is a key difference between computer graphics and physical science. Purely physically-based simulations, however, evolve their solutions according to pre-set rules and are notoriously difficult to control. The challenge, then, is for the solution to be computationally lightweight and able to be directed in some measure while at the same time producing believable results. This work presents a lightweight physically-based cloud simulation scheme that simulates the dynamic properties of cloud formation and weather effects. The system simulates water vapour, cloud water, cloud ice, rain, snow and hail. The water model incorporates control parameters, and the cloud model uses an arbitrary vertical temperature profile, with a tool described to allow the user to define this profile. The result of this work is that clouds can now be simulated in near real-time, complete with precipitation. The temperature profile and tool then provide a means of directing the resulting formation.
NASA Astrophysics Data System (ADS)
Kuznetsova, M. M.; Liu, Y. H.; Rastaetter, L.; Pembroke, A. D.; Chen, L. J.; Hesse, M.; Glocer, A.; Komar, C. M.; Dorelli, J.; Roytershteyn, V.
2016-12-01
The presentation will provide an overview of new tools, services and models implemented at the Community Coordinated Modeling Center (CCMC) to facilitate analysis of MMS dayside results. We will provide updates on the implementation of Particle-in-Cell (PIC) simulations at the CCMC and on opportunities for on-line visualization and analysis of results of PIC simulations of asymmetric magnetic reconnection for different guide fields and boundary conditions. Fields, plasma parameters, and particle distribution moments, as well as particle distribution functions calculated in selected regions of the vicinity of reconnection sites, can be analyzed through the web-based interactive visualization system. In addition, there are options to request distribution functions in user-selected regions of interest and to fly through simulated magnetic reconnection configurations with a map of distributions, to facilitate comparisons with observations. A broad collection of global magnetosphere models hosted at the CCMC provides an opportunity to put MMS observations and local PIC simulations into global context. We recently implemented the RECON-X post-processing tool (Glocer et al., 2016), which allows users to determine the location of the separator surface around closed field lines and between open field lines and solar wind field lines. The tool also finds the separatrix line where the two surfaces touch and the positions of magnetic nulls. The surfaces and the separatrix line can be visualized relative to satellite positions in the dayside magnetosphere using an interactive HTML-5 visualization for each time step processed. To validate global magnetosphere models' capability to simulate the locations of dayside magnetosphere boundaries, we will analyze the proximity of MMS to simulated separatrix locations for a set of MMS diffusion region crossing events.
NASA Astrophysics Data System (ADS)
Croce, Olivier; Hachem, Sabet; Franchisseur, Eric; Marcié, Serge; Gérard, Jean-Pierre; Bordy, Jean-Marc
2012-06-01
This paper presents a dosimetric study of the system named "Papillon 50" used in the department of radiotherapy of the Centre Antoine-Lacassagne, Nice, France. The machine provides a 50 kVp X-ray beam, currently used to treat rectal cancers. The system can be mounted with various applicators of different diameters or shapes. These applicators can be fixed over the main rod tube of the unit in order to deliver the prescribed absorbed dose to the tumor with an optimal distribution. We have analyzed depth dose curves and dose profiles for the bare tube and for a set of three applicators. Dose measurements were made with an ionization chamber (PTW type 23342) and Gafchromic films (EBT2). We have also compared the measurements with simulations performed using the Monte Carlo code PENELOPE. Simulations were performed with a detailed geometrical description of the experimental setup and with sufficient statistics. Results of the simulations are in accordance with the experimental measurements and provide an accurate evaluation of the dose delivered. The depths of the 50% isodose in water for the various applicators are 4.0, 6.0, 6.6 and 7.1 mm. The Monte Carlo PENELOPE simulations are in accordance with the measurements for a 50 kV X-ray system. Simulations are able to confirm the measurements provided by Gafchromic films or ionization chambers. Results also demonstrate that Monte Carlo simulations could be helpful to validate future applicators designed for other localizations such as breast or skin cancers. Furthermore, Monte Carlo simulations could be a reliable alternative for a rapid evaluation of the dose delivered by such a system that uses multiple designs of applicators.
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
Assessment of Chlorophyll-a Algorithms Considering Different Trophic Statuses and Optimal Bands.
Salem, Salem Ibrahim; Higa, Hiroto; Kim, Hyungjun; Kobayashi, Hiroshi; Oki, Kazuo; Oki, Taikan
2017-07-31
Numerous algorithms have been proposed to retrieve chlorophyll-a concentrations in Case 2 waters; however, the retrieval accuracy is far from satisfactory. In this research, seven algorithms are assessed with different band combinations of multispectral and hyperspectral bands using linear (LN), quadratic polynomial (QP) and power (PW) regression approaches, resulting in altogether 43 algorithmic combinations. These algorithms are evaluated by using simulated and measured datasets to understand their strengths and limitations. Two simulated datasets comprising 500,000 reflectance spectra each, both based on wide ranges of inherent optical properties (IOPs), are generated for the calibration and validation stages. Results reveal that the regression approach (i.e., LN, QP, and PW) has more influence on the simulated dataset than on the measured one. The algorithms that incorporated linear regression provide the highest retrieval accuracy for the simulated dataset. Results from the simulated datasets reveal that the 3-band (3b) algorithms that incorporate the 665-nm band, the 680-nm band, and the band tuning selection approach outperformed other algorithms, with root mean square errors (RMSE) of 15.87 mg·m−3, 16.25 mg·m−3, and 19.05 mg·m−3, respectively. The spatial distribution of the best performing algorithms, for various combinations of chlorophyll-a (Chla) and non-algal particle (NAP) concentrations, shows that 3b_tuning_QP and 3b_680_QP outperform other algorithms in terms of minimum RMSE frequency, at 33.19% and 60.52%, respectively. However, the two algorithms failed to accurately retrieve Chla for many combinations of Chla and NAP, particularly for low Chla and NAP concentrations. In addition, the spatial distribution emphasizes that no single algorithm can provide outstanding accuracy for Chla retrieval and that multiple algorithms should be included to reduce the error. Comparing the results of the measured and simulated datasets reveals that the algorithms that incorporate the 665-nm band outperform other algorithms for the measured dataset (RMSE = 36.84 mg·m−3), while algorithms that incorporate the band tuning approach provide the highest retrieval accuracy for the simulated dataset (RMSE = 25.05 mg·m−3).
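For readers unfamiliar with the family of algorithms being compared, the sketch below shows one common red/NIR three-band formulation with a quadratic regression mapping to Chla; the band centres and regression coefficients are illustrative assumptions, not the calibrated values or tuned bands from this study.

```python
import numpy as np

def three_band_index(r_665, r_708, r_753):
    """Three-band reflectance index often used for Chla retrieval in turbid
    Case 2 waters: [R(665)^-1 - R(708)^-1] * R(753).

    The band centres follow one widely used red/NIR formulation and are an
    assumption here; band-tuning variants search for optimal positions."""
    return (1.0 / r_665 - 1.0 / r_708) * r_753

def chla_quadratic(index, a=100.0, b=50.0, c=10.0):
    """Map the index to Chla (mg m^-3) with a quadratic regression
    Chla = a*x^2 + b*x + c; a, b, c would come from calibration data."""
    x = np.asarray(index, float)
    return a * x**2 + b * x + c

if __name__ == "__main__":
    # Hypothetical remote-sensing reflectances (sr^-1) for three pixels.
    r665 = np.array([0.010, 0.020, 0.015])
    r708 = np.array([0.012, 0.021, 0.018])
    r753 = np.array([0.008, 0.015, 0.012])
    print(chla_quadratic(three_band_index(r665, r708, r753)))
```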
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keefer, Donald A.; Shaffer, Eric G.; Storsved, Brynne
A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64 bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoir visualization and analysis needs, including: sophisticated connectivity analysis, cross sections through simulation results between selected wells, simplified volumetric calculations, global vertical exaggeration adjustments, ingestion of UTChem simulation results, ingestion of Isatis geostatistical framework models, interrogation of joint geologic and reservoir modeling results, joint visualization and analysis of well history files, location-targeted visualization, advanced correlation analysis, visualization of flow paths, and creation of static images and animations highlighting targeted reservoir features.
RVA: A Plugin for ParaView 3.14
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-09-04
RVA is a plugin developed for the 64-bit Windows version of the ParaView 3.14 visualization package. RVA is designed to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed onmore » enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools, and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoirs visualization and analysis needs, including: sophisticated connectivity analysis, cross sections through simulation results between selected wells, simplified volumetric calculations, global vertical exaggeration adjustments, ingestion of UTChem simulation results, ingestion of Isatis geostatistical framework models, interrogation of joint geologic and reservoir modeling results, joint visualization and analysis of well history files, location-targeted visualization, advanced correlation analysis, visualization of flow paths, and creation of static images and animations highlighting targeted reservoir features.« less
Bartel, Billie J
2014-08-01
This pilot study explored the use of multidisciplinary high-fidelity simulation and additional pharmacist-focused training methods in training postgraduate year 1 (PGY1) pharmacy residents to provide Advanced Cardiovascular Life Support (ACLS) care. Pharmacy resident confidence and comfort level were assessed after completing these training requirements. The ACLS training requirements for pharmacy residents were revised to include didactic instruction on ACLS pharmacology and rhythm recognition and participation in multidisciplinary high-fidelity simulation ACLS experiences in addition to ACLS provider certification. Surveys were administered to participating residents to assess the impact of this additional education on resident confidence and comfort level in cardiopulmonary arrest situations. The new ACLS didactic and simulation training requirements resulted in increased resident confidence and comfort level in all assessed functions. After completing the simulation scenarios, residents felt more confident than with ACLS certification training and the didactic components alone in all areas except providing recommendations for medication dosing and administration and recognizing rhythms. All residents felt the addition of lectures and simulation experiences better prepared them to function as a pharmacist on the ACLS team. Additional ACLS training requirements for pharmacy residents increased overall awareness of pharmacist roles and responsibilities and greatly improved resident confidence and comfort level in performing most essential pharmacist functions during ACLS situations. © The Author(s) 2013.
Simulator Studies of the Deep Stall
NASA Technical Reports Server (NTRS)
White, Maurice D.; Cooper, George E.
1965-01-01
Simulator studies of the deep-stall problem encountered with modern airplanes are discussed. The results indicate that the basic deep-stall tendencies produced by aerodynamic characteristics are augmented by operational considerations. Because of control difficulties to be anticipated in the deep stall, it is desirable that adequate safeguards be provided against inadvertent penetrations.
FERN - a Java framework for stochastic simulation and evaluation of reaction networks.
Erhard, Florian; Friedel, Caroline C; Zimmer, Ralf
2008-08-29
Stochastic simulation can be used to illustrate the development of biological systems over time and the stochastic nature of these processes. Currently available programs for stochastic simulation, however, are limited in that they either a) do not provide the most efficient simulation algorithms and are difficult to extend, b) cannot be easily integrated into other applications, or c) do not allow the user to monitor and intervene in the simulation process in an easy and intuitive way. Thus, in order to use stochastic simulation in innovative high-level modeling and analysis approaches, more flexible tools are necessary. In this article, we present FERN (Framework for Evaluation of Reaction Networks), a Java framework for the efficient simulation of chemical reaction networks. FERN is subdivided into three layers, for network representation, simulation, and visualization of the simulation results, each of which can be easily extended. It provides efficient and accurate state-of-the-art stochastic simulation algorithms for well-mixed chemical systems and a powerful observer system, which makes it possible to track and control the simulation progress on every level. To illustrate how FERN can be easily integrated into other systems biology applications, plugins to Cytoscape and CellDesigner are included. These plugins make it possible to run simulations and to observe the simulation progress in a reaction network in real-time from within the Cytoscape or CellDesigner environment. FERN addresses shortcomings of currently available stochastic simulation programs in several ways. First, it provides a broad range of efficient and accurate algorithms both for exact and approximate stochastic simulation and a simple interface for extending to new algorithms. FERN's implementations are considerably faster than the C implementations of gillespie2 or the Java implementations of ISBJava. Second, it can be used in a straightforward way both as a stand-alone program and within new systems biology applications. Finally, complex scenarios requiring intervention during the simulation progress can be modelled easily with FERN.
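For readers unfamiliar with the underlying algorithm family, here is a minimal Python sketch of the exact Gillespie direct method, the kind of algorithm FERN implements (this is not FERN's Java API), for a two-reaction network.

```python
# Minimal Gillespie direct-method SSA for A -> B (rate k1) and B -> A (rate k2).
# Illustrative sketch of the exact stochastic simulation algorithm; not FERN code.
import math
import random

def gillespie(x_a=100, x_b=0, k1=1.0, k2=0.5, t_end=10.0, seed=1):
    random.seed(seed)
    t, trajectory = 0.0, [(0.0, x_a, x_b)]
    while t < t_end:
        a1, a2 = k1 * x_a, k2 * x_b               # reaction propensities
        a0 = a1 + a2
        if a0 == 0.0:
            break
        t += -math.log(1.0 - random.random()) / a0  # exponential waiting time
        if random.random() * a0 < a1:               # choose which reaction fires
            x_a, x_b = x_a - 1, x_b + 1
        else:
            x_a, x_b = x_a + 1, x_b - 1
        trajectory.append((t, x_a, x_b))
    return trajectory

print(gillespie()[-1])  # final (time, A, B) of one stochastic realization
```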
Medley, John B
2016-05-01
One of the most important mandates of physical joint simulators is to provide test results that allow the implant manufacturer to anticipate and perhaps avoid clinical wear problems with their new products. This is best done before market release. This study gives four steps to follow in conducting such wear simulator testing. Two major examples involving hip wear simulators are discussed in which attempts had been made to predict clinical wear performance prior to market release. The second one, involving the DePuy ASR implant systems, is chosen for more extensive treatment by making it an illustrative example to explore whether wear simulator testing can anticipate clinical wear problems. It is concluded that hip wear simulator testing did provide data in the academic literature that indicated some risk of clinical wear problems prior to market release of the ASR implant systems. This supports the idea that physical joint simulators have an important role in the pre-market testing of new joint replacement implants. © IMechE 2016.
Simulations of DNA stretching by flow field in microchannels with complex geometry.
Huang, Chiou-De; Kang, Dun-Yen; Hsieh, Chih-Chen
2014-01-01
Recently, we reported the experimental results of DNA stretching by flow field in three microchannels (C. H. Lee and C. C. Hsieh, Biomicrofluidics 7(1), 014109 (2013)) designed specifically for the purpose of preconditioning DNA conformation for easier stretching. The experimental results not only demonstrate the superiority of the new devices but also provide detailed observations of DNA behavior in complex flow fields that were not available before. In this study, we use the Brownian dynamics-finite element method (BD-FEM) to simulate DNA behavior in these microchannels and compare the results against the experiments. Although the hydrodynamic interaction (HI) between DNA segments and between DNA and the device boundaries was not included in the simulations, the simulation results are in fairly good agreement with the experimental data, both in terms of single-molecule behavior and in terms of ensemble-averaged properties. The discrepancy between the simulation and the experimental results can be explained by the neglect of the HI effect in the simulations. Considering the large computational savings from neglecting HI, we conclude that BD-FEM can be used as an efficient and economical design tool for developing new microfluidic devices for DNA manipulation.
Dependability analysis of parallel systems using a simulation-based approach. M.S. Thesis
NASA Technical Reports Server (NTRS)
Sawyer, Darren Charles
1994-01-01
The analysis of dependability in large, complex, parallel systems executing real applications or workloads is examined in this thesis. To effectively demonstrate the wide range of dependability problems that can be analyzed through simulation, the analysis of three case studies is presented. For each case, the organization of the simulation model used is outlined, and the results from simulated fault injection experiments are explained, showing the usefulness of this method in dependability modeling of large parallel systems. The simulation models are constructed using DEPEND and C++. Where possible, methods to increase dependability are derived from the experimental results. Another interesting facet of all three cases is the presence of some kind of workload or application executing in the simulation while faults are injected. This provides a completely new dimension to this type of study, one not possible to model accurately with analytical approaches.
OSCAR a Matlab based optical FFT code
NASA Astrophysics Data System (ADS)
Degallaix, Jérôme
2010-05-01
Optical simulation software packages are essential tools for designing and commissioning laser interferometers. This article aims to introduce OSCAR, a Matlab-based FFT code, to the experimentalist community. OSCAR (Optical Simulation Containing Ansys Results) is used to simulate the steady-state electric fields in optical cavities with realistic mirrors. The main advantage of OSCAR over other similar packages is the simplicity of its code, which requires only a short time to master. As a result, even for a beginner, it is relatively easy to modify OSCAR to suit other specific purposes. OSCAR includes an extensive manual and numerous detailed examples such as simulating thermal aberration, calculating cavity eigenmodes and diffraction loss, simulating flat-beam cavities and three-mirror ring cavities. An example is also provided showing how to run OSCAR on the GPU of modern graphics cards instead of the CPU, making the simulation up to 20 times faster.
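The core operation of an FFT cavity code such as OSCAR is free-space propagation of a sampled field by the angular-spectrum method; the sketch below shows that single step in Python rather than OSCAR's Matlab, with an assumed grid, wavelength, and beam waist, and is not OSCAR itself.

```python
# Angular-spectrum propagation of a Gaussian beam over a distance L.
# This is the basic FFT step underlying cavity codes like OSCAR; the grid size,
# wavelength, waist, and distance below are arbitrary assumptions.
import numpy as np

n, width = 256, 0.10                        # grid points and physical grid width [m]
wavelength, w0, L = 1064e-9, 0.01, 100.0    # wavelength [m], beam waist [m], distance [m]

x = np.linspace(-width / 2, width / 2, n)
X, Y = np.meshgrid(x, x)
field = np.exp(-(X**2 + Y**2) / w0**2)      # input Gaussian amplitude

k = 2 * np.pi / wavelength
fx = np.fft.fftfreq(n, d=width / n)         # spatial frequencies [1/m]
FX, FY = np.meshgrid(fx, fx)
kz2 = k**2 - (2 * np.pi * FX)**2 - (2 * np.pi * FY)**2
kz = np.sqrt(np.maximum(kz2, 0.0))
H = np.where(kz2 > 0, np.exp(1j * kz * L), 0.0)   # drop evanescent components

propagated = np.fft.ifft2(np.fft.fft2(field) * H)
print("peak intensity after propagation:", np.abs(propagated).max() ** 2)
```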
Design of teleoperation system with a force-reflecting real-time simulator
NASA Technical Reports Server (NTRS)
Hirata, Mitsunori; Sato, Yuichi; Nagashima, Fumio; Maruyama, Tsugito
1994-01-01
We developed a force-reflecting teleoperation system that uses a real-time graphic simulator. This system eliminates the effects of communication time delays in remote robot manipulation. The simulator provides the operator with a predictive display and feedback of computed contact forces through a six-degree-of-freedom (6-DOF) master arm on a real-time basis. With this system, peg-in-hole tasks involving round-trip communication time delays of up to a few seconds were performed at three support levels: a real image alone, a predictive display with a real image, and a real-time graphic simulator with computed-contact-force reflection and a predictive display. The experimental results indicate that the best teleoperation efficiency was achieved by using the force-reflecting simulator with two images. The shortest work time, lowest maximum sensor reading, and a 100 percent success rate were obtained. These results demonstrate the effectiveness of force-reflecting simulation in improving teleoperation efficiency.
NASA Astrophysics Data System (ADS)
Venkataraman, Ajey; Shade, Paul A.; Adebisi, R.; Sathish, S.; Pilchak, Adam L.; Viswanathan, G. Babu; Brandes, Matt C.; Mills, Michael J.; Sangid, Michael D.
2017-05-01
Ti-7Al is a good model material for mimicking the α phase response of the near-α and α+β phases of many widely used titanium-based engineering alloys, including Ti-6Al-4V. In this study, three model structures of Ti-7Al are investigated using atomistic simulations by varying the Ti and Al atom positions within the crystalline lattice. These atomic arrangements are based on transmission electron microscopy observations of short-range order. The elastic constants of the three model structures considered are calculated using molecular dynamics simulations. Resonant ultrasound spectroscopy experiments are conducted to obtain the elastic constants at room temperature, and good agreement is found between the simulation and experimental results, providing confidence that the model structures are reasonable. Additionally, energy barriers for crystalline slip are established for these structures by calculating the γ-surfaces for different slip systems. Finally, the positions of Al atoms with regard to solid solution strengthening are studied using density functional theory simulations, which demonstrate a higher energy barrier for slip when the Al solute atom is closer to (or at) the fault plane. These results provide quantitative insights into the deformation mechanisms of this alloy.
Discrete event simulation: the preferred technique for health economic evaluations?
Caro, Jaime J; Möller, Jörgen; Getsios, Denis
2010-12-01
To argue that discrete event simulation should be preferred to cohort Markov models for economic evaluations in health care. The basis for the modeling techniques is reviewed. For many health-care decisions, existing data are insufficient to fully inform them, necessitating the use of modeling to estimate the consequences that are relevant to decision-makers. These models must reflect what is known about the problem at a level of detail sufficient to inform the questions. Oversimplification will result in estimates that are not only inaccurate, but potentially misleading. Markov cohort models, though currently popular, have so many limitations and inherent assumptions that they are inadequate to inform most health-care decisions. An event-based individual simulation offers an alternative much better suited to the problem. A properly designed discrete event simulation provides more accurate, relevant estimates without being computationally prohibitive. It does require more data and may be a challenge to convey transparently, but these are necessary trade-offs to provide meaningful and valid results. In our opinion, discrete event simulation should be the preferred technique for health economic evaluations today. © 2010, International Society for Pharmacoeconomics and Outcomes Research (ISPOR).
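A hedged toy example of what distinguishes an event-based individual simulation from a fixed-cycle cohort model: each simulated patient experiences events at continuous, individually sampled times drawn from an event queue. The two-event disease model and all rates below are invented for illustration only.

```python
# Toy discrete event simulation of individual patients: events occur at continuous,
# individually sampled times rather than at fixed Markov cycles.
# The two-state model and all rates are illustrative assumptions.
import heapq
import random

random.seed(42)
N_PATIENTS, HORIZON = 1000, 10.0       # patients, years
EVENT_RATE, DEATH_RATE = 0.3, 0.05     # events per patient-year (assumed)

events = []                            # priority queue of (time, patient_id, kind)
for pid in range(N_PATIENTS):
    heapq.heappush(events, (random.expovariate(EVENT_RATE), pid, "complication"))
    heapq.heappush(events, (random.expovariate(DEATH_RATE), pid, "death"))

dead, complications = set(), 0
while events:
    t, pid, kind = heapq.heappop(events)
    if t > HORIZON or pid in dead:
        continue
    if kind == "death":
        dead.add(pid)
    else:
        complications += 1
        # the next complication for the same patient is rescheduled
        heapq.heappush(events, (t + random.expovariate(EVENT_RATE), pid, "complication"))

print(f"complications per patient over {HORIZON} years: {complications / N_PATIENTS:.2f}")
```

Because times are continuous and per-patient, attributes such as individual risk modifiers can be attached to each event without multiplying Markov states.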
NASA Technical Reports Server (NTRS)
Frank, Andreas O.; Twombly, I. Alexander; Barth, Timothy J.; Smith, Jeffrey D.; Dalton, Bonnie P. (Technical Monitor)
2001-01-01
We have applied the linear elastic finite element method to compute haptic force feedback and domain deformations of soft tissue models for use in virtual reality simulators. Our results show that, for virtual object models of high-resolution 3D data (>10,000 nodes), haptic real time computations (>500 Hz) are not currently possible using traditional methods. Current research efforts are focused in the following areas: 1) efficient implementation of fully adaptive multi-resolution methods and 2) multi-resolution methods with specialized basis functions to capture the singularity at the haptic interface (point loading). To achieve real time computations, we propose parallel processing of a Jacobi preconditioned conjugate gradient method applied to a reduced system of equations resulting from surface domain decomposition. This can effectively be achieved using reconfigurable computing systems such as field programmable gate arrays (FPGA), thereby providing a flexible solution that allows for new FPGA implementations as improved algorithms become available. The resulting soft tissue simulation system would meet NASA Virtual Glovebox requirements and, at the same time, provide a generalized simulation engine for any immersive environment application, such as biomedical/surgical procedures or interactive scientific applications.
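A minimal sketch of the Jacobi-preconditioned conjugate gradient iteration proposed above, applied to a generic symmetric positive definite system; it is not the authors' FEM or FPGA implementation.

```python
# Jacobi-preconditioned conjugate gradient for a symmetric positive definite system A x = b.
# Generic sketch of the solver strategy described above, not the actual soft-tissue code.
import numpy as np

def jacobi_pcg(A, b, tol=1e-8, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x
    M_inv = 1.0 / np.diag(A)              # Jacobi preconditioner: inverse of the diagonal
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small SPD test problem (1D Laplacian-like matrix)
n = 50
A = np.diag(np.full(n, 4.0)) + np.diag(np.full(n - 1, -1.0), 1) + np.diag(np.full(n - 1, -1.0), -1)
b = np.ones(n)
x = jacobi_pcg(A, b)
print("residual norm:", np.linalg.norm(b - A @ x))
```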
Rigorous analysis of an electric-field-driven liquid crystal lens for 3D displays
NASA Astrophysics Data System (ADS)
Kim, Bong-Sik; Lee, Seung-Chul; Park, Woo-Sang
2014-08-01
We numerically analyzed the optical performance of an electric-field-driven liquid crystal (ELC) lens adopted for 3-dimensional liquid crystal displays (3D-LCDs) through rigorous ray tracing. For the calculation, we first obtain the director distribution profile of the liquid crystals by using the Ericksen-Leslie equation of motion; then, we calculate the transmission of light through the ELC lens by using the extended Jones matrix method. The simulation was carried out for a 9-view 3D-LCD with a diagonal of 17.1 inches, where the ELC lens was slanted to achieve natural stereoscopic images. The results show that each view exists separately according to the viewing position at an optimum viewing distance of 80 cm. In addition, our simulation results provide a quantitative explanation for the ghost or blurred images between views observed from a 3D-LCD with an ELC lens. The numerical simulations are also shown to be in good agreement with the experimental results. The present simulation method is expected to provide optimum design conditions for obtaining natural 3D images by rigorously analyzing the optical functionalities of an ELC lens.
Booth, Jonathan; Vazquez, Saulo; Martinez-Nunez, Emilio; Marks, Alison; Rodgers, Jeff; Glowacki, David R; Shalashilin, Dmitrii V
2014-08-06
In this paper, we briefly review the boxed molecular dynamics (BXD) method, which allows analysis of thermodynamics and kinetics in complicated molecular systems. BXD is a multiscale technique, in which thermodynamics and long-time dynamics are recovered from a set of short-time simulations. In this paper, we review previous applications of BXD to peptide cyclization, solution-phase organic reaction dynamics and desorption of ions from self-assembled monolayers (SAMs). We also report preliminary results of simulations of diamond etching mechanisms and protein unfolding in atomic force microscopy experiments. The latter demonstrate a correlation between the protein's structural motifs and its potential of mean force. Simulation of these processes by standard molecular dynamics (MD) is typically not possible, because the experimental time scales are very long. However, BXD yields well-converged and physically meaningful results. Compared with other methods of accelerated MD, our BXD approach is very simple; it is easy to implement, and it provides an integrated approach for simultaneously obtaining both thermodynamics and kinetics. It also provides a strategy for obtaining statistically meaningful dynamical results in regions of configuration space that standard MD approaches would visit only very rarely.
O'Clock, George D
2016-08-01
Cellular engineering involves modification and control of cell properties, and requires an understanding of fundamentals and mechanisms of action for cellular derived product development. One of the keys to success in cellular engineering involves the quality and validity of results obtained from cell chemical signaling pathway assays. The accuracy of the assay data cannot be verified or assured if the effect of positive feedback, nonlinearities, and interrelationships between cell chemical signaling pathway elements are not understood, modeled, and simulated. Nonlinearities and positive feedback in the cell chemical signaling pathway can produce significant aberrations in assay data collection. Simulating the pathway can reveal potential instability problems that will affect assay results. A simulation, using an electrical analog for the coupled differential equations representing each segment of the pathway, provides an excellent tool for assay validation purposes. With this approach, voltages represent pathway enzyme concentrations and operational amplifier feedback resistance and input resistance values determine pathway gain and rate constants. The understanding provided by pathway modeling and simulation is strategically important in order to establish experimental controls for assay protocol structure, time frames specified between assays, and assay concentration variation limits; to ensure accuracy and reproducibility of results.
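A hedged sketch of the kind of coupled-equation model described: a two-stage cascade with positive feedback integrated by forward Euler, where the gains and degradation rate play the role that op-amp feedback and input resistances set in the electrical analog. All parameter values are invented for illustration.

```python
# Two-stage signaling cascade with positive feedback from the output back to the input,
# integrated with forward Euler. Gains g1, g2 and the feedback strength correspond to the
# roles played by op-amp resistor ratios in the electrical analog; all values are
# illustrative assumptions, not measured pathway parameters.
g1, g2, k_deg, feedback = 2.0, 1.5, 1.0, 0.6   # gains, degradation rate, feedback strength
stimulus = 1.0
dt, steps = 0.01, 5000

e1, e2 = 0.0, 0.0                              # "enzyme concentrations" (analog voltages)
for _ in range(steps):
    de1 = g1 * (stimulus + feedback * e2) - k_deg * e1
    de2 = g2 * e1 - k_deg * e2
    e1 += dt * de1
    e2 += dt * de2

# For this linear loop, the fixed point is stable only if g1 * g2 * feedback < k_deg**2;
# otherwise the positive feedback drives unbounded growth, the kind of aberration that
# assay protocols need to control for.
print("final output:", e2, "stable:", g1 * g2 * feedback < k_deg ** 2)
```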
Schoenthaler, Antoinette; Albright, Glenn; Hibbard, Judith; Goldman, Ron
2017-04-19
Despite clear evidence that antibiotics do not cure viral infections, the problem of unnecessary prescribing of antibiotics in ambulatory care persists, and in some cases, prescribing patterns have increased. The overuse of antibiotics for treating viral infections has created numerous economic and clinical consequences including increased medical costs due to unnecessary hospitalizations, antibiotic resistance, disruption of gut bacteria, and obesity. Recent research has underscored the importance of collaborative patient-provider communication as a means to reduce the high rates of unnecessary prescriptions for antibiotics. However, most patients and providers do not feel prepared to engage in such challenging conversations. The aim of this pilot study was to assess the ability of a brief 15-min simulated role-play conversation with virtual humans to serve as a preliminary step to help health care providers and patients practice, and learn how to engage in, effective conversations about antibiotics overuse. A total of 69 participants (35 providers and 34 patients) completed the simulation once in one sitting. A pre-post repeated measures design was used to assess changes in patients' and providers' self-reported communication behaviors, activation, and preparedness, intention, and confidence to effectively communicate in the patient-provider encounter. Changes in patients' knowledge and beliefs regarding antibiotic use were also evaluated. Patients experienced a short-term positive improvement in beliefs about appropriate antibiotic use for infection (F1,30=14.10, P=.001). Knowledge scores regarding the correct uses of antibiotics improved immediately postsimulation, but decreased at the 1-month follow-up (F1,30=31.16, P<.001). There was no change in patient activation and shared decision-making (SDM) scores in the total sample of patients (P>.10). Patients with lower levels of activation exhibited positive, short-term benefits in increased intent and confidence to discuss their needs and ask questions in the clinic visit, positive attitudes regarding participation in SDM with their provider, and accurate beliefs about the use of antibiotics (P<.10). The results also suggest small immediate gains in providers' attitudes about SDM (mean change 0.20; F1,33=8.03, P=.01). This pilot study provided preliminary evidence on the efficacy of the use of simulated conversations with virtual humans as a tool to improve patient-provider communication (ie, through increasing patient confidence to actively participate in the visit and physician attitudes about SDM) for engaging in conversations about antibiotic use. Future research should explore whether repeated opportunities to use the 15-min simulation, as well as providing users with several different conversations to practice with, would result in sustained improvements in antibiotics beliefs and knowledge and communication behaviors over time. The results of this pilot study offered several opportunities to improve on the simulation in order to bolster communication skills and knowledge retention. ©Antoinette Schoenthaler, Glenn Albright, Judith Hibbard, Ron Goldman. Originally published in JMIR Medical Education (http://mededu.jmir.org), 19.04.2017.
Chang, Kuei-Hu; Chang, Yung-Chia; Chain, Kai; Chung, Hsiang-Yu
2016-01-01
The advancement of high technologies and the arrival of the information age have caused changes to modern warfare. The military forces of many countries have partially replaced real training drills with training simulation systems to achieve combat readiness. However, a considerable variety of training simulation systems is used in military settings. In addition, differences in system set-up time, functions, the environment, and the competency of system operators, as well as incomplete information, have made it difficult to evaluate the performance of training simulation systems. To address the aforementioned problems, this study integrated the analytic hierarchy process, soft set theory, and the fuzzy linguistic representation model to evaluate the performance of various training simulation systems. Furthermore, importance–performance analysis was adopted to examine the influence of cost savings and training safety of training simulation systems. The findings of this study are expected to facilitate the application of military training simulation systems, avoid the waste of resources (e.g., low utility and idle time), and provide data for subsequent applications and analysis. To verify the method proposed in this study, numerical examples of the performance evaluation of training simulation systems were adopted and compared with the numerical results of an AHP and a novel AHP-based ranking technique. The results verified that not only could expert-provided questionnaire information be fully considered to lower the repetition rate of performance ranking, but a two-dimensional graph could also be used to help administrators allocate limited resources, thereby enhancing the investment benefits and training effectiveness of a training simulation system. PMID:27598390
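A minimal sketch of the analytic hierarchy process step this kind of evaluation builds on: deriving criterion weights from a reciprocal pairwise comparison matrix via its principal eigenvector, with a consistency check. The 3x3 judgements below are invented, not those of the study.

```python
# AHP weight derivation: principal eigenvector of a reciprocal pairwise comparison matrix,
# plus the consistency ratio check. The comparison values are illustrative only.
import numpy as np

# criteria (example): set-up time, functionality, training safety
A = np.array([[1.0,   3.0, 0.5 ],
              [1/3.0, 1.0, 0.25],
              [2.0,   4.0, 1.0 ]])

eigvals, eigvecs = np.linalg.eig(A)
idx = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, idx].real)
weights /= weights.sum()                      # normalized priority weights

lam_max = eigvals[idx].real
n = A.shape[0]
ci = (lam_max - n) / (n - 1)                  # consistency index
cr = ci / 0.58                                # Saaty random index for n = 3 is 0.58
print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```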
Mass imbalances in EPANET water-quality simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Michael J.; Janke, Robert; Taxon, Thomas N.
EPANET is widely employed to simulate water quality in water distribution systems. However, the time-driven simulation approach used to determine concentrations of water-quality constituents provides accurate results, in general, only for small water-quality time steps; use of an adequately short time step may not be feasible. Overly long time steps can yield errors in concentrations and result in situations in which constituent mass is not conserved. Mass may not be conserved even when EPANET gives no errors or warnings. This paper explains how such imbalances can occur and provides examples of such cases; it also presents a preliminary event-driven approach that conserves mass with a water-quality time step that is as long as the hydraulic time step. Results obtained using the current approach converge, or tend to converge, to those obtained using the new approach as the water-quality time step decreases. Improving the water-quality routing algorithm used in EPANET could eliminate mass imbalances and related errors in estimated concentrations.
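A hedged toy illustration of the imbalance mechanism (not EPANET's routing code): a pulse travels through a single plug-flow pipe, and outlet mass is tallied by sampling the outlet concentration every water-quality time step. A step longer than the pulse duration badly mis-counts the recovered mass.

```python
# Toy plug-flow pipe with travel time TAU: a 30 s constituent pulse is injected at the
# inlet, and outlet mass is tallied by sampling the outlet concentration every dt seconds,
# as a time-driven scheme does. This illustrates the imbalance mechanism only; it is not
# EPANET's algorithm.
Q, TAU = 0.01, 100.0                 # flow [m^3/s], pipe travel time [s]
C0, PULSE = 50.0, 30.0               # pulse concentration [g/m^3] and duration [s]
T_END = 400.0

def inlet_conc(t):
    return C0 if 0.0 <= t < PULSE else 0.0

def outlet_mass_time_driven(dt):
    mass, t = 0.0, 0.0
    while t <= T_END:
        c_out = inlet_conc(t - TAU)  # plug flow: the outlet sees the inlet signal delayed by TAU
        mass += c_out * Q * dt
        t += dt
    return mass

injected = C0 * Q * PULSE
print("injected mass [g]:        ", injected)
print("recovered, dt = 5 s [g]:  ", outlet_mass_time_driven(5.0))
print("recovered, dt = 120 s [g]:", outlet_mass_time_driven(120.0))
```

With the 5 s step the tally matches the injected 15 g; with the 120 s step the single sample that happens to land inside the pulse is weighted by the full step and the tally quadruples, even though no warning would be raised.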
Derivation and Applicability of Asymptotic Results for Multiple Subtests Person-Fit Statistics
Albers, Casper J.; Meijer, Rob R.; Tendeiro, Jorge N.
2016-01-01
In high-stakes testing, it is important to check the validity of individual test scores. Although a test may, in general, result in valid test scores for most test takers, for some test takers, test scores may not provide a good description of a test taker's proficiency level. Person-fit statistics have been proposed to check the validity of individual test scores. In this study, the theoretical asymptotic sampling distribution of two person-fit statistics that can be used for tests that consist of multiple subtests is first discussed. Second, a simulation study was conducted to investigate the applicability of this asymptotic theory to tests of finite length, in which the correlation between subtests and the number of items in the subtests were varied. The authors showed that these distributions provide reasonable approximations, even for tests consisting of subtests of only 10 items each. These results have practical value because researchers do not have to rely on extensive simulation studies to simulate sampling distributions. PMID:29881053
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moskvin, V; Pirlepesov, F; Tsiamas, P
Purpose: This study provides an overview of the design and commissioning of the Monte Carlo (MC) model of the spot-scanning proton therapy nozzle and its implementation for patient plan simulation. Methods: The Hitachi PROBEAT V scanning nozzle was simulated based on vendor specifications using the TOPAS extension of the Geant4 code. FLUKA MC simulation was also utilized to provide supporting data for the main simulation. Validation of the MC model was performed using vendor-provided data and measurements collected during acceptance/commissioning of the proton therapy machine. Actual patient plans using CT-based treatment geometry were simulated and compared to the dose distributions produced by the treatment planning system (Varian Eclipse 13.6), and patient quality assurance measurements. In-house MATLAB scripts are used for converting DICOM data into TOPAS input files. Results: Comparative analysis of integrated depth doses (IDDs), therapeutic ranges (R90), and spot shapes/sizes at different distances from the isocenter indicates good agreement between MC and measurements. R90 agreement is within 0.15 mm across all energy tunes. IDD and spot shape/size differences are within the statistical error of the simulation (less than 1.5%). The MC simulated data, validated with physical measurements, were used for the commissioning of the treatment planning system. Patient geometry simulations were conducted based on the Eclipse-produced DICOM plans. Conclusion: The treatment nozzle and standard option beam model were implemented in the TOPAS framework to simulate a highly conformal discrete spot-scanning proton beam system.
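A hedged sketch of one of the commissioning comparisons mentioned, extracting the therapeutic range R90 (the distal depth at which dose falls to 90% of the peak value) from a sampled depth-dose curve by linear interpolation; the analytic curve below is a synthetic stand-in, not measured PROBEAT or TOPAS data.

```python
# Extract R90 (distal depth where dose drops to 90% of the peak maximum) from a sampled
# depth-dose curve by linear interpolation. The analytic curve is a stand-in for a measured
# or simulated integrated depth dose.
import numpy as np

depth = np.linspace(0.0, 20.0, 401)                           # cm
dose = 0.3 * (depth / 15.0) ** 2 + np.exp(-((depth - 15.0) ** 2) / 0.8)

i_max = np.argmax(dose)
target = 0.9 * dose[i_max]

# walk distally (beyond the peak) to find the first sample below 90%
distal = np.where(dose[i_max:] < target)[0][0] + i_max
d1, d2 = depth[distal - 1], depth[distal]
f1, f2 = dose[distal - 1], dose[distal]
r90 = d1 + (target - f1) * (d2 - d1) / (f2 - f1)
print(f"R90 = {r90:.2f} cm")
```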
Rahnamoun, A; van Duin, A C T
2014-04-17
Atomic oxygen (AO) is the most abundant species in low Earth orbit (LEO). It is the result of the dissociation of molecular oxygen by ultraviolet radiation from the sun. In LEO, it collides with the materials used on spacecraft surfaces and causes degradation of these materials. The degradation of materials on the surfaces of spacecraft in LEO has been a significant problem for a long time. Kapton polyimide, polyhedral oligomeric silsesquioxane (POSS), silica, and Teflon are materials extensively used in the spacecraft industry, and, as with many other such materials, degradation by AO collisions is an important issue in their application on spacecraft. To investigate the surface chemistry of these materials under exposure to space AO, a computational chemical evaluation of Kapton polyimide, POSS, amorphous silica, and Teflon was performed in separate simulations under similar conditions. For these simulations, the ReaxFF reactive force-field program was used, which provides the computational speed required to perform molecular dynamics (MD) simulations on system sizes sufficiently large to describe the full chemistry of the reactions. Using these simulations, the effects of AO impact on the different materials and the roles of impact energy, material composition, and material temperature in the behavior of the materials are studied. The ReaxFF results indicate that Kapton is less resistant than Teflon toward AO damage. These results are in good agreement with experiment. The simulations indicate that amorphous silica shows the highest stability among these materials before the start of the highly exothermic silicon oxidation. We have verified that adding silicon to the bulk of the Kapton structure enhances the stability of the Kapton against AO impact. Our canonical MD simulations demonstrate that an increase in heat transfer in the material during AO impact can provide a considerable decrease in the disintegration of the material. This effect is especially relevant in silica AO collisions. Considerable experimental efforts have been undertaken to minimize such AO-based degradation. As our simulations demonstrate, ReaxFF can provide a cost-effective screening tool for future material optimization.
Performance optimization and validation of ADM1 simulations under anaerobic thermophilic conditions.
Atallah, Nabil M; El-Fadel, Mutasem; Ghanimeh, Sophia; Saikaly, Pascal; Abou-Najm, Majdi
2014-12-01
In this study, two experimental sets of data, each involving two thermophilic anaerobic digesters treating food waste, were simulated using the Anaerobic Digestion Model No. 1 (ADM1). A sensitivity analysis was conducted, using both data sets of one digester, for parameter optimization based on five measured performance indicators (methane generation, pH, acetate, total COD, and ammonia) and on an equally weighted combination of the five indicators. The simulation results revealed that while optimization with respect to methane alone, a commonly adopted approach, succeeded in simulating the methane experimental results, it predicted other intermediary outputs less accurately. On the other hand, the multi-objective optimization has the advantage of providing better overall results than methane-only optimization, even where it does not capture every intermediary output exactly. The results from the parameter optimization were validated through their independent application to the data sets of the second digester. Copyright © 2014 Elsevier Ltd. All rights reserved.
Atomistic simulations of highly conductive molecular transport junctions under realistic conditions
NASA Astrophysics Data System (ADS)
French, William R.; Iacovella, Christopher R.; Rungger, Ivan; Souza, Amaury Melo; Sanvito, Stefano; Cummings, Peter T.
2013-04-01
We report state-of-the-art atomistic simulations combined with high-fidelity conductance calculations to probe structure-conductance relationships in Au-benzenedithiolate (BDT)-Au junctions under elongation. Our results demonstrate that large increases in conductance are associated with the formation of monatomic chains (MACs) of Au atoms directly connected to BDT. An analysis of the electronic structure of the simulated junctions reveals that enhancement in the s-like states in Au MACs causes the increases in conductance. Other structures also result in increased conductance but are too short-lived to be detected in experiment, while MACs remain stable for long simulation times. Examinations of thermally evolved junctions with and without MACs show negligible overlap between conductance histograms, indicating that the increase in conductance is related to this unique structural change and not thermal fluctuation. These results, which provide an excellent explanation for a recently observed anomalous experimental result [Bruot et al., Nat. Nanotechnol., 2012, 7, 35-40], should aid in the development of mechanically responsive molecular electronic devices. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr00459g
A Comparison of Compressed Sensing and Sparse Recovery Algorithms Applied to Simulation Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, Ya Ju; Kamath, Chandrika
2016-09-01
The move toward exascale computing for scientific simulations is placing new demands on compression techniques. It is expected that the I/O system will not be able to support the volume of data that is expected to be written out. To enable quantitative analysis and scientific discovery, we are interested in techniques that compress high-dimensional simulation data and can provide perfect or near-perfect reconstruction. In this paper, we explore the use of compressed sensing (CS) techniques to reduce the size of the data before they are written out. Using large-scale simulation data, we investigate how the sufficient sparsity condition and the contrast in the data affect the quality of reconstruction and the degree of compression. Also, we provide suggestions for the practical implementation of CS techniques and compare them with other sparse recovery methods. Finally, our results show that despite longer times for reconstruction, compressed sensing techniques can provide near perfect reconstruction over a range of data with varying sparsity.
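A minimal sketch of one member of the sparse-recovery family compared in the paper, orthogonal matching pursuit, recovering a synthetic sparse vector from random measurements; the dimensions and sparsity level are arbitrary and this is not the paper's implementation.

```python
# Orthogonal matching pursuit (OMP): recover a k-sparse vector x from m < n random
# measurements y = A x. Sizes and sparsity are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(3)
n, m, k = 256, 80, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)         # random sensing matrix
x_true = np.zeros(n)
support = rng.choice(n, k, replace=False)
x_true[support] = rng.standard_normal(k)
y = A @ x_true

residual, chosen = y.copy(), []
for _ in range(k):
    j = int(np.argmax(np.abs(A.T @ residual)))       # column most correlated with residual
    chosen.append(j)
    sub = A[:, chosen]
    coef, *_ = np.linalg.lstsq(sub, y, rcond=None)   # least squares on the chosen support
    residual = y - sub @ coef

x_hat = np.zeros(n)
x_hat[chosen] = coef
print("relative reconstruction error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```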
Stahnke, Amanda M.; Behnen, Erin M.
2015-01-01
Objective. To assess the impact of a 6-week patient/provider interaction simulation on empathy and self-efficacy levels of diabetes management skills in third-year pharmacy students. Design. Pharmacy students enrolled in a diabetes elective course were paired to act as a patient with diabetes or as a provider assisting in the management of that patient during a 6-week simulation activity. After 3 weeks, students switched roles. The simulation was designed with activities to build empathy. Assessment. The Jefferson Scale of Empathy (JSE) and a self-efficacy survey were administered to assess change in empathy and confidence levels from baseline to the end of the activity. Completion of the activity resulted in significant improvement in total JSE scores. Additionally, significant improvements in overall self-efficacy scores regarding diabetes management were noted. Conclusion. The 6-week patient/provider interaction simulation improved empathy and self-efficacy levels in third-year pharmacy students. PMID:25995517
Training students to detect delirium: An interprofessional pilot study.
Chambers, Breah; Meyer, Mary; Peterson, Moya
2018-06-01
The purpose of this paper is to report nursing student knowledge acquisition and attitudes after completing an interprofessional simulation with medical students. The IOM has challenged healthcare educators to teach teamwork and communication skills in interprofessional settings. Interprofessional simulation provides a higher-fidelity experience than simulation in silos. Simulation may be particularly useful in helping healthcare workers gain the necessary skills to care for psychiatric clients. Specifically, healthcare providers have difficulty differentiating between dementia and delirium. Recognizing this deficit, an interprofessional simulation was created using medical students in their neurology rotation and senior nursing students. Twenty-four volunteer nursing students completed a pre-survey to assess delirium knowledge and then completed an education module about delirium. Twelve of these students participated in a simulation with medical students. Pre- and post-simulation KidSIM Attitude questionnaires were completed by all students participating in the simulation. After the simulations were complete, all twenty-four students were asked to complete the post-survey regarding delirium knowledge. While delirium knowledge scores improved in both groups, the simulation group scored higher, but the difference did not reach significance. The simulation group demonstrated a statistically significant improvement in attitudes toward simulation, interprofessional education, and teamwork post simulation compared to their pre-simulation scores. Nursing students who participated in an interprofessional simulation developed a heightened appreciation for learning communication, teamwork, situational awareness, and interprofessional roles and responsibilities. These results support the use of interprofessional simulation in healthcare education. Copyright © 2018 Elsevier Ltd. All rights reserved.
Knowledge-based simulation using object-oriented programming
NASA Technical Reports Server (NTRS)
Sidoran, Karen M.
1993-01-01
Simulations have become a powerful mechanism for understanding and modeling complex phenomena. Their results have had substantial impact on a broad range of decisions in the military, government, and industry. Because of this, new techniques are continually being explored and developed to make them even more useful, understandable, extendable, and efficient. One such area of research is the application of the knowledge-based methods of artificial intelligence (AI) to the computer simulation field. The goal of knowledge-based simulation is to facilitate building simulations of greatly increased power and comprehensibility by making use of deeper knowledge about the behavior of the simulated world. One technique for representing and manipulating knowledge that has been enhanced by the AI community is object-oriented programming. Using this technique, the entities of a discrete-event simulation can be viewed as objects in an object-oriented formulation. Knowledge can be factual (i.e., attributes of an entity) or behavioral (i.e., how the entity is to behave in certain circumstances). Rome Laboratory's Advanced Simulation Environment (RASE) was developed as a research vehicle to provide an enhanced simulation development environment for building more intelligent, interactive, flexible, and realistic simulations. This capability will support current and future battle management research and provide a test of the object-oriented paradigm for use in large scale military applications.
Chaudhari, Mangesh I.; You, Xinli; Pratt, Lawrence R.; ...
2015-11-24
Ethylene carbonate (EC) and propylene carbonate (PC) are widely used solvents in lithium (Li)-ion batteries and supercapacitors. Ion dissolution and diffusion in those media are correlated with solvent dielectric responses. Here, we use all-atom molecular dynamics simulations of the pure solvents to calculate dielectric constants, relaxation times, and molecular mobilities. The computed results are compared with the limited available experiments to assist more exhaustive studies of these important characteristics. The observed agreement is encouraging and provides guidance for further validation of force-field simulation models for EC and PC solvents.
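A hedged sketch of the standard fluctuation formula used to obtain a static dielectric constant from a trajectory of total dipole moments, eps_r = 1 + (<M^2> - <M>^2) / (3 eps0 V kB T); the dipole series below is synthetic noise standing in for MD output, and the box volume, temperature, and dipole scale are assumptions.

```python
# Static dielectric constant from total-dipole fluctuations:
#   eps_r = 1 + (<M^2> - <M>^2) / (3 * eps0 * V * kB * T)
# The dipole time series is synthetic noise standing in for MD output; volume, temperature,
# and dipole scale are arbitrary assumptions.
import numpy as np

kB = 1.380649e-23        # J/K
eps0 = 8.8541878128e-12  # F/m
T = 300.0                # K
V = (3.0e-9) ** 3        # m^3, a roughly 3 nm simulation box

rng = np.random.default_rng(7)
M = rng.normal(0.0, 2.0e-28, size=(50000, 3))   # total dipole moment per frame, C*m

M_mean = M.mean(axis=0)
fluct = (M ** 2).sum(axis=1).mean() - (M_mean ** 2).sum()
eps_r = 1.0 + fluct / (3.0 * eps0 * V * kB * T)
print(f"estimated dielectric constant: {eps_r:.1f}")
```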
A Tool for Parameter-space Explorations
NASA Astrophysics Data System (ADS)
Murase, Yohsuke; Uchitane, Takeshi; Ito, Nobuyasu
A software system for managing simulation jobs and results, named "OACIS", is presented. It controls a large number of simulation jobs executed on various remote servers, keeps the results in an organized way, and manages the analyses on these results. The software has a web browser front end, and users can easily submit various jobs to appropriate remote hosts from a web browser. After these jobs are finished, all the result files are automatically downloaded from the computational hosts and stored in a traceable way together with logs of the date, host, and elapsed time of the jobs. Some visualization functions are also provided so that users can easily grasp an overview of the results distributed in a high-dimensional parameter space. Thus, OACIS is especially beneficial for complex simulation models with many parameters, for which extensive parameter searches are required. By using the OACIS API, it is easy to write code that automates parameter selection depending on previous simulation results. A few examples of such automated parameter selection are also demonstrated.
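A generic sketch of the automated parameter-selection pattern such an API enables: submit a batch, inspect the results, and refine the candidates around the current best. The run_simulation function below is a hypothetical stand-in for submitting a job and retrieving its result; it is not part of the real OACIS API.

```python
# Generic submit-then-refine parameter search loop. run_simulation() is a hypothetical
# placeholder for "submit a job and fetch its result"; it is not an OACIS function.
import random

def run_simulation(param):
    # hypothetical model: noisy objective with an optimum at param = 2.5
    return -(param - 2.5) ** 2 + random.gauss(0.0, 0.05)

random.seed(0)
candidates = [0.0, 1.0, 2.0, 3.0, 4.0]
results = {}
step = 1.0
for _ in range(5):                    # five rounds of submit-then-refine
    for p in candidates:
        if p not in results:
            results[p] = run_simulation(p)
    best = max(results, key=results.get)
    step *= 0.5                       # shrink the search interval around the current best
    candidates = [best - step, best + step]

print("best parameter found:", max(results, key=results.get))
```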
NASA Astrophysics Data System (ADS)
Milas, Vasilis; Koletta, Maria; Constantinou, Philip
2003-07-01
This paper provides the results of interference and compatibility studies conducted to assess the sharing conditions between the Fixed Satellite Service (FSS) and the Fixed Service provided by High Altitude Platform Stations (HAPS) in the same operational frequency bands, and discusses the most important operational parameters that have an impact on the interference calculations. To characterize interference phenomena between the two systems, carrier-to-interference (C/I) ratios are evaluated. Simulation results under the scenario of a realistic deployment of HAPS and the use of different satellite configurations are presented. An interesting result derived from the simulations is that FSS/GSO Earth Stations and HAPS ground stations may coexist in the HAPS coverage area under certain conditions.
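A hedged sketch of the kind of carrier-to-interference calculation involved: a wanted GSO satellite carrier and a HAPS interferer received by the same FSS earth station, both attenuated by free-space path loss. All EIRPs, distances, the frequency, and the antenna discrimination value are invented example numbers, not the paper's parameters.

```python
# Carrier-to-interference (C/I) ratio at an FSS earth station: wanted carrier from a GSO
# satellite, interference from a HAPS transmitter, both via free-space path loss.
# All link parameters are invented examples.
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss, 20*log10(4*pi*d*f/c)."""
    c = 299792458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

freq = 48e9                       # Hz (one of the bands considered for HAPS/FSS sharing)
eirp_sat_dbw = 55.0               # wanted GSO satellite EIRP toward the earth station
eirp_haps_dbw = 10.0              # HAPS EIRP toward the earth station (off-beam)
d_sat = 38_000e3                  # slant range to the GSO satellite, m
d_haps = 60e3                     # slant range to the HAPS platform, m
offaxis_discrimination_db = 30.0  # earth-station antenna discrimination toward the HAPS

carrier_dbw = eirp_sat_dbw - fspl_db(d_sat, freq)
interference_dbw = eirp_haps_dbw - fspl_db(d_haps, freq) - offaxis_discrimination_db
print(f"C/I = {carrier_dbw - interference_dbw:.1f} dB")
```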
Hot zero power reactor calculations using the Insilico code
Hamilton, Steven P.; Evans, Thomas M.; Davidson, Gregory G.; ...
2016-03-18
In this paper we describe the reactor physics simulation capabilities of the Insilico code. A description of the various capabilities of the code is provided, including a detailed discussion of the geometry, meshing, cross section processing, and neutron transport options. Numerical results demonstrate that the Insilico SPN solver with pin-homogenized cross section generation is capable of delivering highly accurate full-core simulation of various PWR problems. Comparison to both Monte Carlo calculations and measured plant data is provided.
NASA Astrophysics Data System (ADS)
McMullen, Timothy; Liyanage, Nilanga; Xiong, Weizhi; Zhao, Zhiwen
2017-01-01
Our research has focused on simulating the response of a Gas Electron Multiplier (GEM) detector using computational methods. GEM detectors provide a cost-effective solution for radiation detection in high-rate environments. A detailed simulation of GEM detector response to radiation is essential for the successful adaptation of these detectors to different applications. Using Geant4 Monte Carlo (GEMC), a wrapper around Geant4 which has been successfully used to simulate the Solenoidal Large Intensity Device (SoLID) at Jefferson Lab, we are developing a simulation of a GEM chamber similar to the detectors currently used in our lab. We are also refining an object-oriented digitization program, which translates energy deposition information from GEMC into electronic readout which resembles the readout from our physical detectors. We have run the simulation with beta particles produced by the simulated decay of a 90Sr source, as well as with a simulated bremsstrahlung spectrum. Comparison of the simulation data with real GEM data taken under similar conditions is used to refine the simulation parameters. Comparisons between results from the simulations and results from detector tests will be presented.
A thermal vacuum-UV solar simulator test system for assessing microbiological viability
NASA Technical Reports Server (NTRS)
Ross, D. S.; Wardle, M. D.; Taylor, D. M.
1975-01-01
Microorganisms were exposed to a simulated space environment in order to assess the photobiological effect of broad-spectrum, nonionizing solar electromagnetic radiation in terms of viability. A thermal vacuum chamber capable of maintaining a vacuum of 0.000133 N/sq m and an ultraviolet-rich solar simulator were the main components of the test system. Results to date indicate the system is capable of providing reliable microbiological data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sparn, Bethany F; Ruth, Mark F; Krishnamurthy, Dheepak
Many have proposed that responsive load provided by distributed energy resources (DERs) and demand response (DR) is an option to provide flexibility to the grid and especially to distribution feeders. However, because responsive load involves a complex interplay between tariffs and DER and DR technologies, it is challenging to test and evaluate options without negatively impacting customers. This paper describes a hardware-in-the-loop (HIL) simulation system that has been developed to reduce the cost of evaluating the impact of advanced controllers (e.g., model predictive controllers) and technologies (e.g., responsive appliances). The HIL simulation system combines large-scale software simulation with a small set of representative building equipment hardware. It is used to perform HIL simulation of a distribution feeder and the loads on it under various tariff structures. In the reported HIL simulation, loads include many simulated air conditioners and one physical air conditioner. Independent model predictive controllers manage operations of all air conditioners under a time-of-use tariff. Results from this HIL simulation and a discussion of future development work of the system are presented.
Use of Airborne Hyperspectral Data in the Simulation of Satellite Images
NASA Astrophysics Data System (ADS)
de Miguel, Eduardo; Jimenez, Marcos; Ruiz, Elena; Salido, Elena; Gutierrez de la Camara, Oscar
2016-08-01
The simulation of future images is part of the development phase of most Earth Observation missions. This simulation frequently uses images acquired from airborne instruments as a starting point. These instruments provide the required flexibility in acquisition parameters (time, date, illumination and observation geometry...) and high spectral and spatial resolution, well above the target values (as required by simulation tools). However, there are a number of important problems hampering the use of airborne imagery. One of these problems is that observation zenith angles (OZA) are far from those that the missions to be simulated would use. We examine this problem by evaluating the difference in ground reflectance estimated from airborne images for different observation/illumination geometries. Next, we analyze a solution for simulation purposes, in which a Bidirectional Reflectance Distribution Function (BRDF) model is attached to an image of the isotropic surface reflectance. The results obtained confirm the need for reflectance anisotropy correction when using airborne images to create a reflectance map for simulation purposes. However, this correction should not be applied without providing the corresponding BRDF estimate, in the form of model parameters, to the simulation teams.
Stochastic locality and master-field simulations of very large lattices
NASA Astrophysics Data System (ADS)
Lüscher, Martin
2018-03-01
In lattice QCD and other field theories with a mass gap, the field variables in distant regions of a physically large lattice are only weakly correlated. Accurate stochastic estimates of the expectation values of local observables may therefore be obtained from a single representative field. Such master-field simulations potentially allow very large lattices to be simulated, but require various conceptual and technical issues to be addressed. In this talk, an introduction to the subject is provided and some encouraging results of master-field simulations of the SU(3) gauge theory are reported.
Development of a technique for inflight jet noise simulation. I, II
NASA Technical Reports Server (NTRS)
Clapper, W. S.; Stringas, E. J.; Mani, R.; Banerian, G.
1976-01-01
Several possible noise simulation techniques were evaluated, including closed circuit wind tunnels, free jets, rocket sleds and high speed trains. The free jet technique was selected for demonstration and verification. The first paper describes the selection and development of the technique and presents results for simulation and in-flight tests of the Learjet, F106, and Bertin Aerotrain. The second presents a theoretical study relating the two sets of noise signatures. It is concluded that the free jet simulation technique provides a satisfactory assessment of in-flight noise.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimminau, G; Nagler, B; Higginbotham, A
2008-06-19
Calculations of the x-ray diffraction patterns from shocked crystals derived from the results of Non-Equilibrium-Molecular-Dynamics (NEMD) simulations are presented. The atomic coordinates predicted by the NEMD simulations combined with atomic form factors are used to generate a discrete distribution of electron density. A Fast-Fourier-Transform (FFT) of this distribution provides an image of the crystal in reciprocal space, which can be further processed to produce quantitative simulated data for direct comparison with experiments that employ picosecond x-ray diffraction from laser-irradiated crystalline targets.
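A hedged sketch of the pipeline described: deposit form-factor-weighted atomic density on a grid, FFT it, and take the squared modulus as the reciprocal-space intensity. The small ideal FCC lattice and unit form factors below stand in for NEMD output; this is not the authors' code.

```python
# Diffraction intensity from atomic coordinates: deposit density on a grid, FFT it, and
# take |F|^2. A perfect FCC lattice with unit form factors stands in for NEMD snapshots.
import numpy as np

a, cells, grid = 1.0, 8, 64                       # lattice constant, cells per side, FFT grid
basis = np.array([[0, 0, 0], [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]])

# build FCC atomic positions
offsets = np.array([[i, j, k] for i in range(cells)
                    for j in range(cells) for k in range(cells)])
positions = (offsets[:, None, :] + basis[None, :, :]).reshape(-1, 3) * a
box = cells * a

# deposit unit form-factor density with nearest-grid-point assignment
density = np.zeros((grid, grid, grid))
idx = np.floor(positions / box * grid).astype(int) % grid
np.add.at(density, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)

intensity = np.abs(np.fft.fftn(density)) ** 2     # reciprocal-space diffraction intensity
intensity[0, 0, 0] = 0.0                          # remove the zero-order (DC) peak
peak = np.unravel_index(np.argmax(intensity), intensity.shape)
print("strongest Bragg peak at FFT index:", peak, "intensity:", intensity[peak])
```

For a shocked crystal the same steps would be applied to the NEMD coordinates, and the broadening and shifting of the Bragg peaks carries the compression and defect information compared against picosecond x-ray diffraction data.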
A generalized transport-velocity formulation for smoothed particle hydrodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chi; Hu, Xiangyu Y., E-mail: xiangyu.hu@tum.de; Adams, Nikolaus A.
The standard smoothed particle hydrodynamics (SPH) method suffers from tensile instability. In fluid-dynamics simulations this instability leads to particle clumping and void regions when negative pressure occurs. In solid-dynamics simulations, it results in unphysical structure fragmentation. In this work the transport-velocity formulation of Adami et al. (2013) is generalized to provide a solution to this long-standing problem. Rather than imposing a global background pressure, a variable background pressure is used to modify the particle transport velocity and eliminate the tensile instability completely. Furthermore, such a modification is localized by defining a shortened smoothing length. The generalized formulation is suitable for fluid and solid materials with and without free surfaces. The results of extensive numerical tests on both fluid and solid dynamics problems indicate that the new method provides a unified approach for multi-physics SPH simulations.
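A minimal sketch of the background-pressure contribution to an SPH transport velocity is shown below, loosely following the structure of the Adami et al. (2013) correction that the abstract builds on. The Gaussian kernel, the constant background pressure, and the O(N²) neighbor loop are simplifications for illustration; the paper's generalization uses a variable background pressure and a shortened smoothing length.

```python
import numpy as np

def grad_gaussian_kernel(rij, h):
    """Gradient of a Gaussian smoothing kernel; sufficient for illustrating the
    structure of the correction term."""
    r2 = np.dot(rij, rij)
    sigma = 1.0 / (np.pi ** 1.5 * h ** 3)
    return -2.0 * sigma * np.exp(-r2 / h**2) / h**2 * rij

def transport_velocity_increment(x, vol, mass, h, p_background, dt):
    """Background-pressure contribution to the SPH transport velocity (sketch).
    A single constant p_background is used here purely for illustration."""
    dv = np.zeros_like(x)
    n = len(x)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            rij = x[i] - x[j]
            dv[i] -= (p_background / mass[i]) * (vol[i]**2 + vol[j]**2) \
                     * grad_gaussian_kernel(rij, h)
    return dt * dv   # added to the momentum velocity to obtain the transport velocity

# Tiny example with three particles
x = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.12, 0.0]])
vol = np.full(3, 1e-3)
mass = np.full(3, 1.0)
print(transport_velocity_increment(x, vol, mass, h=0.12, p_background=100.0, dt=1e-4))
```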
Aircraft Flight Modeling During the Optimization of Gas Turbine Engine Working Process
NASA Astrophysics Data System (ADS)
Tkachenko, A. Yu; Kuz'michev, V. S.; Krupenich, I. N.
2018-01-01
The article describes a method for simulating the flight of an aircraft along a predetermined path, establishing a functional connection between the parameters of the working process of a gas turbine engine and the efficiency criteria of the aircraft. This connection is necessary for solving the optimization tasks of the conceptual design stage of the engine according to the systems approach. Engine thrust level, in turn, influences the operation of the aircraft, thus making accurate simulation of the aircraft behavior during flight necessary for obtaining the correct solution. The described mathematical model of aircraft flight provides the functional connection between the airframe characteristics, the working process of the gas turbine engines (propulsion system), ambient and flight conditions, and flight profile features. This model provides accurate results of flight simulation and the resulting aircraft efficiency criteria, required for optimization of the working process and control function of a gas turbine engine.
Numerical Investigations of Moisture Distribution in a Selected Anisotropic Soil Medium
NASA Astrophysics Data System (ADS)
Iwanek, M.
2018-01-01
The moisture of a soil profile changes both in time and space and depends on many factors. Changes in the quantity of water in soil can be determined on the basis of in situ measurements, but numerical methods are increasingly used for this purpose. The quality of the results obtained using pertinent software packages depends on appropriate description and parameterization of the soil medium. Thus, accounting for the soil anisotropy phenomenon gains importance. Although anisotropy can be taken into account in many numerical models, isotropic soil is often assumed in the research process. However, this assumption can be a reason for incorrect results in simulations of water changes in the soil medium. In this article, results of numerical simulations of moisture distribution in a selected soil profile are presented. The calculations were conducted assuming isotropic and anisotropic conditions. Empirical verification of the results obtained in the numerical investigations indicated statistically significant discrepancies between the two analyzed conditions. However, a better fit between measured and calculated moisture values was obtained when anisotropy was included in the simulation model.
Modification of Obstetric Emergency Simulation Scenarios for Realism in a Home-Birth Setting.
Komorowski, Janelle; Andrighetti, Tia; Benton, Melissa
2017-01-01
Clinical competency and clear communication are essential for intrapartum care providers who encounter high-stakes, low-frequency emergencies. The challenge for these providers is to maintain infrequently used skills. The challenge is even more significant for midwives who manage births at home and who, due to low practice volume and low-risk clientele, may rarely encounter an emergency. In addition, access to team simulation may be limited for home-birth midwives. This project modified existing validated obstetric simulation scenarios for a home-birth setting. Twelve certified professional midwives (CPMs) in active home-birth practice participated in shoulder dystocia and postpartum hemorrhage simulations. The simulations were staged to resemble home-birth settings, supplies, and personnel. Fidelity (realism) of the simulations was assessed with the Simulation Design Scale, and satisfaction and self-confidence were assessed with the Student Satisfaction and Self-Confidence in Learning Scale. Both utilized a 5-point Likert scale, with higher scores suggesting greater levels of fidelity, participant satisfaction, and self-confidence. Simulation Design Scale scores indicated participants agreed fidelity was achieved for the home-birth setting, while scores on the Student Satisfaction and Self-Confidence in Learning indicated high levels of participant satisfaction and self-confidence. If offered without modification, simulation scenarios designed for use in hospitals may lose fidelity for home-birth midwives, particularly in the environmental and psychological components. Simulation is standard of care in most settings, an excellent vehicle for maintaining skills, and some evidence suggests it results in improved perinatal outcomes. Additional study is needed in this area to support home-birth providers in maintaining skills. This pilot study suggests that simulation scenarios intended for hospital use can be successfully adapted to the home-birth setting.
Two-Dimensional Neutronic and Fuel Cycle Analysis of the Transatomic Power Molten Salt Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Betzler, Benjamin R.; Powers, Jeffrey J.; Worrall, Andrew
2017-01-15
This status report presents the results from the first phase of the collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear, Nuclear Energy Voucher program. The TAP design is a molten salt reactor using movable moderator rods to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this design. Additional analyses of time step sizes, mass feed rates and enrichments, and isotopic removals provide additional information to make informed design decisions. This work further demonstrates capabilities of ORNL modeling and simulation tools for analysis of molten salt reactor designs and strongly positions this effort for the upcoming three-dimensional core analysis.
YARNsim: Simulating Hadoop YARN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Ning; Yang, Xi; Sun, Xian-He
Despite the popularity of the Apache Hadoop system, its success has been limited by issues such as single points of failure, centralized job/task management, and lack of support for programming models other than MapReduce. The next generation of Hadoop, Apache Hadoop YARN, is designed to address these issues. In this paper, we propose YARNsim, a simulation system for Hadoop YARN. YARNsim is based on parallel discrete event simulation and provides protocol-level accuracy in simulating key components of YARN. YARNsim provides a virtual platform on which system architects can evaluate the design and implementation of Hadoop YARN systems. Also, application developers can tune job performance and understand the tradeoffs between different configurations, and Hadoop YARN system vendors can evaluate system efficiency under limited budgets. To demonstrate the validity of YARNsim, we use it to model two real systems and compare the experimental results from YARNsim and the real systems. The experiments include standard Hadoop benchmarks, synthetic workloads, and a bioinformatics application. The results show that the error rate is within 10% for the majority of test cases. The experiments prove that YARNsim can provide what-if analysis for system designers in a timely manner and at minimal cost compared with testing and evaluating on a real system.
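To make the idea of a discrete-event simulation of a YARN-like scheduler concrete, the toy kernel below schedules task "arrive" and "finish" events on a priority queue for a fixed pool of containers. It is a generic DES sketch with made-up parameters, not the YARNsim implementation.

```python
import heapq
import random

def simulate_jobs(n_tasks=100, n_containers=8, mean_service=2.0, seed=1):
    """Toy discrete-event simulation of tasks waiting for free containers.
    Events are processed in time order from a priority queue."""
    random.seed(seed)
    events = [(0.0, i, "arrive") for i in range(n_tasks)]   # all tasks submitted at t=0
    heapq.heapify(events)
    free = n_containers
    waiting = []
    finish_times = []
    while events:
        t, tid, kind = heapq.heappop(events)
        if kind == "arrive":
            waiting.append(tid)
        else:                        # "finish": release the container
            free += 1
            finish_times.append(t)
        while free and waiting:      # dispatch waiting tasks onto free containers
            free -= 1
            nxt = waiting.pop(0)
            heapq.heappush(events, (t + random.expovariate(1.0 / mean_service),
                                    nxt, "finish"))
    return max(finish_times)         # makespan of the simulated job

print(simulate_jobs())
```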
Airfoil Ice-Accretion Aerodynamics Simulation
NASA Technical Reports Server (NTRS)
Bragg, Michael B.; Broeren, Andy P.; Addy, Harold E.; Potapczuk, Mark G.; Guffond, Didier; Montreuil, E.
2007-01-01
NASA Glenn Research Center, ONERA, and the University of Illinois are conducting a major research program whose goal is to improve our understanding of the aerodynamic scaling of ice accretions on airfoils. When completed, the program will result in validated scaled simulation methods that produce the essential aerodynamic features of the full-scale iced airfoil. This research will provide some of the first high-fidelity, full-scale, iced-airfoil aerodynamic data. An initial study classified ice accretions based on their aerodynamics into four types: roughness, streamwise ice, horn ice, and spanwise-ridge ice. Subscale testing using a NACA 23012 airfoil was performed in the NASA IRT and the University of Illinois wind tunnel to better understand the aerodynamics of these ice types and to test various levels of ice-simulation fidelity. These studies are briefly reviewed here and have been presented in more detail in other papers. Based on these results, full-scale testing at the ONERA F1 tunnel using cast ice shapes obtained from molds taken in the IRT will provide full-scale iced-airfoil data from full-scale ice accretions. Using these data as a baseline, the final step is to validate the simulation methods in scale in the Illinois wind tunnel. Computational ice accretion methods including LEWICE and ONICE have been used to guide the experiments and are briefly described and results shown. When full-scale and simulation aerodynamic results are available, these data will be used to further develop computational tools. Thus the purpose of the paper is to present an overview of the program and key results to date.
Comparison of simulator fidelity model predictions with in-simulator evaluation data
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Mckissick, B. T.; Ashworth, B. R.
1983-01-01
A full-factorial, in-simulator experiment on a single-axis, multiloop, compensatory pitch-tracking task is described. The experiment was conducted to provide data to validate extensions to an analytic, closed-loop model of a real-time digital simulation facility. The results of the experiment, encompassing various simulation fidelity factors such as visual delay, digital integration algorithms, computer iteration rates, control loading bandwidths and proprioceptive cues, and g-seat kinesthetic cues, are compared with predictions obtained from the analytic model incorporating an optimal control model of the human pilot. The in-simulator results demonstrate more sensitivity to the g-seat and to the control loader conditions than was predicted by the model. However, the model predictions are generally upheld, although the predicted magnitudes of the states and of the error terms are sometimes off considerably. Of particular concern is the large sensitivity difference for one control loader condition, as well as the model/in-simulator mismatch in the magnitude of the plant states when the other states match.
A Novel Low-Ringing Monocycle Picosecond Pulse Generator Based on Step Recovery Diode
Zhou, Jianming; Yang, Xiao; Lu, Qiuyuan; Liu, Fan
2015-01-01
This paper presents a high-performance, low-ringing, ultra-wideband monocycle picosecond pulse generator, formed using a step recovery diode (SRD), simulated in ADS software and realized experimentally. The pulse generator comprises three parts: a step recovery diode, a field-effect transistor, and a Schottky diode, used to eliminate the positive and negative ringing of the pulse. Simulated results validate the design. Measured results indicate an output waveform of 1.88 peak-to-peak amplitude and 307 ps pulse duration with minimal ringing of -22.5 dB, providing good symmetry and a low level of ringing. A high degree of agreement between the simulated and measured results is achieved.
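A common idealization of such a monocycle is the first derivative of a Gaussian; the sketch below generates one and evaluates a ringing level in dB as the ratio of the largest out-of-lobe excursion to the main-lobe peak. The pulse model, width parameter, and lobe definition are illustrative assumptions, not the SRD circuit or the measurement procedure of the paper.

```python
import numpy as np

def gaussian_monocycle(t, sigma):
    """First derivative of a Gaussian: a common idealization of a monocycle pulse
    (illustrative model, not the circuit in the paper)."""
    return -t / sigma**2 * np.exp(-t**2 / (2.0 * sigma**2))

t = np.linspace(-1e-9, 3e-9, 4001)        # 4 ns window, 1 ps steps
sigma = 307e-12 / 2.5                      # width chosen to give roughly 300 ps duration
v = gaussian_monocycle(t, sigma)
v /= np.abs(v).max()

# Ringing level: largest excursion outside the main lobe relative to the main-lobe
# peak, in dB (the paper reports about -22.5 dB for the measured hardware).
main_lobe = np.abs(t) < 4 * sigma
ringing = np.abs(v[~main_lobe]).max() / np.abs(v[main_lobe]).max()
print("ringing level: %.1f dB" % (20 * np.log10(ringing + 1e-12)))
```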
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doubrawa Moreira, Paula; Annoni, Jennifer; Jonkman, Jason
FAST.Farm is a medium-fidelity wind farm modeling tool that can be used to assess power and loads contributions of wind turbines in a wind farm. The objective of this paper is to undertake a calibration procedure to set the user parameters of FAST.Farm to accurately represent results from large-eddy simulations. The results provide an in-depth analysis of the comparison of FAST.Farm and large-eddy simulations before and after calibration. The comparison of FAST.Farm and large-eddy simulation results is presented with respect to streamwise and radial velocity components as well as wake-meandering statistics (mean and standard deviation) in the lateral and vertical directions under different atmospheric and turbine operating conditions.
A new algorithm for modeling friction in dynamic mechanical systems
NASA Technical Reports Server (NTRS)
Hill, R. E.
1988-01-01
A method of modeling friction forces that impede the motion of parts of dynamic mechanical systems is described. Conventional methods, in which the friction effect is assumed to be a constant force or torque in a direction opposite to the relative motion, are applicable only to those cases where applied forces are large in comparison to the friction, and where there is little interest in system behavior close to the times of transitions through zero velocity. An algorithm is described that provides accurate determination of friction forces over a wide range of applied force and velocity conditions. The method avoids the simulation errors resulting from a finite integration interval used in connection with a conventional friction model, as is the case in many digital computer-based simulations. The algorithm incorporates a predictive calculation based on initial conditions of motion, externally applied forces, inertia, and integration step size. The predictive calculation, in connection with an external integration process, provides an accurate determination of both static and Coulomb friction forces and resulting motions in dynamic simulations. Accuracy of the results is improved over that obtained with conventional methods, and a relatively large integration step size is permitted. A function block for incorporation in a specific simulation program is described. The general form of the algorithm facilitates implementation with various programming languages such as FORTRAN or C, as well as with other simulation programs.
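The essence of such a predictive stick-slip decision can be sketched as follows: predict the velocity the integrator would produce over the next step without friction, apply static friction (cancelling the applied force) if the part is at rest or would cross zero velocity, and otherwise apply Coulomb friction opposing the motion. The decision rule and parameter names below are an illustrative reconstruction, not the published algorithm.

```python
def friction_force(v, f_applied, m, dt, f_static, f_coulomb):
    """Sketch of a stick-slip friction law using a one-step velocity prediction."""
    v_pred = v + dt * f_applied / m            # velocity predicted without friction
    if abs(v) < 1e-9 or v * v_pred <= 0.0:     # at rest, or would cross zero this step
        if abs(f_applied) <= f_static:
            return -f_applied                  # stiction exactly cancels the applied force
        return -f_coulomb if f_applied > 0 else f_coulomb
    return -f_coulomb if v > 0 else f_coulomb  # sliding: Coulomb friction opposes motion

# Example: small applied force on a resting mass -> friction exactly cancels it
print(friction_force(v=0.0, f_applied=2.0, m=1.0, dt=0.01, f_static=5.0, f_coulomb=4.0))
```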
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Signe K.; Purohit, Sumit; Boyd, Lauren W.
The Geothermal Technologies Office Code Comparison Study (GTO-CCS) aims to support the DOE Geothermal Technologies Office in organizing and executing a model comparison activity. This project is directed at testing, diagnosing differences, and demonstrating modeling capabilities of a worldwide collection of numerical simulators for evaluating geothermal technologies. Teams of researchers are collaborating in this code comparison effort, and it is important to be able to share results in a forum where technical discussions can easily take place without requiring teams to travel to a common location. Pacific Northwest National Laboratory has developed an open-source, flexible framework called Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for a variety of types of projects in a number of scientific domains. GTO-Velo is a customized version of the Velo Framework that is being used as the collaborative tool in support of the GTO-CCS project. Velo is designed around a novel integration of a collaborative Web-based environment and a scalable enterprise Content Management System (CMS). The underlying framework provides a flexible and unstructured data storage system that allows for easy upload of files that can be in any format. Data files are organized in hierarchical folders, and each folder and each file has a corresponding wiki page for metadata. The user interacts with Velo through web browser-based wiki technology, providing the benefit of familiarity and ease of use. High-level folders have been defined in GTO-Velo for the benchmark problem descriptions, descriptions of simulator/code capabilities, a project notebook, and folders for participating teams. Each team has a subfolder with write access limited only to the team members, where they can upload their simulation results. The GTO-CCS participants are charged with defining the benchmark problems for the study, and as each GTO-CCS benchmark problem is defined, the problem creator can provide a description using a template on the metadata page corresponding to the benchmark problem folder. Project documents, references, and videos of the weekly online meetings are shared via GTO-Velo. A results comparison tool allows users to plot their uploaded simulation results on the fly, along with those of other teams, to facilitate weekly discussions of the benchmark problem results being generated by the teams. GTO-Velo is an invaluable tool providing the project coordinators and team members with a framework for collaboration among geographically dispersed organizations.
Ortiz, Roderick F.; Galloway, Joel M.; Miller, Lisa D.; Mau, David P.
2008-01-01
Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Bureau of Reclamation is working to meet its goal to issue a Final Environmental Impact Statement (EIS) on the Southern Delivery System project (SDS). SDS is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various EIS alternatives and plans by Pueblo West to discharge treated water into the reservoir. Plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). Additionally, the results of an Existing Conditions scenario (water years 2000 through 2002) were compared to the No Action scenario (projected demands in 2046) to assess changes in water quality over time. All scenario modeling used an external nutrient-decay model to simulate degradation and assimilation of nutrients along the riverine reach upstream from Pueblo Reservoir. Reservoir modeling was conducted using the U.S. Army Corps of Engineers CE-QUAL-W2 two-dimensional water-quality model. Lake hydrodynamics, water temperature, dissolved oxygen, dissolved solids, dissolved ammonia, dissolved nitrate, total phosphorus, algal biomass, and total iron were simulated. Two reservoir site locations were selected for comparison. Results of simulations at site 3B were characteristic of a riverine environment in the reservoir while results at site 7B (near the dam) were characteristic of the main body of the reservoir. Simulation results for the epilimnion and hypolimnion at these two sites also were evaluated and compared. The simulation results in the hypolimnion at site 7B were indicative of the water quality leaving the reservoir. Comparisons of the different scenario results were conducted to assess if substantial differences were observed between selected scenarios. Each of the scenarios was simulated for three contiguous years representing a wet, average, and dry annual hydrologic cycle (water years 2000 through 2002). Additionally, each selected simulation scenario was evaluated for differences in direct- and cumulative-effects on a particular scenario. 
Direct effects are intended to isolate the future effects of the scenarios. Cumulative effects are intended to evaluate the effects of the scenarios in conjunction with all reasonably foreseeable future activities in the study area. Comparisons between the direct- and cumulative-effects analyses indicated that there were not large differences in the results between most of the simulation scenarios and, as such, the focus of this report was on results for the direct-effects analysis.
Practical Unitary Simulator for Non-Markovian Complex Processes
NASA Astrophysics Data System (ADS)
Binder, Felix C.; Thompson, Jayne; Gu, Mile
2018-06-01
Stochastic processes are as ubiquitous throughout the quantitative sciences as they are notorious for being difficult to simulate and predict. In this Letter, we propose a unitary quantum simulator for discrete-time stochastic processes which requires less internal memory than any classical analogue throughout the simulation. The simulator's internal memory requirements equal those of the best previous quantum models. However, in contrast to previous models, it only requires a (small) finite-dimensional Hilbert space. Moreover, since the simulator operates unitarily throughout, it avoids any unnecessary information loss. We provide a stepwise construction for simulators for a large class of stochastic processes hence directly opening the possibility for experimental implementations with current platforms for quantum computation. The results are illustrated for an example process.
Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics
NASA Technical Reports Server (NTRS)
Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating the numerical approximation error, computational-model-induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.
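One standard, generic way to obtain a quantitative estimate of the numerical approximation error mentioned above is Richardson extrapolation from solutions on two grids; the sketch below shows the arithmetic. It is offered as a common verification tool, not as the specific methodology proposed by the authors, and the sample numbers are made up.

```python
def richardson_error_estimate(f_coarse, f_fine, r, p):
    """Richardson-extrapolation estimate of the discretization error in the
    fine-grid solution f_fine, given a coarse-grid solution f_coarse, grid
    refinement ratio r, and formal order of accuracy p."""
    err_fine = (f_fine - f_coarse) / (r**p - 1.0)
    f_exact_est = f_fine + err_fine          # extrapolated ("grid-converged") value
    return err_fine, f_exact_est

# Example: a drag coefficient computed on two grids with refinement ratio 2,
# second-order scheme (numbers are made up).
err, cd_extrap = richardson_error_estimate(f_coarse=0.0270, f_fine=0.0258, r=2, p=2)
print(err, cd_extrap)
```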
Numerical modeling and experimental validation of thermoplastic composites induction welding
NASA Astrophysics Data System (ADS)
Palmieri, Barbara; Nele, Luigi; Galise, Francesco
2018-05-01
In this work, a numerical simulation and an experimental test of the induction welding of continuous fibre-reinforced thermoplastic composites (CFRTPCs) are presented. The thermoplastic polyamide 66 (PA66) with carbon fiber fabric was used. Using dedicated software (JMag Designer), the influence of the fundamental process parameters such as temperature, current, and holding time was investigated. In order to validate the results of the simulations, and therefore the numerical model used, experimental tests were carried out, and the temperature values measured during the tests with an optical pyrometer were compared with those provided by the numerical simulation. The mechanical properties of the welded joints were evaluated by single-lap shear tests.
Polytomous Rasch Models in Counseling Assessment
ERIC Educational Resources Information Center
Willse, John T.
2017-01-01
This article provides a brief introduction to the Rasch model. Motivation for using Rasch analyses is provided. Important Rasch model concepts and key aspects of result interpretation are introduced, with major points reinforced using a simulation demonstration. Concrete guidelines are provided regarding sample size and the evaluation of items.
Numerical Investigation of Dual-Mode Scramjet Combustor with Large Upstream Interaction
NASA Technical Reports Server (NTRS)
Mohieldin, T. O.; Tiwari, S. N.; Reubush, David E. (Technical Monitor)
2004-01-01
A dual-mode scramjet combustor configuration with significant upstream interaction is investigated numerically. The possibility of scaling the domain to accelerate convergence and reduce the computational time is explored. The supersonic combustor configuration was selected to provide an understanding of key features of upstream interaction and to identify physical and numerical issues relating to modeling of dual-mode configurations. The numerical analysis was performed with vitiated air at a freestream Mach number of 2.5 using hydrogen as the sonic injectant. Results are presented for two-dimensional models and a three-dimensional jet-to-jet symmetric geometry. Comparisons are made with experimental results. Two-dimensional and three-dimensional results show a substantial oblique shock train reaching upstream of the fuel injectors. Flow characteristics slow numerical convergence, while the upstream interaction slowly increases with further iterations. As the flow field develops, the symmetry assumption breaks down. A large separation zone develops and extends further upstream of the step. This asymmetric flow structure is not seen in the experimental data. Results obtained using a subscale domain (both two-dimensional and three-dimensional) qualitatively recover the flow physics obtained from full-scale simulations. All results show that numerical modeling using a scaled geometry provides good agreement with full-scale numerical results and experimental results for this configuration. This study supports the argument that numerical scaling is useful in simulating dual-mode scramjet combustor flowfields and could provide an excellent convergence-acceleration technique for dual-mode simulations.
NASA Technical Reports Server (NTRS)
Kaupp, V. H.; Macdonald, H. C.; Waite, W. P.
1981-01-01
The initial phase of a program to determine the best interpretation strategy and sensor configuration for a radar remote sensing system for geologic applications is discussed. In this phase, terrain modeling and radar image simulation were used to perform parametric sensitivity studies. A relatively simple computer-generated terrain model is presented, and the data base, backscatter file, and transfer function for digital image simulation are described. Sets of images are presented that simulate the results obtained with an X-band radar from an altitude of 800 km and at three different terrain-illumination angles. The simulations include power maps, slant-range images, ground-range images, and ground-range images with statistical noise incorporated. It is concluded that digital image simulation and computer modeling provide cost-effective methods for evaluating terrain variations and sensor parameter changes, for predicting results, and for defining optimum sensor parameters.
Comparisons of NIF convergent ablation simulations with radiograph data.
Olson, R E; Hicks, D G; Meezan, N B; Koch, J A; Landen, O L
2012-10-01
A technique for comparing simulation results directly with radiograph data from backlit capsule implosion experiments will be discussed. Forward Abel transforms are applied to the kappa*rho profiles of the simulation. These provide the transmission ratio (optical depth) profiles of the simulation. Gaussian and top hat blurs are applied to the simulated transmission ratio profiles in order to account for the motion blurring and imaging slit resolution of the experimental measurement. Comparisons between the simulated transmission ratios and the radiograph data lineouts are iterated until a reasonable backlighter profile is obtained. This backlighter profile is combined with the blurred, simulated transmission ratios to obtain simulated intensity profiles that can be directly compared with the radiograph data. Examples will be shown from recent convergent ablation (backlit implosion) experiments at the NIF.
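The chain described above (forward Abel transform of the kappa*rho profile, conversion to transmission, blurring for motion and slit resolution) can be prototyped compactly. The profile, blur width, and radial grid in the snippet below are arbitrary placeholders rather than NIF data.

```python
import numpy as np

def forward_abel(kappa_rho, r):
    """Forward Abel transform of a radial kappa*rho profile: the line-of-sight
    integral through a spherically symmetric object, giving optical depth as a
    function of impact parameter. Simple trapezoidal quadrature."""
    od = np.zeros_like(r)
    for k in range(len(r) - 1):
        y = r[k]
        rr = r[k + 1:]
        f = 2.0 * kappa_rho[k + 1:] * rr / np.sqrt(rr**2 - y**2)
        od[k] = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(rr))
    return od

def blurred_transmission(od, r, sigma_blur):
    """Transmission exp(-OD), blurred with a Gaussian to mimic motion blur and
    imaging-slit resolution (a top-hat blur would be applied analogously)."""
    T = np.exp(-od)
    dr = r[1] - r[0]
    x = np.arange(-4 * sigma_blur, 4 * sigma_blur + dr, dr)
    g = np.exp(-x**2 / (2 * sigma_blur**2))
    g /= g.sum()
    return np.convolve(T, g, mode="same")

# Illustrative shell-like kappa*rho profile (arbitrary units, not NIF data)
r = np.linspace(0.0, 200e-4, 400)                      # radius in cm
kr = 30.0 * np.exp(-((r - 120e-4) / 15e-4) ** 2)       # dense shell near 120 um
T_sim = blurred_transmission(forward_abel(kr, r), r, sigma_blur=10e-4)
print(T_sim.min())
```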
Faster Aerodynamic Simulation With Cart3D
NASA Technical Reports Server (NTRS)
2003-01-01
A NASA-developed aerodynamic simulation tool is ensuring the safety of future space operations while providing designers and engineers with an automated, highly accurate computer simulation suite. Cart3D, co-winner of NASA's 2002 Software of the Year award, is the result of over 10 years of research and software development conducted by Michael Aftosmis and Dr. John Melton of Ames Research Center and Professor Marsha Berger of the Courant Institute at New York University. Cart3D offers a revolutionary approach to computational fluid dynamics (CFD), the computer simulation of how fluids and gases flow around an object of a particular design. By fusing technological advancements in diverse fields such as mineralogy, computer graphics, computational geometry, and fluid dynamics, the software provides a new industrial geometry processing and fluid analysis capability with unsurpassed automation and efficiency.
System Verification of MSL Skycrane Using an Integrated ADAMS Simulation
NASA Technical Reports Server (NTRS)
White, Christopher; Antoun, George; Brugarolas, Paul; Lih, Shyh-Shiuh; Peng, Chia-Yen; Phan, Linh; San Martin, Alejandro; Sell, Steven
2012-01-01
Mars Science Laboratory (MSL) will use the Skycrane architecture to execute final descent and landing maneuvers. The Skycrane phase uses closed-loop feedback control throughout the entire phase, starting with rover separation, through mobility deploy, and through touchdown, ending only when the bridles have gone completely slack. The integrated ADAMS simulation described in this paper couples complex dynamical models created by the mechanical subsystem with actual GNC flight software algorithms that have been compiled and linked into ADAMS. These integrated simulations provide the project with the best means to verify key Skycrane requirements that have a tightly coupled GNC-mechanical aspect to them. It also provides the best opportunity to validate the design of the algorithm that determines when to cut the bridles. The results of the simulations show the excellent performance of the Skycrane system.
Moghadam, Soroush; Larson, Ronald G
2017-02-06
All-atom molecular dynamics (AA-MD) simulations are performed for aqueous solutions of hydrophobic drug molecules (phenytoin) with model polymer excipients, namely (1) N-isopropylacrylamide (pNIPAAm), (2) pNIPAAm-co-acrylamide (Am), and (3) pNIPAAm-co-dimethylacrylamide (DMA). After validating the force field parameters using the well-known lower critical solution temperature (LCST) behavior of pNIPAAm, we simulate the polymer-drug complex in water and its behavior at temperatures below (295 K) and above (310 K) the LCST. Using radial distribution functions, we find that there is an optimum comonomer molar fraction of around 20-30% DMA at which the interaction with phenytoin drug molecules is strongest, consistent with recent experimental findings. The results provide evidence that molecular simulations are able to provide guidance in the optimization of novel polymer excipients for drug release.
Network visualization of conformational sampling during molecular dynamics simulation.
Ahlstrom, Logan S; Baker, Joseph Lee; Ehrlich, Kent; Campbell, Zachary T; Patel, Sunita; Vorontsov, Ivan I; Tama, Florence; Miyashita, Osamu
2013-11-01
Effective data reduction methods are necessary for uncovering the inherent conformational relationships present in large molecular dynamics (MD) trajectories. Clustering algorithms provide a means to interpret the conformational sampling of molecules during simulation by grouping trajectory snapshots into a few subgroups, or clusters, but the relationships between the individual clusters may not be readily understood. Here we show that network analysis can be used to visualize the dominant conformational states explored during simulation as well as the connectivity between them, providing a more coherent description of conformational space than traditional clustering techniques alone. We compare the results of network visualization against 11 clustering algorithms and principal component conformer plots. Several MD simulations of proteins undergoing different conformational changes demonstrate the effectiveness of networks in reaching functional conclusions.
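A minimal version of such a conformational network can be built directly from per-frame cluster assignments, with nodes weighted by cluster population and edges counting transitions between consecutive frames. The snippet below (using networkx) is a generic illustration of the idea, not the analysis pipeline of the paper.

```python
import numpy as np
import networkx as nx

def conformational_network(cluster_labels):
    """Build a conformational-transition network from per-frame cluster assignments
    of an MD trajectory: nodes are clusters (sized by population), edges count
    observed transitions between consecutive frames."""
    G = nx.Graph()
    labels, counts = np.unique(cluster_labels, return_counts=True)
    for lab, cnt in zip(labels, counts):
        G.add_node(int(lab), population=int(cnt))
    for a, b in zip(cluster_labels[:-1], cluster_labels[1:]):
        if a != b:
            w = G.get_edge_data(int(a), int(b), {"weight": 0})["weight"]
            G.add_edge(int(a), int(b), weight=w + 1)
    return G

# Example with a fake trajectory of cluster assignments
traj = np.array([0, 0, 1, 1, 1, 0, 2, 2, 1, 2, 2, 0])
G = conformational_network(traj)
print(G.nodes(data=True))
print(G.edges(data=True))
```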
NASA Technical Reports Server (NTRS)
Christhilf, David M.; Pototzky, Anthony S.; Stevens, William L.
2010-01-01
The Simulink-based Simulation Architecture for Evaluating Controls for Aerospace Vehicles (SAREC-ASV) was modified to incorporate linear models representing aeroservoelastic characteristics of the SemiSpan SuperSonic Transport (S4T) wind-tunnel model. The S4T planform is for a Technology Concept Aircraft (TCA) design from the 1990s. The model has three control surfaces and is instrumented with accelerometers and strain gauges. Control laws developed for wind-tunnel testing for Ride Quality Enhancement, Gust Load Alleviation, and Flutter Suppression System functions were implemented in the simulation. The simulation models open- and closed-loop response to turbulence and to control excitation. It provides time histories for closed-loop stable conditions above the open-loop flutter boundary. The simulation is useful for assessing the potential impact of closed-loop control rate and position saturation. It also provides a means to assess fidelity of system identification procedures by providing time histories for a known plant model, with and without unmeasured turbulence as a disturbance. Sets of linear models representing different Mach number and dynamic pressure conditions were implemented as MATLAB Linear Time Invariant (LTI) objects. Configuration changes were implemented by selecting which LTI object to use in a Simulink template block. A limited comparison of simulation versus wind-tunnel results is shown.
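The pattern of storing one linear model per test condition and selecting it at run time can be mimicked outside MATLAB as well. The sketch below uses scipy.signal state-space objects keyed by hypothetical (Mach, dynamic pressure) pairs with made-up two-state matrices; it illustrates only the configuration-selection idea, not the S4T models themselves.

```python
import numpy as np
from scipy import signal

# Hypothetical two-state models keyed by (Mach, qbar); matrices are made up
# stand-ins for the S4T LTI objects described in the abstract.
models = {
    (0.80, 75.0): signal.StateSpace([[0.0, 1.0], [-400.0, -2.0]], [[0.0], [400.0]],
                                    [[1.0, 0.0]], [[0.0]]),
    (0.95, 120.0): signal.StateSpace([[0.0, 1.0], [-900.0, -1.0]], [[0.0], [900.0]],
                                     [[1.0, 0.0]], [[0.0]]),
}

def simulate_condition(mach, qbar, u, t):
    """Select the LTI model for the requested test condition and simulate its
    response to the control-excitation time history u(t)."""
    sys = models[(mach, qbar)]
    _, y, _ = signal.lsim(sys, u, t)
    return y

t = np.linspace(0.0, 5.0, 2001)
u = np.sin(2 * np.pi * 3.0 * t)        # 3 Hz control-surface excitation
y = simulate_condition(0.95, 120.0, u, t)
print(y[-1])
```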
A study of workstation computational performance for real-time flight simulation
NASA Technical Reports Server (NTRS)
Maddalon, Jeffrey M.; Cleveland, Jeff I., II
1995-01-01
With recent advances in microprocessor technology, some have suggested that modern workstations provide enough computational power to properly operate a real-time simulation. This paper presents the results of a computational benchmark, based on actual real-time flight simulation code used at Langley Research Center, which was executed on various workstation-class machines. The benchmark was executed on different machines from several companies including: CONVEX Computer Corporation, Cray Research, Digital Equipment Corporation, Hewlett-Packard, Intel, International Business Machines, Silicon Graphics, and Sun Microsystems. The machines are compared by their execution speed, computational accuracy, and porting effort. The results of this study show that the raw computational power needed for real-time simulation is now offered by workstations.
Study of CFB Simulation Model with Coincidence at Multi-Working Condition
NASA Astrophysics Data System (ADS)
Wang, Z.; He, F.; Yang, Z. W.; Li, Z.; Ni, W. D.
A circulating fluidized bed (CFB) two-stage simulation model was developed. To make the model results coincide with the design values or real operation values at specified working conditions while retaining real-time calculation capability, only the main key processes were taken into account, and the dominant factors were further abstracted from these key processes. The simulation results showed sound agreement at multiple working conditions and confirmed the advantage of the two-stage model over the original single-stage simulation model. The combustion-support effect of secondary air was investigated using the two-stage model. This model provides a solid platform for investigating the pant-leg structured CFB furnace, which is now under design for a supercritical power plant.
NASA Technical Reports Server (NTRS)
Schulte, Peter Z.; Moore, James W.
2011-01-01
The Crew Exploration Vehicle Parachute Assembly System (CPAS) project conducts computer simulations to verify that flight performance requirements on parachute loads and terminal rate of descent are met. Design of Experiments (DoE) provides a systematic method for variation of simulation input parameters. When implemented and interpreted correctly, a DoE study of parachute simulation tools indicates values and combinations of parameters that may cause requirement limits to be violated. This paper describes one implementation of DoE that is currently being developed by CPAS, explains how DoE results can be interpreted, and presents the results of several preliminary studies. The potential uses of DoE to validate parachute simulation models and verify requirements are also explored.
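For a simple DoE such as a full factorial, the set of simulation cases is just the Cartesian product of the chosen factor levels. The factor names and levels in the sketch below are hypothetical stand-ins, not CPAS parameters.

```python
from itertools import product

def full_factorial(factors):
    """Generate a full-factorial design: every combination of the listed levels
    for each simulation input parameter."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

cases = full_factorial({
    "drag_coefficient": [0.70, 0.85, 1.00],
    "suspension_line_stiffness": [0.9, 1.0, 1.1],
    "atmospheric_density_factor": [0.95, 1.05],
})
print(len(cases))          # 3 x 3 x 2 = 18 simulation runs
print(cases[0])
```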
A multi-stage method for connecting participatory sensing and noise simulations.
Hu, Mingyuan; Che, Weitao; Zhang, Qiuju; Luo, Qingli; Lin, Hui
2015-01-22
Most simulation-based noise maps are important for official noise assessment but lack local noise characteristics. The main reasons for this lack of information are that official noise simulations only provide information about expected noise levels, which is limited by the use of large-scale monitoring of noise sources, and are updated infrequently. With the emergence of smart cities and ubiquitous sensing, the possible improvements enabled by sensing technologies provide the possibility to resolve this problem. This study proposed an integrated methodology to propel participatory sensing from its current random and distributed sampling origins to professional noise simulation. The aims of this study were to effectively organize the participatory noise data, to dynamically refine the granularity of the noise features on road segments (e.g., different portions of a road segment), and then to provide a reasonable spatio-temporal data foundation to support noise simulations, which can be of help to researchers in understanding how participatory sensing can play a role in smart cities. This study first discusses the potential limitations of the current participatory sensing and simulation-based official noise maps. Next, we explain how participatory noise data can contribute to a simulation-based noise map by providing (1) spatial matching of the participatory noise data to the virtual partitions at a more microscopic level of road networks; (2) multi-temporal scale noise estimations at the spatial level of virtual partitions; and (3) dynamic aggregation of virtual partitions by comparing the noise values at the relevant temporal scale to form a dynamic segmentation of each road segment to support multiple spatio-temporal noise simulations. In this case study, we demonstrate how this method could play a significant role in a simulation-based noise map. Together, these results demonstrate the potential benefits of participatory noise data as dynamic input sources for noise simulations on multiple spatio-temporal scales.
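One concrete piece of step (2), multi-temporal noise estimation per virtual partition, is aggregating the participatory samples with energetic (logarithmic) averaging, since decibel levels cannot be averaged arithmetically. The sketch below groups hypothetical samples by partition id and hour bin; the identifiers and values are made up.

```python
import numpy as np

def aggregate_noise(samples, partition_ids, hour_bins):
    """Aggregate participatory noise samples per (virtual partition, time window)
    using energetic averaging: sound levels in dB are averaged on a linear
    energy scale and converted back to dB."""
    out = {}
    keys = set(zip(partition_ids, hour_bins))
    for key in keys:
        mask = [(p, h) == key for p, h in zip(partition_ids, hour_bins)]
        levels = np.asarray(samples)[mask]
        out[key] = 10.0 * np.log10(np.mean(10.0 ** (levels / 10.0)))
    return out

# Hypothetical samples in dB(A), tagged with a partition id and an hour bin
levels = [62.0, 65.5, 71.0, 58.0, 69.5]
parts  = ["seg3_p1", "seg3_p1", "seg3_p2", "seg3_p2", "seg3_p1"]
hours  = [8, 8, 8, 18, 8]
print(aggregate_noise(levels, parts, hours))
```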
Traffic and Driving Simulator Based on Architecture of Interactive Motion.
Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza
2015-01-01
This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid mesomicroscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination.
Open Simulation Laboratories [Guest editors' introduction]
Alexander, Francis J.; Meneveau, Charles
2015-09-01
In this introduction to the special issue on open simulation laboratories (OSLs), the guest editors describe how OSLs will become more common as their potential is better understood and they begin providing access to valuable datasets to much larger segments of the scientific community. Moreover, new analysis tools and ways to do science will inevitably develop as a result.
Haptic simulation framework for determining virtual dental occlusion.
Wu, Wen; Chen, Hui; Cen, Yuhai; Hong, Yang; Khambay, Balvinder; Heng, Pheng Ann
2017-04-01
The surgical treatment of many dentofacial deformities is often complex due to its three-dimensional nature. Determining the dental occlusion in its most stable position is essential for the success of the treatment. Computer-aided virtual planning on an individualized, patient-specific 3D model can help formulate the surgical plan and predict the surgical change. However, in current computer-aided planning systems, it is not possible to determine the dental occlusion of the digital models in an intuitive way during virtual surgical planning because of the absence of haptic feedback. In this paper, a physically based haptic simulation framework is proposed, which can provide surgeons with intuitive haptic feedback to determine the dental occlusion of the digital models in their most stable position. To provide physically realistic force feedback when the dental models contact each other during the searching process, a contact model is proposed that describes the dynamic and collision properties of the dental models during alignment. The simulated impulse/contact-based forces are integrated into the unified simulation framework. A validation study was conducted on fifteen sets of virtual dental models chosen at random and covering a wide range of the dental relationships found clinically. The dental occlusions obtained by an expert were employed as a benchmark against which to compare the virtual occlusion results. The mean translational and angular deviations of the virtual occlusion results from the benchmark were small. The experimental results show the validity of our method. The simulated forces can provide valuable insights to determine the virtual dental occlusion. The findings of this work and the validation of the proposed concept lead the way toward full virtual surgical planning on patient-specific virtual models, allowing fully customized treatment plans for the surgical correction of dentofacial deformities.
Updates to Multi-Dimensional Flux Reconstruction for Hypersonic Simulations on Tetrahedral Grids
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.
2010-01-01
The quality of simulated hypersonic stagnation region heating with tetrahedral meshes is investigated by using an updated three-dimensional, upwind reconstruction algorithm for the inviscid flux vector. An earlier implementation of this algorithm provided improved symmetry characteristics on tetrahedral grids compared to conventional reconstruction methods. The original formulation however displayed quantitative differences in heating and shear that were as large as 25% compared to a benchmark, structured-grid solution. The primary cause of this discrepancy is found to be an inherent inconsistency in the formulation of the flux limiter. The inconsistency is removed by employing a Green-Gauss formulation of primitive gradients at nodes to replace the previous Gram-Schmidt algorithm. Current results are now in good agreement with benchmark solutions for two challenge problems: (1) hypersonic flow over a three-dimensional cylindrical section with special attention to the uniformity of the solution in the spanwise direction and (2) hypersonic flow over a three-dimensional sphere. The tetrahedral cells used in the simulation are derived from a structured grid where cell faces are bisected across the diagonal resulting in a consistent pattern of diagonals running in a biased direction across the otherwise symmetric domain. This grid is known to accentuate problems in both shock capturing and stagnation region heating encountered with conventional, quasi-one-dimensional inviscid flux reconstruction algorithms. Therefore the test problems provide a sensitive indicator for algorithmic effects on heating. Additional simulations on a sharp, double cone and the shuttle orbiter are then presented to demonstrate the capabilities of the new algorithm on more geometrically complex flows with tetrahedral grids. These results provide the first indication that pure tetrahedral elements utilizing the updated, three-dimensional, upwind reconstruction algorithm may be used for the simulation of heating and shear in hypersonic flows in upwind, finite volume formulations.
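The Green-Gauss reconstruction referred to above is the standard finite-volume identity grad(phi) ≈ (1/V) Σ_f φ_f n_f A_f over the faces of a control volume. The sketch below applies it to a unit cube with a linearly varying field as a sanity check; it shows the generic formula, not the node-based implementation in the flow solver.

```python
import numpy as np

def green_gauss_gradient(cell_volume, face_values, face_normals, face_areas):
    """Green-Gauss gradient reconstruction in one control volume:
    grad(phi) ~= (1/V) * sum_f phi_f * n_f * A_f, with outward unit normals."""
    grad = np.zeros(3)
    for phi_f, n_f, a_f in zip(face_values, face_normals, face_areas):
        grad += phi_f * np.asarray(n_f) * a_f
    return grad / cell_volume

# Unit cube with phi varying linearly as phi = x (face values at face centroids)
faces_n = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
faces_phi = [1.0, 0.0, 0.5, 0.5, 0.5, 0.5]
print(green_gauss_gradient(1.0, faces_phi, faces_n, [1.0] * 6))   # -> [1, 0, 0]
```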
Kapitán, Josef; Johannessen, Christian; Bour, Petr; Hecht, Lutz; Barron, Laurence D
2009-01-01
The samples used for the first observations of vibrational Raman optical activity (ROA) in 1972, namely both enantiomers of 1-phenylethanol and 1-phenylethylamine, have been revisited using a modern commercial ROA instrument together with state-of-the-art ab initio calculations. The simulated ROA spectra reveal for the first time the vibrational origins of the first reported ROA signals, which comprised similar couplets in the alcohol and amine in the spectral range approximately 280-400 cm(-1). The results demonstrate how easy and routine ROA measurements have become, and how current ab initio quantum-chemical calculations are capable of simulating experimental ROA spectra quite closely provided sufficient averaging over accessible conformations is included. Assignment of absolute configuration is, inter alia, completely secure from results of this quality. Anharmonic corrections provided small improvements in the simulated Raman and ROA spectra. The importance of conformational averaging emphasized by this and previous related work provides the underlying theoretical background to ROA studies of dynamic aspects of chiral molecular and biomolecular structure and behavior.
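The conformational averaging emphasized here usually amounts to Boltzmann-weighting the per-conformer simulated spectra by their relative energies. The sketch below shows that generic recipe on toy data; the energies, temperature handling, and spectra are placeholders, not the authors' computational protocol.

```python
import numpy as np

def boltzmann_average(spectra, energies_kcal, T=298.15):
    """Boltzmann-weighted average of per-conformer spectra (e.g. simulated Raman or
    ROA intensities on a common wavenumber grid), given relative conformer energies
    in kcal/mol."""
    R = 0.0019872041                       # gas constant, kcal/(mol K)
    e = np.asarray(energies_kcal) - np.min(energies_kcal)
    w = np.exp(-e / (R * T))
    w /= w.sum()
    return np.tensordot(w, np.asarray(spectra), axes=1), w

# Two hypothetical conformers, toy spectra on a 5-point grid
spec = [[0.0, 1.0, 3.0, 1.0, 0.0],
        [0.0, -0.5, 2.0, 2.5, 0.0]]
avg, weights = boltzmann_average(spec, energies_kcal=[0.0, 0.6])
print(weights, avg)
```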
Salipur, Zdravko; Bertocci, Gina
2010-01-01
It has been shown that ANSI WC19 transit wheelchairs that are crashworthy in frontal impact exhibit catastrophic failures in rear impact and may not be able to provide stable seating support and thus occupant protection for the wheelchair occupant. Thus far only limited sled test and computer simulation data have been available to study rear impact wheelchair safety. Computer modeling can be used as an economic and comprehensive tool to gain critical knowledge regarding wheelchair integrity and occupant safety. This study describes the development and validation of a computer model simulating an adult wheelchair-seated occupant subjected to a rear impact event. The model was developed in MADYMO and validated rigorously using the results of three similar sled tests conducted to specifications provided in the draft ISO/TC 173 standard. Outcomes from the model can provide critical wheelchair loading information to wheelchair and tiedown manufacturers, resulting in safer wheelchair designs for rear impact conditions.
NASA Technical Reports Server (NTRS)
Vaughan, O. H., Jr.; Hung, R. J.
1975-01-01
Skylab 4 crew members performed a series of demonstrations showing the oscillations, rotations, and collision-coalescence of water droplets, which simulate various physical models of fluids under a low-gravity environment. The results from the Skylab demonstrations provide information and illustrate the potential of an orbiting space-oriented research laboratory for the study of more sophisticated fluid mechanics experiments. Experiments and results are discussed.
Cascade Defect Evolution Processes: Comparison of Atomistic Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Haixuan; Stoller, Roger E; Osetskiy, Yury N
2013-11-01
Determining the defect evolution beyond the molecular dynamics (MD) time scale is critical in bridging the gap between atomistic simulations and experiments. The recently developed self-evolving atomistic kinetic Monte Carlo (SEAKMC) method provides new opportunities to simulate long-term defect evolution with MD-like fidelity. In this study, SEAKMC is applied to investigate cascade defect evolution in bcc iron. First, the evolution of a vacancy-rich region is simulated and compared with results obtained using autonomous basin climbing (ABC) + KMC and kinetic activation-relaxation technique (kART) simulations. Previously, it was found that kART is orders of magnitude faster than ABC+KMC. The results obtained from SEAKMC are similar to kART, but the predicted time scale is about one order of magnitude faster than kART. The fidelity of SEAKMC is confirmed by statistically relevant MD simulations at multiple higher temperatures, which confirms that the saddle point sampling is close to complete in SEAKMC. The second case is the irradiation-induced formation of C15 Laves phase nano-size defect clusters. In contrast to previous studies, which claim the defects can grow by capturing self-interstitials, we found that these highly stable clusters can transform to a <111> glissile configuration on a much longer time scale. Finally, cascade-annealing simulations using SEAKMC are compared with the traditional object KMC (OKMC) method. SEAKMC predicts substantially fewer surviving defects compared with OKMC. The possible origin of this difference is discussed, and a possible way to improve the accuracy of OKMC based on SEAKMC results is outlined. These studies demonstrate the atomistic fidelity of SEAKMC in comparison with other on-the-fly KMC methods and provide new information on long-term defect evolution in iron.
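All of the KMC variants compared here share the same residence-time kernel: choose an event with probability proportional to its rate and advance the clock by an exponentially distributed waiting time; the methods differ in how the event catalog and rates are obtained (on-the-fly saddle-point searches versus a predefined object catalog). The sketch below shows that shared kernel with made-up rates.

```python
import math
import random

def kmc_step(rates, t, rng=random):
    """One residence-time kinetic Monte Carlo step: select an event with
    probability proportional to its rate, then advance the clock by an
    exponentially distributed waiting time."""
    total = sum(rates)
    u = rng.random() * total
    acc, chosen = 0.0, 0
    for i, r in enumerate(rates):
        acc += r
        if u <= acc:
            chosen = i
            break
    dt = -math.log(1.0 - rng.random()) / total
    return chosen, t + dt

# Example: three competing defect jumps with rates in 1/s
event, t_new = kmc_step([1.0e6, 2.5e5, 4.0e4], t=0.0)
print(event, t_new)
```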
Savolainen, Peter T
2016-11-01
This study involves an examination of driver behavior at the onset of a yellow signal indication. Behavioral data were obtained from a driving simulator study that was conducted through the National Advanced Driving Simulator (NADS) laboratory at the University of Iowa. These data were drawn from a series of events during which study participants drove through a series of intersections where the traffic signals changed from the green to the yellow phase. The resulting dataset provides potential insights into how driver behavior is affected by distracted driving, through an experimental design that alternated handheld, headset, and hands-free cell phone use with "normal" baseline driving events. The results of the study show that male drivers ages 18-45 were more likely to stop. Participants were also more likely to stop as they became more familiar with the simulator environment. Cell phone use was found to have some influence on driver behavior in this setting, though the effects varied significantly across individuals. The study also demonstrates two methodological approaches for dealing with unobserved heterogeneity across drivers. These include random parameters and latent class logit models, each of which analyzes the data as a panel. The results show each method to provide significantly better fit than a pooled, fixed-parameter model. Differences in the context of these two approaches are discussed, providing important insights as to the differences between these modeling frameworks.
Numerical Investigation of Plasma Detachment in Magnetic Nozzle Experiments
NASA Technical Reports Server (NTRS)
Sankaran, Kamesh; Polzin, Kurt A.
2008-01-01
At present there exists no generally accepted theoretical model that provides a consistent physical explanation of plasma detachment from an externally-imposed magnetic nozzle. To make progress towards that end, simulation of plasma flow in the magnetic nozzle of an arcjet experiment is performed using a multidimensional numerical simulation tool that includes theoretical models of the various dispersive and dissipative processes present in the plasma. This is an extension of the simulation tool employed in previous work by Sankaran et al. The aim is to compare the computational results with various proposed magnetic nozzle detachment theories to develop an understanding of the physical mechanisms that cause detachment. An applied magnetic field topology is obtained using a magnetostatic field solver (see Fig. I), and this field is superimposed on the time-dependent magnetic field induced in the plasma to provide a self-consistent field description. The applied magnetic field and model geometry match those found in experiments by Kuriki and Okada. This geometry is modeled because there is a substantial amount of experimental data that can be compared to the computational results, allowing for validation of the model. In addition, comparison of the simulation results with the experimentally obtained plasma parameters will provide insight into the mechanisms that lead to plasma detachment, revealing how they scale with different input parameters. Further studies will focus on modeling literature experiments both for the purpose of additional code validation and to extract physical insight regarding the mechanisms driving detachment.
NASA Astrophysics Data System (ADS)
Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.
2011-12-01
In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. The second case study considers a characteristic lava-flow aquifer system in Pahute Mesa, Nevada. A 3D training image is developed by using object-based simulation of parametric shapes to represent the key morphologic features of rhyolite lava flows embedded within ash-flow tuffs. In addition to vertical drill-hole data, transient pressure head data from aquifer tests can be used to constrain the stochastic model outcomes. The use of both static and dynamic conditioning data allows the identification of potential geologic structures that control hydraulic response. These case studies demonstrate the flexibility of the multiple-point geostatistics approach for considering multiple types of data and for developing sophisticated models of geologic heterogeneities that can be incorporated into numerical flow simulations.
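For illustration, a minimal sketch of how an exceedance-probability map might be assembled from a stack of stochastic realizations is given below; the array shapes, threshold, and synthetic concentration values are hypothetical stand-ins, not data from the Yucca Flat or Pahute Mesa studies.

```python
import numpy as np

# Hypothetical stack of simulated concentrations: one 2D field per
# stochastic geologic realization, shaped (n_real, ny, nx).
n_real, ny, nx = 50, 100, 120
rng = np.random.default_rng(1)
conc = rng.lognormal(mean=-2.0, sigma=1.0, size=(n_real, ny, nx))

threshold = 0.5   # illustrative regulatory concentration limit
# Fraction of realizations exceeding the threshold at each cell: an
# empirical probability map of where the plume boundary may lie.
p_exceed = (conc > threshold).mean(axis=0)
print(p_exceed.shape, p_exceed.min(), p_exceed.max())
```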
Validating clustering of molecular dynamics simulations using polymer models.
Phillips, Joshua L; Colvin, Michael E; Newsam, Shawn
2011-11-14
Molecular dynamics (MD) simulation is a powerful technique for sampling the meta-stable and transitional conformations of proteins and other biomolecules. Computational data clustering has emerged as a useful, automated technique for extracting conformational states from MD simulation data. Despite extensive application, relatively little work has been done to determine if the clustering algorithms are actually extracting useful information. A primary goal of this paper therefore is to provide such an understanding through a detailed analysis of data clustering applied to a series of increasingly complex biopolymer models. We develop a novel series of models using basic polymer theory that have intuitive, clearly-defined dynamics and exhibit the essential properties that we are seeking to identify in MD simulations of real biomolecules. We then apply spectral clustering, an algorithm particularly well-suited for clustering polymer structures, to our models and MD simulations of several intrinsically disordered proteins. Clustering results for the polymer models provide clear evidence that the meta-stable and transitional conformations are detected by the algorithm. The results for the polymer models also help guide the analysis of the disordered protein simulations by comparing and contrasting the statistical properties of the extracted clusters. We have developed a framework for validating the performance and utility of clustering algorithms for studying molecular biopolymer simulations that utilizes several analytic and dynamic polymer models which exhibit well-behaved dynamics including: meta-stable states, transition states, helical structures, and stochastic dynamics. We show that spectral clustering is robust to anomalies introduced by structural alignment and that different structural classes of intrinsically disordered proteins can be reliably discriminated from the clustering results. To our knowledge, our framework is the first to utilize model polymers to rigorously test the utility of clustering algorithms for studying biopolymers.
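A minimal sketch of spectral clustering applied to a precomputed frame-to-frame affinity matrix, in the spirit of the analysis described above, is shown below; the random distance matrix, the Gaussian kernel width, and the number of clusters are illustrative assumptions rather than settings from the paper.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

# Hypothetical pairwise distance matrix between MD frames (e.g. RMSD
# after structural alignment); random symmetric data stands in here.
rng = np.random.default_rng(2)
n_frames = 200
d = rng.random((n_frames, n_frames))
dist = (d + d.T) / 2.0
np.fill_diagonal(dist, 0.0)

# Convert distances to an affinity with a Gaussian kernel.
sigma = dist.mean()
affinity = np.exp(-dist**2 / (2.0 * sigma**2))

labels = SpectralClustering(n_clusters=4, affinity="precomputed",
                            random_state=0).fit_predict(affinity)
print(np.bincount(labels))   # cluster populations
```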
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shirley, Rachel; Smidts, Carol; Boring, Ronald
Information-Decision-Action Crew (IDAC) operator model simulations of a Steam Generator Tube Rupture are compared to student operator performance in studies conducted in the Ohio State University’s Nuclear Power Plant Simulator Facility. This study is presented as a prototype for conducting simulator studies to validate key aspects of Human Reliability Analysis (HRA) methods. Seven student operator crews are compared to simulation results for crews designed to demonstrate three different decision-making strategies. The IDAC model used in the simulations is modified slightly to capture novice behavior rather than that of expert operators. Operator actions and scenario pacing are compared. A preliminary review of available performance shaping factors (PSFs) is presented. After the scenario in the NPP Simulator Facility, student operators review a video of the scenario and evaluate six PSFs at pre-determined points in the scenario. This provides a dynamic record of the PSFs experienced by the OSU student operators. In this preliminary analysis, the Time Constraint Load (TCL) calculated in the IDAC simulations is compared to the TCL reported by student operators. We identify potential modifications to the IDAC model to develop an “IDAC Student Operator Model.” This analysis provides insights into how similar experiments could be conducted using expert operators to improve the fidelity of IDAC simulations.
NASA Astrophysics Data System (ADS)
Soti, G.; Wauters, F.; Breitenfeldt, M.; Finlay, P.; Kraev, I. S.; Knecht, A.; Porobić, T.; Zákoucký, D.; Severijns, N.
2013-11-01
Geant4 simulations play a crucial role in the analysis and interpretation of experiments providing low energy precision tests of the Standard Model. This paper focuses on the accuracy of the description of the electron processes in the energy range between 100 and 1000 keV. The effect of the different simulation parameters and multiple scattering models on the backscattering coefficients is investigated. Simulations of the response of HPGe and passivated implanted planar Si detectors to β particles are compared to experimental results. An overall good agreement is found between Geant4 simulations and experimental data.
NASA Astrophysics Data System (ADS)
Valasek, Lukas; Glasa, Jan
2017-12-01
Current fire simulation systems are capable of exploiting the advantages of available high-performance computing (HPC) platforms to model fires efficiently in parallel. In this paper, the efficiency of a corridor fire simulation on an HPC computer cluster is discussed. The parallel MPI version of the Fire Dynamics Simulator is used to test the efficiency of selected strategies for allocating the computational resources of the cluster when a greater number of computational cores is used. Simulation results indicate that if the number of cores used is not a multiple of the number of cores per cluster node, there are allocation strategies that provide more efficient calculations.
A MEMS disk resonator-based band pass filter electrical equivalent circuit simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sundaram, G. M.; Angira, Mahesh; Gupta, Navneet
In this paper, a coupled-beam bandpass disk filter is designed for 1 MHz bandwidth. The filter's electrical equivalent circuit simulation is performed using circuit simulators. Important filter parameters such as insertion loss, shape factor, and Q factor are estimated using CoventorWare simulation. The disk-resonator-based radial contour mode filter provides 1.5 MHz bandwidth, with unloaded quality factors of 233,480 for the resonator and 21,797 for the filter. From the simulation results, the minimum insertion loss is 151.49 dB, the maximum insertion loss is 213.94 dB, and the 40 dB shape factor is 4.17.
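As a generic illustration of how such parameters are read off an electrical equivalent circuit, the sketch below computes the insertion loss of a single series-RLC motional branch between matched terminations; the component values are hypothetical and are not the CoventorWare model reported above.

```python
import numpy as np

# Illustrative series-RLC equivalent of a single micromechanical
# resonator placed between matched terminations R0 (values are
# hypothetical, not the filter reported in the abstract).
R0 = 50.0                          # source/load impedance, ohms
Rx, Lx, Cx = 5e3, 0.1, 2.533e-13   # motional resistance, inductance, capacitance

f = np.linspace(0.9e6, 1.1e6, 2001)
w = 2 * np.pi * f
Z = Rx + 1j * w * Lx + 1.0 / (1j * w * Cx)   # series motional branch
S21 = 2 * R0 / (2 * R0 + Z)                  # two-port response of a series element
IL = -20 * np.log10(np.abs(S21))             # insertion loss in dB

f0 = f[np.argmin(IL)]
print(f"resonance ~ {f0/1e6:.3f} MHz, minimum insertion loss {IL.min():.1f} dB")
```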
Tait, Lauren; Lee, Kenneth; Rasiah, Rohan; Cooper, Joyce M; Ling, Tristan; Geelan, Benjamin; Bindoff, Ivan
2018-05-03
Background. There are numerous approaches to simulating a patient encounter in pharmacy education. However, little direct comparison between these approaches has been undertaken. Our objective was to investigate student experiences, satisfaction, and feedback preferences between three scenario simulation modalities (paper-, actor-, and computer-based). Methods. We conducted a mixed methods study with randomized cross-over of simulation modalities on final-year Australian graduate-entry Master of Pharmacy students. Participants completed case-based scenarios within each of three simulation modalities, with feedback provided at the completion of each scenario in a format corresponding to each simulation modality. A post-simulation questionnaire collected qualitative and quantitative responses pertaining to participant satisfaction, experiences, and feedback preferences. Results. Participants reported similar levels of satisfaction across all three modalities. However, each modality resulted in unique positive and negative experiences, such as student disengagement with paper-based scenarios. Conclusion. Importantly, the themes of guidance and opportunity for peer discussion underlie the best forms of feedback for students. The provision of feedback following simulation should be carefully considered and delivered, with all three simulation modalities producing both positive and negative experiences in regard to their feedback format.
Vanegas, Carlos A; Aliaga, Daniel G; Benes, Bedrich; Waddell, Paul
2009-01-01
Urban simulation models and their visualization are used to help regional planning agencies evaluate alternative transportation investments, land use regulations, and environmental protection policies. Typical urban simulations provide spatially distributed data about number of inhabitants, land prices, traffic, and other variables. In this article, we build on a synergy of urban simulation, urban visualization, and computer graphics to automatically infer an urban layout for any time step of the simulation sequence. In addition to standard visualization tools, our method gathers data of the original street network, parcels, and aerial imagery and uses the available simulation results to infer changes to the original urban layout and produce a new and plausible layout for the simulation results. In contrast with previous work, our approach automatically updates the layout based on changes in the simulation data and thus can scale to a large simulation over many years. The method in this article offers a substantial step forward in building integrated visualization and behavioral simulation systems for use in community visioning, planning, and policy analysis. We demonstrate our method on several real cases using a 200 GB database for a 16,300 km2 area surrounding Seattle.
Numerical Simulations of a Jet–Cloud Collision and Starburst: Application to Minkowski’s Object
Fragile, P. Chris; Anninos, Peter; Croft, Steve; ...
2017-11-30
In this work, we present results of three-dimensional, multi-physics simulations of an AGN jet colliding with an intergalactic cloud. The purpose of these simulations is to assess the degree of "positive feedback," i.e., jet-induced star formation, that results. We have specifically tailored our simulation parameters to facilitate a comparison with recent observations of Minkowski's Object (MO), a stellar nursery located at the termination point of a radio jet coming from galaxy NGC 541. As shown in our simulations, such a collision triggers shocks, which propagate around and through the cloud. These shocks condense the gas and under the right circumstances may trigger cooling instabilities, creating runaway increases in density, to the point that individual clumps can become Jeans unstable. Our simulations provide information about the expected star formation rate, total mass converted to H I, H2, and stars, and the relative velocity of the stars and gas. Finally, our results confirm the possibility of jet-induced star formation, and agree well with the observations of MO.
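For context, the Jeans criterion invoked above is commonly written as follows (a standard textbook form, not an expression extracted from the simulation code):

\[
M_J \simeq \left(\frac{5 k_B T}{G \mu m_H}\right)^{3/2}\left(\frac{3}{4\pi\rho}\right)^{1/2},
\qquad
\lambda_J = \left(\frac{\pi c_s^2}{G\rho}\right)^{1/2},
\]

so that cooling (lower T) and compression (higher ρ) both lower the Jeans mass, allowing shocked clumps to exceed it and collapse.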
NASA Astrophysics Data System (ADS)
Ho, Teck Seng; Charles, Christine; Boswell, Roderick W.
2016-12-01
This paper presents computational fluid dynamics simulations of the cold gas operation of the Pocket Rocket and Mini Pocket Rocket radiofrequency electrothermal microthrusters, replicating experiments performed in both sub-Torr and vacuum environments. This work takes advantage of flow velocity choking to circumvent the invalidity of modelling vacuum regions within a CFD simulation, while still preserving the accuracy of the desired results in the internal regions of the microthrusters. Simulated results for the plenum stagnation pressure are in precise agreement with experimental measurements when slip boundary conditions with the correct tangential momentum accommodation coefficients for each gas are used. Thrust and specific impulse are calculated by integrating the flow profiles at the exit of the microthrusters, and are in good agreement with experimental pendulum thrust balance measurements and theoretical expectations. For low thrust conditions where experimental instruments are not sufficiently sensitive, these cold gas simulations provide additional data points against which experimental results can be verified and extrapolated. The cold gas simulations presented in this paper will be used as a benchmark to compare with future plasma simulations of the Pocket Rocket microthruster.
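A common form of the exit-plane momentum-flux integral implied by the thrust calculation above is (assuming p_b denotes the ambient back pressure and A_e the exit plane; this is a standard expression, not necessarily the exact post-processing used in the paper):

\[
F = \int_{A_e}\left[\rho u_z^{2} + \left(p - p_b\right)\right]\,dA,
\qquad
I_{sp} = \frac{F}{\dot{m}\,g_0}.
\]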
Numerical Simulations of a Jet-Cloud Collision and Starburst: Application to Minkowski’s Object
NASA Astrophysics Data System (ADS)
Fragile, P. Chris; Anninos, Peter; Croft, Steve; Lacy, Mark; Witry, Jason W. L.
2017-12-01
We present results of three-dimensional, multi-physics simulations of an AGN jet colliding with an intergalactic cloud. The purpose of these simulations is to assess the degree of “positive feedback,” i.e., jet-induced star formation, that results. We have specifically tailored our simulation parameters to facilitate a comparison with recent observations of Minkowski’s Object (MO), a stellar nursery located at the termination point of a radio jet coming from galaxy NGC 541. As shown in our simulations, such a collision triggers shocks, which propagate around and through the cloud. These shocks condense the gas and under the right circumstances may trigger cooling instabilities, creating runaway increases in density, to the point that individual clumps can become Jeans unstable. Our simulations provide information about the expected star formation rate, total mass converted to H I, H2, and stars, and the relative velocity of the stars and gas. Our results confirm the possibility of jet-induced star formation, and agree well with the observations of MO.
An experimental method for the assessment of color simulation tools.
Lillo, Julio; Alvaro, Leticia; Moreira, Humberto
2014-07-22
The Simulcheck method for evaluating the accuracy of color simulation tools in relation to dichromats is described and used to test three color simulation tools: Variantor, Coblis, and Vischeck. A total of 10 dichromats (five protanopes, five deuteranopes) and 10 normal trichromats participated in the current study. Simulcheck includes two psychophysical tasks: the Pseudoachromatic Stimuli Identification task and the Minimum Achromatic Contrast task. The Pseudoachromatic Stimuli Identification task allows determination of the two chromatic angles (huv values) that generate a minimum response in the yellow–blue opponent mechanism and, consequently, pseudoachromatic stimuli (greens or reds). The Minimum Achromatic Contrast task requires the selection of the gray background that produces minimum contrast (near zero change in the achromatic mechanism) for each pseudoachromatic stimulus selected in the previous task (LR values). Results showed important differences in the colorimetric transformations performed by the three evaluated simulation tools and their accuracy levels. Vischeck simulation accurately implemented the algorithm of Brettel, Viénot, and Mollon (1997). Only Vischeck appeared accurate (similarity in huv and LR values between real and simulated dichromats) and, consequently, could render reliable color selections. It is concluded that Simulcheck is a consistent method because it provided an equivalent pattern of results for huv and LR values irrespective of the stimulus set used to evaluate a simulation tool. Simulcheck was also considered valid because real dichromats provided expected huv and LR values when performing the two psychophysical tasks included in this method. © 2014 ARVO.
Validation of virtual-reality-based simulations for endoscopic sinus surgery.
Dharmawardana, N; Ruthenbeck, G; Woods, C; Elmiyeh, B; Diment, L; Ooi, E H; Reynolds, K; Carney, A S
2015-12-01
Virtual reality (VR) simulators provide an alternative to real patients for practicing surgical skills but require validation to ensure accuracy. Here, we validate the use of a virtual reality sinus surgery simulator with haptic feedback for training in Otorhinolaryngology - Head & Neck Surgery (OHNS). Participants were recruited from final-year medical students, interns, resident medical officers (RMOs), OHNS registrars and consultants. All participants completed an online questionnaire after performing four separate simulation tasks. These were then used to assess face, content and construct validity. ANOVA with post hoc correlation was used for statistical analysis. The following groups were compared: (i) medical students/interns, (ii) RMOs, (iii) registrars and (iv) consultants. Face validity results showed a statistically significant (P < 0.05) difference between the consultant group and the others, while there was no significant difference between medical students/interns and RMOs. Variability within groups was not significant. Content validity results based on consultant scoring and comments indicated that the simulations need further development in several areas to be effective for registrar-level teaching. However, students, interns and RMOs indicated that the simulations provide a useful tool for learning OHNS-related anatomy and as an introduction to ENT-specific procedures. The VR simulations have been validated for teaching sinus anatomy and nasendoscopy to medical students, interns and RMOs. However, they require further development before they can be regarded as a valid tool for more advanced surgical training. © 2015 John Wiley & Sons Ltd.
Medication waste reduction in pediatric pharmacy batch processes.
Toerper, Matthew F; Veltri, Michael A; Hamrock, Eric; Mollenkopf, Nicole L; Holt, Kristen; Levin, Scott
2014-04-01
To inform pediatric cart-fill batch scheduling for reductions in pharmaceutical waste using a case study and simulation analysis. A pre and post intervention and simulation analysis was conducted during 3 months at a 205-bed children's center. An algorithm was developed to detect wasted medication based on time-stamped computerized provider order entry information. The algorithm was used to quantify pharmaceutical waste and associated costs for both preintervention (1 batch per day) and postintervention (3 batches per day) schedules. Further, simulation was used to systematically test 108 batch schedules outlining general characteristics that have an impact on the likelihood for waste. Switching from a 1-batch-per-day to a 3-batch-per-day schedule resulted in a 31.3% decrease in pharmaceutical waste (28.7% to 19.7%) and annual cost savings of $183,380. Simulation results demonstrate how increasing batch frequency facilitates a more just-in-time process that reduces waste. The most substantial gains are realized by shifting from a schedule of 1 batch per day to at least 2 batches per day. The simulation exhibits how waste reduction is also achievable by avoiding batch preparation during daily time periods where medication administration or medication discontinuations are frequent. Last, the simulation was used to show how reducing batch preparation time per batch provides some, albeit minimal, opportunity to decrease waste. The case study and simulation analysis demonstrate characteristics of batch scheduling that may support pediatric pharmacy managers in redesign toward minimizing pharmaceutical waste.
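A minimal sketch of the kind of timestamp-based waste detection described above is given below; the column names, times, and costs are hypothetical, and the actual algorithm in the study may apply additional rules.

```python
import pandas as pd

# Hypothetical order-entry extract: one row per prepared dose, with the
# batch preparation time and the time the order was discontinued (NaT if
# it ran to completion). Column names are illustrative only.
doses = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "prepared_at": pd.to_datetime(
        ["2013-05-01 06:00", "2013-05-01 06:00",
         "2013-05-01 14:00", "2013-05-01 22:00"]),
    "due_at": pd.to_datetime(
        ["2013-05-01 09:00", "2013-05-01 20:00",
         "2013-05-01 16:00", "2013-05-02 02:00"]),
    "discontinued_at": pd.to_datetime(
        [None, "2013-05-01 12:00", None, "2013-05-02 00:00"]),
    "cost": [4.10, 12.50, 3.25, 7.80],
})

# A dose counts as waste when it was already prepared but the order was
# discontinued before the dose was due to be administered.
wasted = doses["discontinued_at"].notna() & \
         (doses["discontinued_at"] > doses["prepared_at"]) & \
         (doses["discontinued_at"] < doses["due_at"])

print(f"wasted doses: {wasted.sum()}, wasted cost: ${doses.loc[wasted, 'cost'].sum():.2f}")
```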
Rietsch, Stefan H G; Quick, Harald H; Orzada, Stephan
2015-08-01
In this work, the transmit performance and interelement coupling characteristics of radio frequency (RF) antenna microstrip line elements are examined in simulations and measurements. The initial point of the simulations is a microstrip line element loaded with a phantom. Meander structures are then introduced at the end of the element. The size of the meanders is increased in fixed steps and the magnetic field is optimized. In continuative simulations, the coupling between identical elements is evaluated for different element spacing and loading conditions. Verification of the simulation results is accomplished in measurements of the coupling between two identical elements for four different meander sizes. Image acquisition on a 7 T magnetic resonance imaging (MRI) system provides qualitative and quantitative comparisons to confirm the simulation results. Simulations point out an optimum range of meander sizes concerning coupling in all chosen geometric setups. Coupling measurement results are in good agreement with the simulations. Qualitative and quantitative comparisons of the acquired MRI images substantiate the coupling results. The coupling between coil elements in RF antenna arrays consisting of the investigated element types can be optimized under consideration of the central magnetic field strength or efficiency depending on the desired application.
NASA Technical Reports Server (NTRS)
Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.
2005-01-01
This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument-rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirm the utility of the simulation platform for comparative Human in the Loop (HITL) studies of SATS HVO and baseline operations.
Response of Flight Nurses in a Simulated Helicopter Environment.
Kaniecki, David M; Hickman, Ronald L; Alfes, Celeste M; Reimer, Andrew P
The purpose of this study was to determine if a helicopter flight simulator could provide a useful educational platform by creating experiences similar to those encountered by actual flight nurses. Flight nurse (FN) and non-FN participants completed a simulated emergency scenario in a flight simulator. Physiologic and psychological stress during the simulation was measured using heart rate and perceived stress scores. A questionnaire was then administered to assess the realism of the flight simulator. Subjects reported that the overall experience in the flight simulator was comparable with a real helicopter. Sounds, communications, vibrations, and movements in the simulator most approximated those of a real-life helicopter environment. Perceived stress levels of all participants increased significantly from 27 (on a 0-100 scale) before simulation to 51 at the peak of the simulation and declined thereafter to 28 (P < .001). Perceived stress levels of FNs increased significantly from 25 before simulation to 54 at the peak of the simulation and declined thereafter to 30 (P < .001). Perceived stress levels of non-FNs increased significantly from 31 before simulation to 49 at the peak of the simulation and declined thereafter to 25 (P < .001). There were no significant differences in perceived stress levels between FNs and non-FNs before (P = .58), during (P = .63), or after (P = .55) simulation. FNs' heart rates increased significantly from 77 before simulation to 100 at the peak of the simulation and declined thereafter to 72 (P < .001). The results of this study suggest that simulation of a critical care scenario in a high-fidelity helicopter flight simulator can provide a realistic helicopter transport experience and create physiologic and psychological stress for participants. Copyright © 2017 Air Medical Journal Associates. Published by Elsevier Inc. All rights reserved.
De Biase, Pablo M.; Markosyan, Suren; Noskov, Sergei
2014-01-01
We developed a novel scheme based on Grand-Canonical Monte-Carlo/Brownian Dynamics (GCMC/BD) simulations and extended it to studies of ion currents across three nanopores with potential for ssDNA sequencing: the solid-state Si3N4 nanopore, α-hemolysin, and the E111N/M113Y/K147N mutant. To describe nucleotide-specific ion dynamics compatible with a coarse-grained ssDNA model, we used the Inverse Monte-Carlo protocol, which maps the relevant ion-nucleotide distribution functions from all-atom MD simulations. Combined with the previously developed simulation platform for Brownian Dynamics (BD) simulations of ion transport, it allows for microsecond- and millisecond-long simulations of ssDNA dynamics in nanopores with a conductance computation accuracy that equals or exceeds that of all-atom MD simulations. In spite of the simplifications, the protocol produces results that agree with the results of previous studies on ion conductance across open channels and provide direct correlations with experimentally measured blockade currents and ion conductances that have been estimated from all-atom MD simulations. PMID:24738152
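The structure-matching idea behind such coarse-graining can be illustrated by the simpler iterative Boltzmann-style update (the actual Inverse Monte-Carlo protocol additionally uses cross-correlation information, so this is only a related sketch):

\[
V_{n+1}(r) = V_n(r) + k_B T\,\ln\frac{g_n(r)}{g_{\mathrm{target}}(r)},
\]

where g_target(r) is the ion-nucleotide radial distribution function from all-atom MD and g_n(r) is the distribution produced by the coarse-grained model at iteration n.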
Performance of the NASA Airborne Radar with the Windshear Database for Forward-Looking Systems
NASA Technical Reports Server (NTRS)
Switzer, George F.; Britt, Charles L.
1996-01-01
This document describes the simulation approach used to test the performance of the NASA airborne windshear radar. An explanation of the actual radar hardware and processing algorithms provides an understanding of the parameters used in the simulation program. This report also contains a brief overview of the NASA airborne windshear radar experimental flight test results. A description of the radar simulation program shows the capabilities of the program and the techniques used for certification evaluation. Simulation of the NASA radar comprises three steps. First, the ground clutter data must be chosen. The ground clutter is the return from objects in or near an airport facility. The choice of the ground clutter also dictates the aircraft flight path, since ground clutter is gathered in flight. The second step is the choice of the radar parameters and the running of the simulation program, which properly combines the ground clutter data with simulated windshear weather data. The simulated windshear weather data comprise a number of Terminal Area Simulation System (TASS) model results. The final step is the comparison of the radar simulation results to the known windshear database. The final evaluation of the radar simulation is based on the ability to detect hazardous windshear with the aircraft at a safe distance while at the same time not displaying false alerts.
Yılmaz, Bülent; Çiftçi, Emre
2013-06-01
Extracorporeal Shock Wave Lithotripsy (ESWL) is based on disintegration of the kidney stone by delivering high-energy shock waves that are created outside the body and transmitted through the skin and body tissues. Nowadays, high-energy shock waves are also used in orthopedic operations and are being investigated for use in the treatment of myocardial infarction and cancer. Because of these new application areas, novel lithotriptor designs are needed for different kinds of treatment strategies. In this study, our aim was to develop a versatile computer simulation environment that would give device designers working on various medical applications that use the shock wave principle a substantial amount of flexibility while testing the effects of new parameters such as reflector size, material properties of the medium, water temperature, and different clinical scenarios. For this purpose, we created a finite-difference time-domain (FDTD)-based computational model in which most of the physical system parameters were defined as inputs and/or as variables in the simulations. We constructed a realistic computational model of a commercial electrohydraulic lithotriptor and optimized our simulation program using the results that were obtained by the manufacturer in an experimental setup. We then compared the simulation results with the results from an experimental setup in which the oxygen level in water was varied. Finally, we studied the effects of changing input parameters such as ellipsoid size and material, temperature change in the wave propagation media, and shock wave source point misalignment. The simulation results were consistent with the experimental results and the expected effects of variation in physical parameters of the system. The results of this study encourage further investigation and provide adequate evidence that the numerical modeling of a shock wave therapy system is feasible and can provide a practical means to test novel ideas in new device design procedures. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
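To convey the flavour of the time-domain scheme, a toy one-dimensional acoustic FDTD update (pressure/velocity leapfrog) is sketched below; the grid, medium properties, and source are illustrative and bear no relation to the 3D lithotripter model described above.

```python
import numpy as np

# Minimal 1D acoustic FDTD on a water-like medium: a toy illustration of
# the time-domain scheme, not the 3D lithotripter model.
c, rho = 1500.0, 1000.0          # sound speed (m/s), density (kg/m^3)
nx, dx = 400, 1e-4               # grid size and spacing
dt = 0.5 * dx / c                # CFL-stable time step

p = np.zeros(nx)                 # pressure at integer time steps
u = np.zeros(nx + 1)             # particle velocity at half steps (staggered)

for n in range(600):
    # Update velocity from the pressure gradient.
    u[1:-1] -= dt / (rho * dx) * (p[1:] - p[:-1])
    # Update pressure from the velocity divergence.
    p -= dt * rho * c**2 / dx * (u[1:] - u[:-1])
    # Soft source: a short pressure pulse injected near the left edge.
    p[5] += np.exp(-((n - 60) / 20.0) ** 2)

print(f"peak pressure after {n + 1} steps: {p.max():.3e} (arbitrary units)")
```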
Tensile strength of simulated and welded butt joints in W-Cu composite sheet
NASA Technical Reports Server (NTRS)
Moore, Thomas J.; Watson, Gordon K.
1994-01-01
The weldability of W-Cu composite sheet was investigated using simulated and welded joints. The welded joints were produced in a vacuum hot press. Tensile test results showed that simulated joints can provide strength and failure mode data which can be used in joint design for actual weldments. Although all of the welded joints had flaws, a number of these joints were as strong as the W-Cu composite base material.
Capabilities of stochastic rainfall models as data providers for urban hydrology
NASA Astrophysics Data System (ADS)
Haberlandt, Uwe
2017-04-01
For planning of urban drainage systems using hydrological models, long, continuous precipitation series with high temporal resolution are needed. Since observed time series are often too short or not available everywhere, the use of synthetic precipitation is a common alternative. This contribution compares three precipitation models regarding their suitability to provide 5 minute continuous rainfall time series for a) sizing of drainage networks for urban flood protection and b) dimensioning of combined sewage systems for pollution reduction. The rainfall models are a parametric stochastic model (Haberlandt et al., 2008), a non-parametric probabilistic approach (Bárdossy, 1998) and a stochastic downscaling of dynamically simulated rainfall (Berg et al., 2013); all models are operated both as single site and multi-site generators. The models are applied with regionalised parameters assuming that there is no station at the target location. Rainfall and discharge characteristics are utilised for evaluation of the model performance. The simulation results are compared against results obtained from reference rainfall stations not used for parameter estimation. The rainfall simulations are carried out for the federal states of Baden-Württemberg and Lower Saxony in Germany and the discharge simulations for the drainage networks of the cities of Hamburg, Brunswick and Freiburg. Altogether, the results show comparable simulation performance for the three models, good capabilities for single site simulations but low skills for multi-site simulations. Remarkably, there is no significant difference in simulation performance comparing the tasks flood protection with pollution reduction, so the models are finally able to simulate both the extremes and the long term characteristics of rainfall equally well.
Bárdossy, A., 1998. Generating precipitation time series using simulated annealing. Wat. Resour. Res., 34(7): 1737-1744.
Berg, P., Wagner, S., Kunstmann, H., Schädler, G., 2013. High resolution regional climate model simulations for Germany: part I — validation. Climate Dynamics, 40(1): 401-414.
Haberlandt, U., Ebner von Eschenbach, A.-D., Buchwald, I., 2008. A space-time hybrid hourly rainfall model for derived flood frequency analysis. Hydrol. Earth Syst. Sci., 12: 1353-1367.
NASA Astrophysics Data System (ADS)
Mosumgaard, Jakob Rørsted; Ball, Warrick H.; Aguirre, Víctor Silva; Weiss, Achim; Christensen-Dalsgaard, Jørgen
2018-06-01
Stellar evolution codes play a major role in present-day astrophysics, yet they share common simplifications related to the outer layers of stars. We seek to improve on this by using results from realistic and highly detailed 3D hydrodynamics simulations of stellar convection. We implement a temperature stratification extracted directly from the 3D simulations into two stellar evolution codes to replace the simplified atmosphere normally used. Our implementation also contains a non-constant mixing-length parameter, which varies as a function of the stellar surface gravity and temperature, also derived from the 3D simulations. We give a detailed account of our fully consistent implementation, compare to earlier works, and provide a freely available MESA module. The evolution of low-mass stars with different masses is investigated, and we present for the first time an asteroseismic analysis of a standard solar model utilising calibrated convection and temperature stratification from 3D simulations. We show that the inclusion of 3D results has an almost insignificant impact on the evolution and structure of stellar models; the largest effects are changes in effective temperature of order 30 K, seen in the pre-main sequence and on the red-giant branch. However, this work provides the first step towards producing self-consistent evolutionary calculations using fully incorporated 3D atmospheres from on-the-fly interpolation in grids of simulations.
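A minimal sketch of the kind of on-the-fly lookup such an implementation needs is shown below; the grid values of the mixing-length parameter are placeholders, not the published calibration.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical grid of calibrated mixing-length parameters from 3D
# simulations, tabulated over effective temperature and surface gravity
# (values are placeholders, not the published calibration).
teff = np.linspace(4500.0, 6500.0, 5)       # K
logg = np.linspace(3.5, 5.0, 4)             # dex (cgs)
alpha = 1.8 + 0.1 * np.random.default_rng(3).random((5, 4))

alpha_of = RegularGridInterpolator((teff, logg), alpha)

# During the evolution, look up alpha for the model's current surface.
print(alpha_of([[5777.0, 4.44]]))           # roughly solar surface values
```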
Control and Communication for a Secure and Reconfigurable Power Distribution System
NASA Astrophysics Data System (ADS)
Giacomoni, Anthony Michael
A major transformation is taking place throughout the electric power industry to overlay existing electric infrastructure with advanced sensing, communications, and control system technologies. This transformation to a smart grid promises to enhance system efficiency, increase system reliability, support the electrification of transportation, and provide customers with greater control over their electricity consumption. Upgrading control and communication systems for the end-to-end electric power grid, however, will present many new security challenges that must be dealt with before extensive deployment and implementation of these technologies can begin. In this dissertation, a comprehensive systems approach is taken to minimize and prevent cyber-physical disturbances to electric power distribution systems using sensing, communications, and control system technologies. To accomplish this task, an intelligent distributed secure control (IDSC) architecture is presented and validated in silico for distribution systems to provide greater adaptive protection, with the ability to proactively reconfigure, and rapidly respond to disturbances. Detailed descriptions of functionalities at each layer of the architecture as well as the whole system are provided. To compare the performance of the IDSC architecture with that of other control architectures, an original simulation methodology is developed. The simulation model integrates aspects of cyber-physical security, dynamic price and demand response, sensing, communications, intermittent distributed energy resources (DERs), and dynamic optimization and reconfiguration. Applying this comprehensive systems approach, performance results for the IEEE 123 node test feeder are simulated and analyzed. The results show the trade-offs between system reliability, operational constraints, and costs for several control architectures and optimization algorithms. Additional simulation results are also provided. In particular, the advantages of an IDSC architecture are highlighted when an intermittent DER is present on the system.
NASA Technical Reports Server (NTRS)
Grantham, William D.
1989-01-01
The primary objective was to provide information to the flight controls/flying qualities engineer that will assist him in determining the incremental flying qualities and/or pilot-performance differences that may be expected between results obtained via ground-based simulation (and, in particular, the six-degree-of-freedom Langley Visual/Motion Simulator (VMS)) and flight tests. Pilot opinion and performance parameters derived from a ground-based simulator and an in-flight simulator are compared for a jet-transport airplane having 32 different longitudinal dynamic response characteristics. The primary pilot tasks were the approach and landing tasks, with emphasis on the landing-flare task. The results indicate that, in general, flying qualities results obtained from the ground-based simulator may be considered conservative, especially when the pilot task requires tight pilot control, as during the landing flare. The one exception to this, according to the present study, was that the pilots were more tolerant of large time delays in the airplane response on the ground-based simulator. The results also indicated that the ground-based simulator (particularly the Langley VMS) is not adequate for assessing pilot/vehicle performance capabilities (i.e., the sink-rate performance for the landing-flare task when the pilot has little depth/height perception from the outside scene presentation).
IMPROVING TACONITE PROCESSING PLANT EFFICIENCY BY COMPUTER SIMULATION, Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
William M. Bond; Salih Ersayin
2007-03-30
This project involved industrial-scale testing of a mineral processing simulator to improve the efficiency of a taconite processing plant, namely the Minorca mine. The Concentrator Modeling Center at the Coleraine Minerals Research Laboratory, University of Minnesota Duluth, enhanced the capabilities of available software, Usim Pac, by developing mathematical models needed for accurate simulation of taconite plants. This project provided funding for this technology to prove itself in the industrial environment. As the first step, data representing existing plant conditions were collected by sampling and sample analysis. Data were then balanced and provided a basis for assessing the efficiency of individual devices and the plant, and also for performing simulations aimed at improving plant efficiency. Performance evaluation served as a guide in developing alternative process strategies for more efficient production. A large number of computer simulations were then performed to quantify the benefits and effects of implementing these alternative schemes. Modification of makeup ball size was selected as the most feasible option for the target performance improvement. This was combined with replacement of existing hydrocyclones with more efficient ones. After plant implementation of these modifications, plant sampling surveys were carried out to validate findings of the simulation-based study. Plant data showed very good agreement with the simulated data, confirming results of the simulation. After the implementation of modifications in the plant, several upstream bottlenecks became visible. Despite these bottlenecks limiting full capacity, a concentrator energy improvement of 7% was obtained. Further improvements in energy efficiency are expected in the near future. The success of this project demonstrated the feasibility of a simulation-based approach. Currently, the Center provides simulation-based service to all the iron ore mining companies operating in northern Minnesota, and future proposals are pending with non-taconite mineral processing applications.
Soranno, Andrea; Holla, Andrea; Dingfelder, Fabian; Nettels, Daniel; Makarov, Dmitrii E.; Schuler, Benjamin
2017-01-01
Internal friction is an important contribution to protein dynamics at all stages along the folding reaction. Even in unfolded and intrinsically disordered proteins, internal friction has a large influence, as demonstrated with several experimental techniques and in simulations. However, these methods probe different facets of internal friction and have been applied to disparate molecular systems, raising questions regarding the compatibility of the results. To obtain an integrated view, we apply here the combination of two complementary experimental techniques, simulations, and theory to the same system: unfolded protein L. We use single-molecule Förster resonance energy transfer (FRET) to measure the global reconfiguration dynamics of the chain, and photoinduced electron transfer (PET), a contact-based method, to quantify the rate of loop formation between two residues. This combination enables us to probe unfolded-state dynamics on different length scales, corresponding to different parts of the intramolecular distance distribution. Both FRET and PET measurements show that internal friction dominates unfolded-state dynamics at low denaturant concentration, and the results are in remarkable agreement with recent large-scale molecular dynamics simulations using a new water model. The simulations indicate that intrachain interactions and dihedral angle rotation correlate with the presence of internal friction, and theoretical models of polymer dynamics provide a framework for interrelating the contribution of internal friction observed in the two types of experiments and in the simulations. The combined results thus provide a coherent and quantitative picture of internal friction in unfolded proteins that could not be attained from the individual techniques. PMID:28223518
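A relation often used in this context to separate the two contributions (a standard decomposition, not a result quoted from the abstract) expresses the observed relaxation time as a linear function of solvent viscosity,

\[
\tau_{\mathrm{obs}}(\eta) \approx \tau_{\mathrm{solvent}}\,\frac{\eta}{\eta_0} + \tau_{\mathrm{int}},
\]

so that extrapolating measurements to zero viscosity leaves the internal-friction time \(\tau_{\mathrm{int}}\) as the intercept.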
SolarTherm: A flexible Modelica-based simulator for CSP systems
NASA Astrophysics Data System (ADS)
Scott, Paul; Alonso, Alberto de la Calle; Hinkley, James T.; Pye, John
2017-06-01
Annual performance simulations provide a valuable tool for analysing the viability and overall impact of different concentrating solar power (CSP) component and system designs. However, existing tools work best with conventional systems and are difficult or impossible to adapt when novel components, configurations and operating strategies are of interest. SolarTherm is a new open source simulation tool that fulfils this need for the solar community. It includes a simulation framework and a library of flexible CSP components and control strategies that can be adapted or replaced with new designs to meet the special needs of end users. This paper provides an introduction to SolarTherm and a comparison of models for an energy-based trough system and a physical tower system to those in the well-established and widely-used simulator SAM. Differences were found in some components where the inner workings of SAM are undocumented or not well understood, while the other parts show strong agreement. These results help to validate the fundamentals of SolarTherm and demonstrate that, while at an early stage of development, it is already a useful tool for performing annual simulations.
A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James
2011-11-01
Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
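A minimal sketch of the side-by-side time-series and scatter comparison such a tool automates is given below; the synthetic hourly data, units, and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical hourly series for one building object (e.g. whole-building
# electricity); in practice these would come from meters and from the
# simulation output, respectively.
idx = pd.date_range("2011-01-01", periods=24 * 7, freq="h")
rng = np.random.default_rng(4)
hours = np.arange(len(idx))
measured = 50 + 20 * np.sin(hours * 2 * np.pi / 24) + rng.normal(0, 3, len(idx))
simulated = 48 + 22 * np.sin(hours * 2 * np.pi / 24)
df = pd.DataFrame({"measured": measured, "simulated": simulated}, index=idx)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 3.5))
df.plot(ax=ax1, title="Time series")                 # overlaid time-series plot
ax2.scatter(df["simulated"], df["measured"], s=8)    # measured vs simulated scatter
ax2.set_xlabel("simulated [kW]")
ax2.set_ylabel("measured [kW]")
ax2.set_title("Scatter")
fig.tight_layout()
plt.show()
```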
A Laboratory Glass-Cockpit Flight Simulator for Automation and Communications Research
NASA Technical Reports Server (NTRS)
Pisanich, Gregory M.; Heers, Susan T.; Shafto, Michael G. (Technical Monitor)
1995-01-01
A laboratory glass-cockpit flight simulator supporting research on advanced commercial flight deck and Air Traffic Control (ATC) automation and communication interfaces has been developed at the Aviation Operations Branch at the NASA Ames Research Center. This system provides independent and integrated flight and ATC simulator stations, party line voice and datalink communications, along with video and audio monitoring and recording capabilities. Over the last several years, it has been used to support the investigation of flight human factors research issues involving: communication modality; message content and length; graphical versus textual presentation of information, and human accountability for automation. This paper updates the status of this simulator, describing new functionality in the areas of flight management system, EICAS display, and electronic checklist integration. It also provides an overview of several experiments performed using this simulator, including their application areas and results. Finally future enhancements to its ATC (integration of CTAS software) and flight deck (full crew operations) functionality are described.
NASA Astrophysics Data System (ADS)
Wang, XiaoLiang; Li, JiaChun
2017-12-01
A new solver based on the high-resolution scheme with novel treatments of source terms and interface capture for the Savage-Hutter model is developed to simulate granular avalanche flows. The capability to simulate flow spread and deposit processes is verified through indoor experiments of a two-dimensional granular avalanche. Parameter studies show that reduction in bed friction enhances runout efficiency, and that lower earth pressure restraints enlarge the deposit spread. The April 9, 2000, Yigong avalanche in Tibet, China, is simulated as a case study by this new solver. The predicted results, including evolution process, deposit spread, and hazard impacts, generally agree with site observations. It is concluded that the new solver for the Savage-Hutter equation provides a comprehensive software platform for granular avalanche simulation at both experimental and field scales. In particular, the solver can be a valuable tool for providing necessary information for hazard forecasts, disaster mitigation, and countermeasure decisions in mountainous areas.
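For reference, the one-dimensional form of the Savage-Hutter equations along a slope inclined at angle ζ is often written as follows (a standard formulation; the solver above works with the full model):

\[
\frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x} = 0,
\qquad
\frac{\partial (hu)}{\partial t} + \frac{\partial}{\partial x}\left(hu^{2} + \tfrac{1}{2}k_{a/p}\,g\cos\zeta\,h^{2}\right)
= gh\sin\zeta - \operatorname{sgn}(u)\,\tan\delta\,gh\cos\zeta,
\]

where h is the flow depth, u the depth-averaged velocity, δ the bed friction angle, and k_{a/p} the active/passive earth pressure coefficient; the roles of bed friction and earth pressure in the parameter study above enter through the last two terms.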
A 3D virtual reality simulator for training of minimally invasive surgery.
Mi, Shao-Hua; Hou, Zeng-Gunag; Yang, Fan; Xie, Xiao-Liang; Bian, Gui-Bin
2014-01-01
For the last decade, remarkable progress has been made in the field of cardiovascular disease treatment. However, these complex medical procedures require a combination of rich experience and technical skills. In this paper, a 3D virtual reality simulator for core skills training in minimally invasive surgery is presented. The system can generate realistic 3D vascular models segmented from patient datasets, including a beating heart, and provide a real-time computation of force and force feedback module for surgical simulation. Instruments, such as a catheter or guide wire, are represented by a multi-body mass-spring model. In addition, a realistic user interface with multiple windows and real-time 3D views are developed. Moreover, the simulator is also provided with a human-machine interaction module that gives doctors the sense of touch during the surgery training, enables them to control the motion of a virtual catheter/guide wire inside a complex vascular model. Experimental results show that the simulator is suitable for minimally invasive surgery training.
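A toy version of the multi-body mass-spring idea is sketched below (semi-implicit Euler on a 2D chain with one end held fixed); the parameters are illustrative only, and no claim is made that this matches the simulator's actual instrument model.

```python
import numpy as np

# Toy 2D mass-spring chain standing in for a catheter/guide-wire model:
# point masses connected by stiff springs, advanced with semi-implicit
# (symplectic) Euler. All parameters are illustrative.
n, rest, k, m, damping, dt = 20, 0.01, 500.0, 1e-3, 0.5, 1e-4
pos = np.column_stack([np.arange(n) * rest, np.zeros(n)])
vel = np.zeros_like(pos)
gravity = np.array([0.0, -9.81])

for step in range(2000):
    force = np.tile(m * gravity, (n, 1))
    # Hooke's law between neighbouring masses.
    d = pos[1:] - pos[:-1]
    length = np.linalg.norm(d, axis=1, keepdims=True)
    f_spring = k * (length - rest) * d / length
    force[:-1] += f_spring          # pull mass i toward mass i+1
    force[1:] -= f_spring           # equal and opposite reaction
    force -= damping * vel          # simple velocity damping
    vel += dt * force / m
    pos += dt * vel
    pos[0] = (0.0, 0.0)             # proximal end held fixed by the user
    vel[0] = 0.0

print("tip position:", pos[-1])
```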
Comparing Molecular Dynamics Force Fields in the Essential Subspace
Gomez-Puertas, Paulino; Boomsma, Wouter; Lindorff-Larsen, Kresten
2015-01-01
The continued development and utility of molecular dynamics simulations requires improvements in both the physical models used (force fields) and in our ability to sample the Boltzmann distribution of these models. Recent developments in both areas have made available multi-microsecond simulations of two proteins, ubiquitin and Protein G, using a number of different force fields. Although these force fields mostly share a common mathematical form, they differ in their parameters and in the philosophy by which these were derived, and previous analyses showed varying levels of agreement with experimental NMR data. To complement the comparison to experiments, we have performed a structural analysis of and comparison between these simulations, thereby providing insight into the relationship between force-field parameterization, the resulting ensemble of conformations and the agreement with experiments. In particular, our results show that, at a coarse level, many of the motional properties are preserved across several, though not all, force fields. At a finer level of detail, however, there are distinct differences in both the structure and dynamics of the two proteins, which can, together with comparison with experimental data, help to select force fields for simulations of proteins. A noteworthy observation is that force fields that have been reparameterized and improved to provide a more accurate energetic description of the balance between helical and coil structures are difficult to distinguish from their “unbalanced” counterparts in these simulations. This observation implies that simulations of stable, folded proteins, even those reaching 10 microseconds in length, may provide relatively little information that can be used to modify torsion parameters to achieve an accurate balance between different secondary structural elements. PMID:25811178
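For readers unfamiliar with the essential-subspace construction, the sketch below performs the underlying principal component analysis on (synthetic) aligned trajectory coordinates; the trajectory data are random stand-ins, not the ubiquitin or Protein G simulations discussed above.

```python
import numpy as np

# Essential-dynamics style analysis: PCA of Cartesian coordinates from a
# hypothetical trajectory of n_frames structures with n_atoms atoms.
rng = np.random.default_rng(5)
n_frames, n_atoms = 1000, 76
traj = rng.normal(size=(n_frames, n_atoms * 3))   # stand-in for aligned coordinates

x = traj - traj.mean(axis=0)                      # remove the mean structure
cov = x.T @ x / (n_frames - 1)                    # coordinate covariance matrix
evals, evecs = np.linalg.eigh(cov)                # eigenvalues in ascending order
evals, evecs = evals[::-1], evecs[:, ::-1]        # sort descending

# Fraction of the total fluctuation captured by the first few modes
# defines the "essential subspace".
explained = evals[:5] / evals.sum()
projection = x @ evecs[:, :2]                     # trajectory projected on PC1/PC2
print(explained, projection.shape)
```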
Robustness and Uncertainty: Applications for Policy in Climate and Hydrological Modeling
NASA Astrophysics Data System (ADS)
Fields, A. L., III
2015-12-01
Policymakers must often decide how to proceed when presented with conflicting simulation data from hydrological, climatological, and geological models. While laboratory sciences often appeal to the reproducibility of results to argue for the validity of their conclusions, simulations cannot use this strategy for a number of pragmatic and methodological reasons. However, robustness of predictions and causal structures can serve the same function for simulations as reproducibility does for laboratory experiments and field observations in either adjudicating between conflicting results or showing that there is insufficient justification to externally validate the results. Additionally, an interpretation of the argument from robustness is presented that involves appealing to the convergence of many well-built and diverse models rather than the more common version which involves appealing to the probability that one of a set of models is likely to be true. This interpretation strengthens the case for taking robustness as an additional requirement for the validation of simulation results and ultimately supports the idea that computer simulations can provide information about the world that is just as trustworthy as data from more traditional laboratory studies and field observations. Understanding the importance of robust results for the validation of simulation data is especially important for policymakers making decisions on the basis of potentially conflicting models. Applications will span climate, hydrological, and hydroclimatological models.
NASA Technical Reports Server (NTRS)
1978-01-01
A hybrid-computer simulation of the over-the-wing turbofan engine was constructed to develop the dynamic design of the control. This engine and control system includes a full-authority digital electronic control using compressor stator reset to achieve fast thrust response and a modified Kalman filter to correct for sensor failures. Fast thrust response for powered-lift operations and accurate, fast-responding, steady-state control of the engine is provided. Simulation results for throttle bursts from 62 to 100 percent takeoff thrust predict that the engine will accelerate from 62 to 95 percent takeoff thrust in one second.
Implementation of quantum game theory simulations using Python
NASA Astrophysics Data System (ADS)
Madrid S., A.
2013-05-01
This paper provides some examples of quantum games simulated in the Python programming language. The quantum games have been developed with the SymPy Python library, which permits solving quantum problems in symbolic form. Applying these methods of quantum mechanics to game theory makes it possible to obtain results that were not attainable before. To illustrate these methods, the quantum battle of the sexes, the prisoner's dilemma, and card games have been simulated. The resulting solutions are able to overcome the classical bottleneck and yield optimal quantum strategies. In this way, Python demonstrates that it is possible to implement more advanced and complicated quantum game algorithms.
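The abstract's games were implemented symbolically with SymPy; the sketch below instead uses NumPy to run the standard Eisert-Wilkens-Lewenstein (EWL) version of the quantum prisoner's dilemma, one of the games mentioned. The payoff matrix and strategy parameterization are the usual textbook choices and are assumptions here, not necessarily the author's.

```python
import numpy as np

# Standard EWL quantum prisoner's dilemma (NumPy sketch; the paper itself uses SymPy).
I2 = np.eye(2)
C = I2                                  # "cooperate" = identity
D = np.array([[0, 1], [-1, 0]])         # "defect"
Q = np.diag([1j, -1j])                  # quantum strategy that escapes the dilemma

# Maximally entangling gate J for gamma = pi/2
J = (np.kron(I2, I2) + 1j * np.kron(D, D)) / np.sqrt(2)
Jdg = J.conj().T

payoff_A = np.array([3, 0, 5, 1])       # basis order: |CC>, |CD>, |DC>, |DD>
payoff_B = np.array([3, 5, 0, 1])

def payoffs(UA, UB):
    psi0 = np.array([1, 0, 0, 0], dtype=complex)   # both players start in |C>
    psi = Jdg @ np.kron(UA, UB) @ J @ psi0
    p = np.abs(psi) ** 2
    return p @ payoff_A, p @ payoff_B

print("D vs D:", payoffs(D, D))   # classical mutual defection -> payoffs (1, 1)
print("Q vs Q:", payoffs(Q, Q))   # quantum strategies recover (3, 3)
```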
Settgast, Randolph R.; Fu, Pengcheng; Walsh, Stuart D. C.; ...
2016-09-18
This study describes a fully coupled finite element/finite volume approach for simulating field-scale hydraulically driven fractures in three dimensions, using massively parallel computing platforms. The proposed method is capable of capturing realistic representations of local heterogeneities, layering and natural fracture networks in a reservoir. A detailed description of the numerical implementation is provided, along with numerical studies comparing the model with both analytical solutions and experimental results. The results demonstrate the effectiveness of the proposed method for modeling large-scale problems involving hydraulically driven fractures in three dimensions.
Accuracy of Monte Carlo simulations compared to in-vivo MDCT dosimetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bostani, Maryam, E-mail: mbostani@mednet.ucla.edu; McMillan, Kyle; Cagnon, Chris H.
Purpose: The purpose of this study was to assess the accuracy of a Monte Carlo simulation-based method for estimating radiation dose from multidetector computed tomography (MDCT) by comparing simulated doses in ten patients to in-vivo dose measurements. Methods: MD Anderson Cancer Center Institutional Review Board approved the acquisition of in-vivo rectal dose measurements in a pilot study of ten patients undergoing virtual colonoscopy. The dose measurements were obtained by affixing TLD capsules to the inner lumen of rectal catheters. Voxelized patient models were generated from the MDCT images of the ten patients, and the dose to the TLD for all exposures was estimated using Monte Carlo based simulations. The Monte Carlo simulation results were compared to the in-vivo dose measurements to determine accuracy. Results: The calculated mean percent difference between TLD measurements and Monte Carlo simulations was −4.9% with standard deviation of 8.7% and a range of −22.7% to 5.7%. Conclusions: The results of this study demonstrate very good agreement between simulated and measured doses in-vivo. Taken together with previous validation efforts, this work demonstrates that the Monte Carlo simulation methods can provide accurate estimates of radiation dose in patients undergoing CT examinations.
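For readers who want to reproduce the comparison metric, the following Python fragment shows how the reported statistics (mean, standard deviation, and range of the percent difference) are formed; the dose values in it are placeholders, not the study's data.

```python
import numpy as np

# Sketch of the comparison metric: percent difference of simulation vs. TLD measurement.
# The dose values below are placeholders, not the study's data.
measured = np.array([12.1, 10.8, 15.3, 9.7, 11.4])    # mGy (hypothetical)
simulated = np.array([11.5, 10.9, 14.1, 9.2, 11.8])   # mGy (hypothetical)

pct_diff = 100.0 * (simulated - measured) / measured
print(f"mean = {pct_diff.mean():.1f}%, sd = {pct_diff.std(ddof=1):.1f}%, "
      f"range = [{pct_diff.min():.1f}%, {pct_diff.max():.1f}%]")
```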
Simulation debriefing based on principles of transfer of learning: A pilot study.
Johnston, Sandra; Coyer, Fiona; Nash, Robyn
2017-09-01
Upon completion of undergraduate nursing courses, new graduates are expected to transition seamlessly into practice. Education providers face challenges in the preparation of undergraduate nurses due to increasing student numbers and decreasing availability of clinical placement sites. High fidelity patient simulation is an integral component of nursing curricula as an adjunct to preparation for clinical placement. Debriefing after simulation is an area where the underlying structure of problems can consciously be explored. When central principles of problems are identified, they can then be used in situations that differ from the simulation experience. Third year undergraduate nursing students participated in a pilot study conducted to test a debriefing intervention where the intervention group (n=7) participated in a simulation, followed by a debriefing based on transfer of learning principles. The control group (n=5) participated in a simulation of the same scenario, followed by a standard debriefing. Students then attended focus group interviews. The results of this pilot test provided preliminary information that the debriefing approach based on transfer of learning principles may be a useful way for student nurses to learn from a simulated experience and consider the application of learning to future clinical encounters. Copyright © 2017 Elsevier Ltd. All rights reserved.
Navier-Stokes Simulation of a Heavy Lift Slowed-Rotor Compound Helicopter Configuration
NASA Technical Reports Server (NTRS)
Allan, Brian G.; Jenkins, Luther N.; Yao, Chung-Sheng; Bartram, Scott M.; Hallissy, Jim B.; Harris, Jerome; Noonan, Kevin W.; Wong, Oliver D.; Jones, Henry E.; Malovrh, Brendon D.;
2009-01-01
Time-accurate numerical simulations were performed using the Reynolds-averaged Navier-Stokes (RANS) flow solver OVERFLOW for a heavy lift, slowed-rotor, compound helicopter configuration, tested at the NASA Langley 14- by 22-Foot Subsonic Tunnel. The primary purpose of these simulations was to provide support for the development of a large field of view Particle Imaging Velocimetry (PIV) flow measurement technique supported by the Subsonic Rotary Wing (SRW) project under the NASA Fundamental Aeronautics program. These simulations provided a better understanding of the rotor and body wake flows and helped to define PIV measurement locations as well as requirements for validation of flow solver codes. The large field PIV system can measure the three-dimensional velocity flow field in a 0.914 m by 1.83 m plane. PIV measurements were performed upstream and downstream of the vertical tail section and are compared to simulation results. The simulations are also used to better understand the tunnel wall and body/rotor support effects by comparing simulations with and without tunnel floor/ceiling walls and supports. Comparisons are also made to the experimental force and moment data for the body and rotor.
NASA Astrophysics Data System (ADS)
Liu, C. M.
2017-12-01
Wave properties predicted by the rigid-lid and the free-surface Boussinesq equations for a two-fluid system are theoretically calculated and compared in this study. Boussinesq models are generally applied to numerically simulate surface waves in coastal regions, providing credible information for disaster prevention and breakwater design. As for internal waves, Liu et al. (2008) and Liu (2016) respectively derived a free-surface model and a rigid-lid Boussinesq model for a two-fluid system. The former and the latter models contain four and three key variables, respectively, which may lead to different results and computational efficiency in simulations. Therefore, the present study compares results calculated theoretically with these two models to provide more detailed observations and useful information on the motion of internal waves.
Liu, Yanhui; Zhang, Peihua
2016-09-01
This paper presents a study of the compression behaviors of fully covered biodegradable polydioxanone biliary stents (FCBPBs) developed for the human body by the finite element method. To investigate the relationship between the compression force and the structure parameters (monofilament diameter and braid-pin number), nine numerical models based on an actual biliary stent were established. The simulation and experimental results are in good agreement with each other when the compression force is calculated from both experiment and simulation, indicating that the simulation results can provide a useful reference for the investigation of biliary stents. The stress distribution on FCBPBs was studied to optimize the structure of FCBPBs. In addition, the plastic dissipation analysis and plastic strain of FCBPBs were obtained via the compression simulation, revealing the effect of the structure parameters on the tolerance. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yan, Xuewei; Wang, Run'nan; Xu, Qingyan; Liu, Baicheng
2017-04-01
Mathematical models for dynamic heat radiation and convection boundaries in directional solidification processes are established to simulate the temperature fields. The cellular automaton (CA) method and the Kurz-Giovanola-Trivedi (KGT) growth model are used to describe nucleation and growth. Primary dendritic arm spacing (PDAS) and secondary dendritic arm spacing (SDAS) are calculated by the Ma-Sham (MS) and Furer-Wunderlin (FW) models, respectively. The mushy zone shape is investigated based on the temperature fields for both the high-rate solidification (HRS) and liquid metal cooling (LMC) processes. The evolution of the microstructure and the crystallographic orientation are analyzed by simulation and by the electron back-scattered diffraction (EBSD) technique, respectively. Comparison of the simulated PDAS and SDAS with experimental results reveals good agreement. The results show that the LMC process can provide both dendritic refinement and superior casting performance due to the increased cooling rate and thermal gradient.
Neoproterozoic 'snowball Earth' simulations with a coupled climate/ice-sheet model.
Hyde, W T; Crowley, T J; Baum, S K; Peltier, W R
2000-05-25
Ice sheets may have reached the Equator in the late Proterozoic era (600-800 Myr ago), according to geological and palaeomagnetic studies, possibly resulting in a 'snowball Earth'. But this period was a critical time in the evolution of multicellular animals, posing the question of how early life survived under such environmental stress. Here we present computer simulations of this unusual climate stage with a coupled climate/ice-sheet model. To simulate a snowball Earth, we use only a reduction in the solar constant compared to present-day conditions and we keep atmospheric CO2 concentrations near present levels. We find rapid transitions into and out of full glaciation that are consistent with the geological evidence. When we combine these results with a general circulation model, some of the simulations result in an equatorial belt of open water that may have provided a refugium for multicellular animals.
Grummer, Jared A; Bryson, Robert W; Reeder, Tod W
2014-03-01
Current molecular methods of species delimitation are limited by the types of species delimitation models and scenarios that can be tested. Bayes factors allow for more flexibility in testing non-nested species delimitation models and hypotheses of individual assignment to alternative lineages. Here, we examined the efficacy of Bayes factors in delimiting species through simulations and empirical data from the Sceloporus scalaris species group. Marginal-likelihood scores of competing species delimitation models, from which Bayes factor values were compared, were estimated with four different methods: harmonic mean estimation (HME), smoothed harmonic mean estimation (sHME), path-sampling/thermodynamic integration (PS), and stepping-stone (SS) analysis. We also performed model selection using a posterior simulation-based analog of the Akaike information criterion through Markov chain Monte Carlo analysis (AICM). Bayes factor species delimitation results from the empirical data were then compared with results from the reversible-jump MCMC (rjMCMC) coalescent-based species delimitation method Bayesian Phylogenetics and Phylogeography (BP&P). Simulation results show that HME and sHME perform poorly compared with PS and SS marginal-likelihood estimators when identifying the true species delimitation model. Furthermore, Bayes factor delimitation (BFD) of species showed improved performance when species limits are tested by reassigning individuals between species, as opposed to either lumping or splitting lineages. In the empirical data, BFD through PS and SS analyses, as well as the rjMCMC method, each provide support for the recognition of all scalaris group taxa as independent evolutionary lineages. Bayes factor species delimitation and BP&P also support the recognition of three previously undescribed lineages. In both simulated and empirical data sets, harmonic and smoothed harmonic mean marginal-likelihood estimators provided much higher marginal-likelihood estimates than PS and SS estimators. The AICM displayed poor repeatability in both simulated and empirical data sets, and produced inconsistent model rankings across replicate runs with the empirical data. Our results suggest that species delimitation through the use of Bayes factors with marginal-likelihood estimates via PS or SS analyses provide a useful and complementary alternative to existing species delimitation methods.
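As a small illustration of the machinery discussed above, the Python sketch below turns two marginal log-likelihoods into the 2·ln(BF) scale commonly used for interpretation and also shows the harmonic mean estimator, whose instability is what the study observed. All numerical values are synthetic assumptions, not estimates from the Sceloporus data.

```python
import numpy as np
from scipy.special import logsumexp

# Sketch: comparing two delimitation models via Bayes factors computed from
# marginal log-likelihoods, plus the (unstable) harmonic mean estimator.
# All numbers are synthetic illustrations, not values from the study.
logML_model1 = -10234.6   # e.g., from a stepping-stone analysis (hypothetical)
logML_model2 = -10241.9

two_ln_BF = 2.0 * (logML_model1 - logML_model2)
print(f"2*ln(BF) = {two_ln_BF:.1f}  (values above 10 are usually read as very strong support)")

# Harmonic mean estimator from posterior log-likelihood samples:
# log m_hat = log(n) - logsumexp(-loglik). Its reliance on the smallest sampled
# likelihoods makes it high-variance, consistent with the study's observations.
rng = np.random.default_rng(1)
loglik_samples = -10240.0 + rng.normal(0.0, 5.0, size=5000)   # synthetic posterior draws
log_hme = np.log(len(loglik_samples)) - logsumexp(-loglik_samples)
print(f"harmonic-mean estimate of log ML: {log_hme:.1f}")
```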
2007 Lunar Regolith Simulant Workshop Overview
NASA Technical Reports Server (NTRS)
McLemore, Carole A.; Fikes, John C.; Howell, Joe T.
2007-01-01
The National Aeronautics and Space Administration (NASA) vision has as a cornerstone, the establishment of an Outpost on the Moon. This Lunar Outpost will eventually provide the necessary planning, technology development, and training for a manned mission to Mars in the future. As part of the overall activity, NASA is conducting Earth-based research and advancing technologies to a Technology Readiness Level (TRL) 6 maturity under the Exploration Technology Development Program that will be incorporated into the Constellation Project as well as other projects. All aspects of the Lunar environment, including the Lunar regolith and its properties, are important in understanding the long-term impacts to hardware, scientific instruments, and humans prior to returning to the Moon and living on the Moon. With the goal of reducing risk to humans and hardware and increasing mission success on the Lunar surface, it is vital that terrestrial investigations including both development and verification testing have access to Lunar-like environments. The Marshall Space Flight Center (MSFC) is supporting this endeavor by developing, characterizing, and producing Lunar simulants in addition to analyzing existing simulants for appropriate applications. A Lunar Regolith Simulant Workshop was conducted by MSFC in Huntsville, Alabama, in October 2007. The purpose of the Workshop was to bring together simulant developers, simulant users, and program and project managers from ETDP and Constellation with the goals of understanding users' simulant needs and their applications. A status of current simulant developments such as the JSC-1A (Mare Type Simulant) and the NASA/U.S. Geological Survey Lunar Highlands-Type Pilot Simulant (NU-LHT-1M) was provided. The method for evaluating simulants, performed via Figures of Merit (FoMs) algorithms, was presented and a demonstration was provided. The four FoM properties currently being assessed are: size, shape, density, and composition. Some of the Workshop findings include: simulant developers must understand simulant users' needs and applications; higher fidelity simulants are needed and needed in larger quantities now; simulants must be characterized to allow "apples-to-apples" comparison of test results; simulant users should confer with simulant experts to assist them in the selection of simulants; safety precautions should be taken in the handling and use of simulants; shipping, storing, and preparation of simulants have important implications; and most importantly, close communications among the simulant community must be maintained and will be continued via telecoms, meetings, and an annual Lunar Regolith Simulant Workshop.
Simulation of a weather radar display for over-water airborne radar approaches
NASA Technical Reports Server (NTRS)
Clary, G. R.
1983-01-01
Airborne radar approach (ARA) concepts are being investigated as a part of NASA's Rotorcraft All-Weather Operations Research Program on advanced guidance and navigation methods. This research is being conducted using both piloted simulations and flight test evaluations. For the piloted simulations, a mathematical model of the airborne radar was developed for over-water ARAs to offshore platforms. This simulated flight scenario requires radar simulation of point targets, such as oil rigs and ships, distributed sea clutter, and transponder beacon replies. Radar theory, weather radar characteristics, and empirical data derived from in-flight radar photographs are combined to model a civil weather/mapping radar typical of those used in offshore rotorcraft operations. The resulting radar simulation is realistic and provides the needed simulation capability for ongoing ARA research.
Multiagent intelligent systems
NASA Astrophysics Data System (ADS)
Krause, Lee S.; Dean, Christopher; Lehman, Lynn A.
2003-09-01
This paper will discuss a simulation approach based upon a family of agent-based models. As the demands placed upon simulation technology by such applications as Effects Based Operations (EBO), evaluations of indicators and warnings surrounding homeland defense, and commercial needs such as financial risk management continue to grow, current single-thread simulations will continue to show serious deficiencies. The types of "what if" analysis required to support these applications demand rapidly re-configurable approaches capable of aggregating large models incorporating multiple viewpoints. The use of agent technology promises to provide a broad spectrum of models incorporating differing viewpoints through a synthesis of a collection of models. Each model would provide estimates to the overall scenario based upon its particular measure or aspect. An agent framework, denoted as the "family," would provide a common ontology in support of differing aspects of the scenario. This approach permits the future of modeling to change from viewing the problem as a single-thread simulation to taking into account multiple viewpoints from different models. Even as models are updated or replaced, the agent approach permits rapid inclusion in new or modified simulations. In this approach, the variety of low- and high-resolution information and its synthesis requires a family of models. Each agent "publishes" its support for a given measure, and each model provides its own estimates on the scenario based upon its particular measure or aspect. If more than one agent provides the same measure (e.g., cognitive), then the results from these agents are combined to form an aggregate measure response. The objective would be to inform and help calibrate a qualitative model, rather than merely to present highly aggregated statistical information. As each result is processed, the next action can then be determined. This is done by a top-level decision system that communicates with the family at the ontology level without any specific understanding of the processes (or model) behind each agent. The increasingly complex demands upon simulation to incorporate the breadth and depth of influencing factors make a family of agent-based models a promising solution. This paper will discuss that solution, with the syntax and semantics necessary to support the approach.
End-to-End QoS for Differentiated Services and ATM Internetworking
NASA Technical Reports Server (NTRS)
Su, Hongjun; Atiquzzaman, Mohammed
2001-01-01
The Internet was initially designed for non-real-time data communications and hence does not provide any Quality of Service (QoS). The next generation Internet will be characterized by high speed and QoS guarantees. The aim of this paper is to develop a prioritized early packet discard (PEPD) scheme for ATM switches to provide service differentiation and QoS guarantees to end applications running over the next generation Internet. The proposed PEPD scheme differs from previous schemes by taking into account the priority of packets generated from different applications. We develop a Markov chain model for the proposed scheme and verify the model with simulation. Numerical results show that the results from the model and computer simulation are in close agreement. Our PEPD scheme provides service differentiation to the end-to-end applications.
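To make the discard idea concrete, the sketch below simulates a generic threshold-based prioritized early packet discard buffer in slotted time: low-priority packets are dropped once occupancy reaches a threshold, high-priority packets only when the buffer is full. The thresholds, traffic model, and priority mix are illustrative assumptions and do not reproduce the authors' Markov chain model.

```python
import numpy as np

# Generic sketch of a threshold-based prioritized early packet discard buffer.
# Slotted-time approximation with illustrative parameters; not the paper's Markov model.
rng = np.random.default_rng(0)
K, T_LOW = 50, 35       # buffer size; early-discard threshold for low priority
LOAD = 1.05             # offered load slightly above capacity so that discard kicks in
HIGH_FRACTION = 0.3
SLOTS = 200_000

queue = 0
arrivals = {"high": 0, "low": 0}
drops = {"high": 0, "low": 0}

for _ in range(SLOTS):
    for _ in range(rng.poisson(LOAD)):          # batch of arrivals this slot
        prio = "high" if rng.random() < HIGH_FRACTION else "low"
        arrivals[prio] += 1
        limit = K if prio == "high" else T_LOW  # low priority is discarded early
        if queue < limit:
            queue += 1
        else:
            drops[prio] += 1
    queue = max(queue - 1, 0)                   # one packet served per slot

for prio in ("high", "low"):
    print(prio, "priority loss ratio:", round(drops[prio] / max(arrivals[prio], 1), 4))
```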
NASA Astrophysics Data System (ADS)
Liu, Dongdong; She, Dongli
2018-06-01
Current physically based erosion models do not carefully consider the dynamic variations of soil properties during rainfall and are unable to simulate saline-sodic soil slope erosion processes. The aim of this work was to build a complete model framework, SSEM, to simulate runoff and erosion processes for saline-sodic soils by coupling dynamic saturated hydraulic conductivity Ks and soil erodibility Kτ. Sixty rainfall simulation experiments (2 soil textures × 5 sodicity levels × 2 slope gradients × 3 replicates) provided data for model calibration and validation. SSEM worked very well for simulating the runoff and erosion processes of saline-sodic silty clay. The runoff and erosion processes of saline-sodic silt loam were more complex than those of non-saline soils or soils with higher clay contents; thus, SSEM did not perform very well for some validation events. We further examined the model performance of four concepts: Dynamic Ks and Kτ (Case 1, SSEM), Dynamic Ks and Constant Kτ (Case 2), Constant Ks and Dynamic Kτ (Case 3), and Constant Ks and Constant Kτ (Case 4). The results demonstrated that the model that considers dynamic variations in soil saturated hydraulic conductivity and soil erodibility can provide more reasonable runoff and erosion predictions for saline-sodic soils.
Systematic analysis of signaling pathways using an integrative environment.
Visvanathan, Mahesh; Breit, Marc; Pfeifer, Bernhard; Baumgartner, Christian; Modre-Osprian, Robert; Tilg, Bernhard
2007-01-01
Understanding the biological processes of signaling pathways as a whole system requires an integrative software environment with comprehensive capabilities. The environment should include tools for pathway design, visualization, and simulation, together with a knowledge base concerning signaling pathways. In this paper we introduce a new integrative environment for the systematic analysis of signaling pathways. This system includes environments for pathway design, visualization, and simulation, and a knowledge base that combines biological and modeling information concerning signaling pathways, providing a basic understanding of the biological system, its structure, and its functioning. The system is designed with a client-server architecture. It contains a pathway designing environment and a simulation environment as upper layers, with a relational knowledge base as the underlying layer. The TNFa-mediated NF-kB signal transduction pathway model was designed and tested using our integrative framework. It was also useful to define the structure of the knowledge base. Sensitivity analysis of this specific pathway was performed, providing simulation data. The model was then extended, showing promising initial results. The proposed system offers a holistic view of pathways containing biological and modeling data. It will help us to perform biological interpretation of the simulation results and thus contribute to a better understanding of the biological system for drug identification.
Accuracy of buffered-force QM/MM simulations of silica
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peguiron, Anke; Moras, Gianpietro; Colombi Ciacchi, Lucio
2015-02-14
We report comparisons between energy-based quantum mechanics/molecular mechanics (QM/MM) and buffered force-based QM/MM simulations in silica. Local quantities—such as density of states, charges, forces, and geometries—calculated with both QM/MM approaches are compared to the results of full QM simulations. We find the length scale over which forces computed using a finite QM region converge to reference values obtained in full quantum-mechanical calculations is ∼10 Å rather than the ∼5 Å previously reported for covalent materials such as silicon. Electrostatic embedding of the QM region in the surrounding classical point charges gives only a minor contribution to the force convergence. While the energy-based approach provides accurate results in geometry optimizations of point defects, we find that the removal of large force errors at the QM/MM boundary provided by the buffered force-based scheme is necessary for accurate constrained geometry optimizations where Si–O bonds are elongated and for finite-temperature molecular dynamics simulations of crack propagation. Moreover, the buffered approach allows for more flexibility, since special-purpose QM/MM coupling terms that link QM and MM atoms are not required and the region that is treated at the QM level can be adaptively redefined during the course of a dynamical simulation.
NASA Technical Reports Server (NTRS)
1992-01-01
The purpose of QASE RT is to enable system analysts and software engineers to evaluate performance and reliability implications of design alternatives. The program resulted from two Small Business Innovation Research (SBIR) projects. After receiving a description of the system architecture and workload from the user, QASE RT translates the system description into simulation models and executes them. Simulation provides detailed performance evaluation. The results of the evaluations are service and response times, offered load and device utilizations and functional availability.
Power combining in an array of microwave power rectifiers
NASA Technical Reports Server (NTRS)
Gutmann, R. J.; Borrego, J. M.
1979-01-01
This work analyzes the resultant efficiency degradation when identical rectifiers operate at different RF power levels as caused by the power beam taper. Both a closed-form analytical circuit model and a detailed computer-simulation model are used to obtain the output dc load line of the rectifier. The efficiency degradation is nearly identical with series and parallel combining, and the closed-form analytical model provides results which are similar to the detailed computer-simulation model.
A cascading failure analysis tool for post processing TRANSCARE simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
This is a MATLAB-based tool to post process simulation results from the EPRI software TRANSCARE for massive cascading failure analysis following severe disturbances. There are a few key modules available in this tool, including: 1. automatically creating a contingency list to run TRANSCARE simulations, including substation outages above a certain kV threshold, N-k (1, 2 or 3) generator outages, and branch outages; 2. reading in and analyzing a CKO file of PCG definition, an initiating event list, and a CDN file; 3. post processing all the simulation results saved in a CDN file and performing critical event corridor analysis; 4. providing a summary of TRANSCARE simulations; 5. identifying the most frequently occurring event corridors in the system; and 6. ranking the contingencies using a user-defined security index to quantify consequences in terms of total load loss, total number of cascades, etc.
NASA Astrophysics Data System (ADS)
Dörr, Dominik; Joppich, Tobias; Schirmaier, Fabian; Mosthaf, Tobias; Kärger, Luise; Henning, Frank
2016-10-01
Thermoforming of continuously fiber reinforced thermoplastics (CFRTP) is ideally suited to thin-walled and complex-shaped products. By means of forming simulation, an initial validation of the producibility of a specific geometry, an optimization of the forming process, and the prediction of fiber reorientation due to forming are possible. Nevertheless, the applied methods need to be validated. Therefore, a method is presented that enables the calculation of error measures for the mismatch between simulation results and experimental tests, based on measurements with a conventional coordinate measuring device. As a quantitative measure describing the curvature is provided, the presented method is also suitable for numerical or experimental sensitivity studies on wrinkling behavior. The applied methods for forming simulation, implemented in Abaqus explicit, are presented and applied to a generic geometry. The same geometry is tested experimentally, and simulation and test results are compared using the proposed validation method.
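One plausible way to form such an error measure, sketched below under stated assumptions, is the RMS nearest-neighbor distance between simulated surface nodes and coordinate-measuring-machine points; the point clouds here are synthetic and the paper's actual error definition may differ.

```python
import numpy as np
from scipy.spatial import cKDTree

# Sketch of one possible error measure: RMS nearest-neighbor distance between the
# simulated outer-surface nodes and coordinate-measuring-machine (CMM) points.
# The point clouds below are synthetic stand-ins, not the paper's data.
rng = np.random.default_rng(0)
simulated_nodes = rng.uniform(0.0, 50.0, size=(5000, 3))                   # mm
measured_points = simulated_nodes[::10] + rng.normal(0.0, 0.2, (500, 3))   # noisy "CMM" data

dist, _ = cKDTree(simulated_nodes).query(measured_points)   # distance to closest simulated node
print(f"RMS deviation = {np.sqrt(np.mean(dist**2)):.3f} mm, max = {dist.max():.3f} mm")
```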
Simulations of 6-DOF Motion with a Cartesian Method
NASA Technical Reports Server (NTRS)
Murman, Scott M.; Aftosmis, Michael J.; Berger, Marsha J.; Kwak, Dochan (Technical Monitor)
2003-01-01
Coupled 6-DOF/CFD trajectory predictions using an automated Cartesian method are demonstrated by simulating a GBU-32/JDAM store separating from an F-18C aircraft. Numerical simulations are performed at two Mach numbers near the sonic speed and compared with flight-test telemetry and photographic-derived data. Simulation results obtained with a sequential-static series of flow solutions are contrasted with results using a time-dependent flow solver. Both numerical methods show good agreement with the flight-test data through the first half of the simulations. The sequential-static and time-dependent methods diverge over the last half of the trajectory prediction, after the store produces peak angular rates. A cost comparison for the Cartesian method is included, in terms of absolute cost and relative to computing uncoupled 6-DOF trajectories. A detailed description of the 6-DOF method, as well as a verification of its accuracy, is provided in an appendix.
Open-loop simulations of atmospheric turbulence using the AdAPS interface
NASA Astrophysics Data System (ADS)
Widiker, Jeffrey J.; Magee, Eric P.
2005-08-01
We present and analyze experimental results of lab-based open-loop turbulence simulation utilizing the Adaptive Aberrating Phase Screen Interface developed by ATK Mission Research, which incorporates a 2-D spatial light modulator (SLM) manufactured by Boulder Nonlinear Systems. These simulations demonstrate the effectiveness of an SLM for simulating various atmospheric turbulence scenarios in a laboratory setting without altering the optical setup. This effectiveness is shown using several figures of merit, including long-term Strehl ratio, time-dependent mean-tilt analysis, and beam break-up geometry. The scenarios examined here range from relatively weak (D/ro = 0.167) to quite strong (D/ro = 10) turbulence effects, modeled using a single phase screen placed at the pupil of a Fourier-transforming lens. While very strong turbulence scenarios result in long-term Strehl ratios higher than expected, the SLM provided an accurate simulation of atmospheric effects for conventional phase-screen strengths.
Monte Carlo simulation of Ray-Scan 64 PET system and performance evaluation using GATE toolkit
NASA Astrophysics Data System (ADS)
Li, Suying; Zhang, Qiushi; Vuletic, Ivan; Xie, Zhaoheng; Yang, Kun; Ren, Qiushi
2017-02-01
In this study, we aimed to develop a GATE model for the simulation of Ray-Scan 64 PET scanner and model its performance characteristics. A detailed implementation of system geometry and physical process were included in the simulation model. Then we modeled the performance characteristics of Ray-Scan 64 PET system for the first time, based on National Electrical Manufacturers Association (NEMA) NU-2 2007 protocols and validated the model against experimental measurement, including spatial resolution, sensitivity, counting rates and noise equivalent count rate (NECR). Moreover, an accurate dead time module was investigated to simulate the counting rate performance. Overall results showed reasonable agreement between simulation and experimental data. The validation results showed the reliability and feasibility of the GATE model to evaluate major performance of Ray-Scan 64 PET system. It provided a useful tool for a wide range of research applications.
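For reference, the NEMA noise-equivalent count rate mentioned above is conventionally computed as NECR = T²/(T + S + kR); the snippet below implements that formula with placeholder count rates (the factor k depends on how the randoms are estimated and is an assumption here).

```python
def necr(trues, scatters, randoms, k=2):
    """Noise-equivalent count rate, NECR = T^2 / (T + S + k*R).
    k = 2 is commonly used when randoms are estimated with a delayed window;
    k = 1 when a noiseless randoms estimate is available."""
    return trues ** 2 / (trues + scatters + k * randoms)

# Placeholder count rates (kcps), not measurements from the Ray-Scan 64 study.
print(f"NECR = {necr(trues=120.0, scatters=60.0, randoms=40.0):.1f} kcps")
```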
Scale-Similar Models for Large-Eddy Simulations
NASA Technical Reports Server (NTRS)
Sarghini, F.
1999-01-01
Scale-similar models employ multiple filtering operations to identify the smallest resolved scales, which have been shown to be the most active in the interaction with the unresolved subgrid scales. They do not assume that the principal axes of the strain-rate tensor are aligned with those of the subgrid-scale stress (SGS) tensor, and allow the explicit calculation of the SGS energy. They can provide backscatter in a numerically stable and physically realistic manner, and predict SGS stresses in regions that are well correlated with the locations where large Reynolds stress occurs. In this paper, eddy viscosity and mixed models, which include an eddy-viscosity part as well as a scale-similar contribution, are applied to the simulation of two flows, a high Reynolds number plane channel flow, and a three-dimensional, nonequilibrium flow. The results show that simulations without models or with the Smagorinsky model are unable to predict nonequilibrium effects. Dynamic models provide an improvement of the results: the adjustment of the coefficient results in more accurate prediction of the perturbation from equilibrium. The Lagrangian-ensemble approach [Meneveau et al., J. Fluid Mech. 319, 353 (1996)] is found to be very beneficial. Models that included a scale-similar term and a dissipative one, as well as the Lagrangian ensemble averaging, gave results in the best agreement with the direct simulation and experimental data.
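For orientation, a commonly used form of such a mixed closure combines a Bardina-type scale-similar term with a Smagorinsky eddy-viscosity term; the expression below is the standard textbook form, and its notation and coefficients are assumptions rather than necessarily those of the paper.

```latex
% Mixed model: scale-similar term (from grid filter \bar{\cdot} and test filter \hat{\cdot})
% plus a Smagorinsky eddy-viscosity term; C_L and C_S are model coefficients.
\tau_{ij} \approx C_L\left(\widehat{\bar{u}_i\bar{u}_j} - \hat{\bar{u}}_i\,\hat{\bar{u}}_j\right)
- 2\,C_S\,\Delta^2\,\lvert\bar{S}\rvert\,\bar{S}_{ij},
\qquad
\bar{S}_{ij} = \tfrac{1}{2}\left(\partial_j \bar{u}_i + \partial_i \bar{u}_j\right),
\qquad
\lvert\bar{S}\rvert = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}
```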
ERIC Educational Resources Information Center
Fouladi, Rachel T.
2000-01-01
Provides an overview of standard and modified normal theory and asymptotically distribution-free covariance and correlation structure analysis techniques and details Monte Carlo simulation results on Type I and Type II error control. Demonstrates through the simulation that robustness and nonrobustness of structure analysis techniques vary as a…
Are They Bloody Guilty? Blood Doping with Simulated Samples
ERIC Educational Resources Information Center
Stuart, Parker E.; Lees, Kelsey D.; Milanick, Mark A.
2014-01-01
In this practice-based lab, students are provided with four Olympic athlete profiles and simulated blood and urine samples to test for illegal substances and blood-doping practices. Throughout the course of the lab, students design and conduct a testing procedure and use their results to determine which athletes won their medals fairly. All of the…
Definition of avionics concepts for a heavy lift cargo vehicle, appendix A
NASA Technical Reports Server (NTRS)
1989-01-01
The major objective of the study task was to define a cost effective, multiuser simulation, test, and demonstration facility to support the development of avionics systems for future space vehicles. This volume provides the results of the main simulation processor selection study and describes some proof-of-concept demonstrations for the avionics test bed facility.
Soliman, Ahmed M; Eldosoky, Mohamed A; Taha, Taha E
2017-03-29
The separation of blood components (WBCs, RBCs, and platelets) is important for medical applications. Recently, standing surface acoustic wave (SSAW) microfluidic devices have been used for the separation of particles. In this paper, the design analysis of SSAW microfluidics is presented. Also, the analysis of the SSAW force with the Rayleigh angle effect and its attenuation in a liquid-loaded substrate, the viscous drag force, the hydrodynamic force, and the diffusion force are explained and analyzed. The analyses are provided for selecting the piezoelectric material, the width of the main microchannel, the working area of the SAW, the wavelength, the minimum input power required for the separation process, and the widths of the outlet collecting microchannels. The design analysis of SSAW microfluidics is provided for determining the minimum input power required for the separation process with the appropriate displacement contrast of the particles. The analyses are applied to simulate the separation of blood components. The piezoelectric material, width of the main microchannel, working area of SAW, wavelength, and minimum input power required for the separation process are selected as LiNbO₃, 120 μm, 1.08 mm², 300 μm, and 371 mW, respectively. The results are compared to other published results. The results of these simulations achieve minimum power consumption, a less complicated setup, and high collecting efficiency. All simulation programs are built in MATLAB.
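To indicate the magnitudes involved, the Python sketch below evaluates the primary acoustic radiation force expression commonly used in SSAW separation work together with the opposing Stokes drag; the pressure amplitude and the particle/medium properties are typical assumed values, not parameters extracted from this paper.

```python
import numpy as np

# Sketch: primary acoustic radiation force on a particle in a standing SAW field,
#   F_r = -(pi * p0^2 * Vc * beta_m / (2*lambda)) * phi * sin(2*k*x),
#   phi = (5*rho_c - 2*rho_m)/(2*rho_c + rho_m) - beta_c/beta_m,
# together with the opposing Stokes drag.  This is the expression commonly quoted in
# SSAW separation papers; all material properties and p0 are typical assumed values.
p0 = 0.2e6          # acoustic pressure amplitude [Pa] (assumed)
lam = 300e-6        # SAW wavelength [m] (value quoted in the abstract)
rho_m, beta_m = 1000.0, 4.6e-10     # water density [kg/m^3] and compressibility [1/Pa]
rho_c, beta_c = 1100.0, 4.0e-10     # typical cell-like particle (assumed)
radius = 5e-6       # particle radius [m]
mu = 1e-3           # dynamic viscosity of water [Pa*s]

k = 2 * np.pi / lam
Vc = 4.0 / 3.0 * np.pi * radius ** 3
phi = (5 * rho_c - 2 * rho_m) / (2 * rho_c + rho_m) - beta_c / beta_m

x = lam / 8                                   # position between pressure node and antinode
F_rad = -(np.pi * p0 ** 2 * Vc * beta_m / (2 * lam)) * phi * np.sin(2 * k * x)
F_drag = 6 * np.pi * mu * radius * 1e-3       # Stokes drag at 1 mm/s lateral velocity

print(f"acoustic radiation force ~ {F_rad:.2e} N, Stokes drag at 1 mm/s ~ {F_drag:.2e} N")
```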
Optimization of lamp arrangement in a closed-conduit UV reactor based on a genetic algorithm.
Sultan, Tipu; Ahmad, Zeshan; Cho, Jinsoo
2016-01-01
The choice of lamp arrangement in a closed-conduit ultraviolet (CCUV) reactor significantly affects its performance. However, a systematic methodology for the optimal lamp arrangement within the chamber of the CCUV reactor is not well established in the literature. In this research work, we propose a viable systematic methodology for lamp arrangement based on a genetic algorithm (GA). In addition, we analyze the impacts of the diameter, angle, and symmetry of the lamp arrangement on the reduction equivalent dose (RED). The results are compared based on the simulated RED values and evaluated using the computational fluid dynamics simulation software ANSYS FLUENT. The fluence rate was calculated using the commercial software UVCalc3D, and the GA-based lamp arrangement optimization was performed in MATLAB. The simulation results provide detailed information about the GA-based methodology for the lamp arrangement, the pathogen transport, and the simulated RED values. A significant increase in the RED values was achieved by using the GA-based lamp arrangement methodology. This increase in RED value was highest for the asymmetric lamp arrangement within the chamber of the CCUV reactor. These results demonstrate that the proposed GA-based methodology for symmetric and asymmetric lamp arrangements provides a viable technical solution to the design and optimization of the CCUV reactor.
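A minimal genetic-algorithm loop of the kind described, sketched below in Python, evolves lamp coordinates inside a circular chamber; the fitness function here is only a geometric stand-in (in the study, the fitness is the RED obtained from the fluence-rate and CFD simulations), and all parameters are assumptions.

```python
import numpy as np

# Minimal GA sketch for placing N_LAMPS lamps in a circular reactor cross-section.
# The fitness below is a stand-in (it rewards spreading lamps apart inside the chamber);
# in the study the fitness is the simulated RED.
rng = np.random.default_rng(0)
N_LAMPS, R_CHAMBER = 4, 0.1          # lamp count, chamber radius [m] (assumed)
POP, GENS, MUT_SIGMA = 40, 200, 0.01

def fitness(x):
    pts = x.reshape(N_LAMPS, 2)
    if np.any(np.linalg.norm(pts, axis=1) > 0.9 * R_CHAMBER):
        return -1.0                   # penalize lamps outside the allowed region
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return d[np.triu_indices(N_LAMPS, 1)].min()   # stand-in objective, not RED

pop = rng.uniform(-R_CHAMBER, R_CHAMBER, size=(POP, 2 * N_LAMPS))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]              # truncation selection
    kids = []
    while len(kids) < POP - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, 2 * N_LAMPS)                     # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child += rng.normal(0.0, MUT_SIGMA, size=child.shape)  # Gaussian mutation
        kids.append(child)
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best lamp coordinates (m):", best.reshape(N_LAMPS, 2).round(3))
```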
Development of Aspen: A microanalytic simulation model of the US economy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pryor, R.J.; Basu, N.; Quint, T.
1996-02-01
This report describes the development of an agent-based microanalytic simulation model of the US economy. The microsimulation model capitalizes on recent technological advances in evolutionary learning and parallel computing. Results are reported for a test problem that was run using the model. The test results demonstrate the model's ability to predict business-like cycles in an economy where prices and inventories are allowed to vary. Since most economic forecasting models have difficulty predicting any kind of cyclic behavior, these results show the potential of microanalytic simulation models to improve economic policy analysis and to provide new insights into underlying economic principles. Work has already begun on a more detailed model.
Optimization-Based Calibration of FAST.Farm Parameters Against SOWFA: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreira, Paula D; Annoni, Jennifer; Jonkman, Jason
2018-01-04
FAST.Farm is a medium-fidelity wind farm modeling tool that can be used to assess power and loads contributions of wind turbines in a wind farm. The objective of this paper is to undertake a calibration procedure to set the user parameters of FAST.Farm to accurately represent results from large-eddy simulations. The results provide an in-depth analysis of the comparison of FAST.Farm and large-eddy simulations before and after calibration. The comparison of FAST.Farm and large-eddy simulation results is presented with respect to streamwise and radial velocity components as well as wake-meandering statistics (mean and standard deviation) in the lateral and vertical directions under different atmospheric and turbine operating conditions.
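The calibration step can be pictured as a generic least-squares fit of model parameters against LES reference data, as in the hedged Python sketch below; run_wake_model and the reference curve are hypothetical stand-ins for running FAST.Farm and extracting SOWFA wake statistics.

```python
import numpy as np
from scipy.optimize import minimize

# Generic calibration sketch: tune model parameters by minimizing the squared
# mismatch between model output and large-eddy-simulation reference data.
# `run_wake_model` and `les_reference` are hypothetical stand-ins for FAST.Farm
# and SOWFA wake statistics.
x_grid = np.linspace(1.0, 10.0, 25)                  # downstream distance in rotor diameters
les_reference = 1.0 - 0.45 * np.exp(-0.25 * x_grid)  # synthetic "LES" wake-recovery curve

def run_wake_model(params):
    deficit, recovery = params
    return 1.0 - deficit * np.exp(-recovery * x_grid)

def cost(params):
    return np.sum((run_wake_model(params) - les_reference) ** 2)

result = minimize(cost, x0=np.array([0.3, 0.1]), method="Nelder-Mead")
print("calibrated parameters:", result.x, "cost:", result.fun)
```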
Feel, imagine and learn! - Haptic augmented simulation and embodied instruction in physics learning
NASA Astrophysics Data System (ADS)
Han, In Sook
The purpose of this study was to investigate the potential and effects of an embodied instructional model in abstract concept learning. This embodied instructional process included a haptic augmented educational simulation as an instructional tool to provide perceptual experiences, as well as further instruction to activate those previous experiences with perceptual simulation. In order to verify the effectiveness of this instructional model, haptic augmented simulations with three different haptic levels (force and kinesthetic, kinesthetic, and non-haptic) and instructional materials (narrative and expository) were developed and their effectiveness tested. 220 fifth grade students were recruited to participate in the study from three elementary schools located in lower SES neighborhoods in the Bronx, New York. The study was conducted over three consecutive weeks in regular class periods. The data were analyzed using ANCOVA, ANOVA, and MANOVA. The results indicate that the haptic augmented simulations, both the force-and-kinesthetic and the kinesthetic simulations, were more effective than the non-haptic simulation in providing perceptual experiences and helping elementary students to create multimodal representations of machines' movements. However, in most cases, force feedback was needed to construct a fully loaded multimodal representation that could be activated when instruction with fewer sensory modalities was being given. In addition, the force-and-kinesthetic simulation was effective in providing cognitive grounding to comprehend new learning content based on the multimodal representation created with enhanced force feedback. Regarding the instruction type, it was found that the narrative and the expository instructions did not make any difference in activating previous perceptual experiences. These findings suggest that it is important to help students build a solid cognitive ground with a perceptual anchor. Also, a sequential abstraction process can deepen students' understanding by providing an opportunity to practice mental simulation, removing the sensory modalities used one by one and gradually reaching an abstract level of understanding where students can imagine the machine's movements and working mechanisms with only abstract language, without any perceptual supports.
Towards a genetics-based adaptive agent to support flight testing
NASA Astrophysics Data System (ADS)
Cribbs, Henry Brown, III
Although the benefits of aircraft simulation have been known since the late 1960s, simulation almost always entails interaction with a human test pilot. This "pilot-in-the-loop" simulation process provides useful evaluative information to the aircraft designer and provides a training tool to the pilot. Emulation of a pilot during the early phases of the aircraft design process might provide designers a useful evaluative tool. Machine learning might emulate a pilot in a simulated aircraft/cockpit setting. Preliminary work on the application of machine learning techniques, such as reinforcement learning, to aircraft maneuvering has shown promise. These studies used simplified interfaces between the machine learning agent and the aircraft simulation, and the simulations employed low-order equivalent system models. High-fidelity aircraft simulations exist, such as the simulations developed by NASA at its Dryden Flight Research Center. To expand the application domain of reinforcement learning to aircraft design, this study presents a series of experiments that examine a reinforcement learning agent in the role of test pilot. The NASA X-31 and F-106 high-fidelity simulations provide realistic aircraft for the agent to maneuver. The approach of the study is to examine an agent possessing a genetic-based, artificial neural network to approximate long-term, expected cost (Bellman value) in a basic maneuvering task. The experiments evaluate different learning methods based on a common feedback function and an identical task. The learning methods evaluated are: Q-learning, Q(lambda)-learning, SARSA learning, and SARSA(lambda) learning. Experimental results indicate that, while prediction errors remain quite high, similar, repeatable behaviors occur in both aircraft. The similar behavior demonstrates portability of the agent between aircraft with different handling qualities (dynamics). Beyond the adaptive behavior aspects of the study, the genetic algorithm used in the agent is shown to play an additive role in shaping the artificial neural network for the prediction task.
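For readers unfamiliar with the distinction among the methods compared, the Python snippet below shows the core tabular Q-learning and SARSA updates; the study itself used a genetic, neural-network value approximator rather than a lookup table, so this is only the underlying rule.

```python
# Core tabular update rules for two of the methods compared (the study itself used a
# genetic, neural-network value approximator rather than a lookup table).
def q_learning_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    # Off-policy: bootstrap from the greedy action in the next state.
    Q[s][a] += alpha * (r + gamma * max(Q[s_next].values()) - Q[s][a])

def sarsa_update(Q, s, a, r, s_next, a_next, alpha=0.1, gamma=0.99):
    # On-policy: bootstrap from the action actually taken next.
    Q[s][a] += alpha * (r + gamma * Q[s_next][a_next] - Q[s][a])

# Tiny usage example with a two-state, two-action table.
Q = {0: {"left": 0.0, "right": 0.0}, 1: {"left": 0.0, "right": 0.0}}
q_learning_update(Q, s=0, a="right", r=1.0, s_next=1)
sarsa_update(Q, s=1, a="left", r=0.0, s_next=0, a_next="right")
print(Q)
```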
The Validity of Quasi-Steady-State Approximations in Discrete Stochastic Simulations
Kim, Jae Kyoung; Josić, Krešimir; Bennett, Matthew R.
2014-01-01
In biochemical networks, reactions often occur on disparate timescales and can be characterized as either fast or slow. The quasi-steady-state approximation (QSSA) utilizes timescale separation to project models of biochemical networks onto lower-dimensional slow manifolds. As a result, fast elementary reactions are not modeled explicitly, and their effect is captured by nonelementary reaction-rate functions (e.g., Hill functions). The accuracy of the QSSA applied to deterministic systems depends on how well timescales are separated. Recently, it has been proposed to use the nonelementary rate functions obtained via the deterministic QSSA to define propensity functions in stochastic simulations of biochemical networks. In this approach, termed the stochastic QSSA, fast reactions that are part of nonelementary reactions are not simulated, greatly reducing computation time. However, it is unclear when the stochastic QSSA provides an accurate approximation of the original stochastic simulation. We show that, unlike the deterministic QSSA, the validity of the stochastic QSSA does not follow from timescale separation alone, but also depends on the sensitivity of the nonelementary reaction rate functions to changes in the slow species. The stochastic QSSA becomes more accurate when this sensitivity is small. Different types of QSSAs result in nonelementary functions with different sensitivities, and the total QSSA results in less sensitive functions than the standard or the prefactor QSSA. We prove that, as a result, the stochastic QSSA becomes more accurate when nonelementary reaction functions are obtained using the total QSSA. Our work provides an apparently novel condition for the validity of the QSSA in stochastic simulations of biochemical reaction networks with disparate timescales. PMID:25099817
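A minimal example of the stochastic QSSA idea, sketched below in Python, is a Gillespie simulation in which production fires with a nonelementary Hill-type propensity while degradation is elementary; the rate constants are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

# Minimal Gillespie (SSA) sketch in which production uses a nonelementary Hill-type
# propensity, as in the "stochastic QSSA" idea discussed above.  Parameters are
# illustrative, not taken from the paper.
rng = np.random.default_rng(0)
beta, K, n, gamma_deg = 20.0, 40.0, 2.0, 0.1   # max rate, Hill constant, Hill coefficient, degradation

x, t, t_end = 0, 0.0, 500.0
n_reactions = 0
while t < t_end:
    a_prod = beta * K**n / (K**n + x**n)   # repressive Hill propensity (nonelementary)
    a_deg = gamma_deg * x
    a_total = a_prod + a_deg
    t += rng.exponential(1.0 / a_total)    # waiting time to the next reaction
    if rng.random() < a_prod / a_total:    # choose which reaction fires
        x += 1
    else:
        x -= 1
    n_reactions += 1

print("final copy number:", x, "after", n_reactions, "reactions")
```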
ViSBARD: Visual System for Browsing, Analysis and Retrieval of Data
NASA Astrophysics Data System (ADS)
Roberts, D. Aaron; Boller, Ryan; Rezapkin, V.; Coleman, J.; McGuire, R.; Goldstein, M.; Kalb, V.; Kulkarni, R.; Luckyanova, M.; Byrnes, J.; Kerbel, U.; Candey, R.; Holmes, C.; Chimiak, R.; Harris, B.
2018-04-01
ViSBARD interactively visualizes and analyzes space physics data. It provides an interactive, integrated 3-D and 2-D environment to determine correlations between measurements across many spacecraft. It supports a variety of spacecraft data products and MHD models and is easily extensible to others. ViSBARD provides a way of visualizing multiple vector and scalar quantities as measured by many spacecraft at once. The data are displayed three-dimensionally along the orbits, which may be displayed either as connected lines or as points. The data display allows the rapid determination of vector configurations, correlations between many measurements at multiple points, and global relationships. With the addition of magnetohydrodynamic (MHD) model data, this environment can also be used to validate simulation results with observed data, use simulated data to provide a global context for sparse observed data, and apply feature detection techniques to the simulated data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, R.D.
The results of a research effort to develop a multiphase, naturally fractured, lenticular reservoir simulator are presented. The simulator possesses the capability of investigating the effects of non-Darcy flow, the Klinkenberg effect, and transient multiphase wellbore storage for wells with finite and infinite conductivity fractures. The simulator has been utilized to simulate actual pressure transient data for gas wells associated with the United States Department of Energy, Western Gas Sands Project, MWX Experiments. The results of these simulations are contained in the report, as are simulation results for hypothetical wells producing under multiphase flow conditions. In addition to the reservoir simulation development and the theoretical and field case studies, the results of an experimental program to investigate multiphase non-Darcy flow coefficients (inertial resistance coefficients, or beta factors as they are sometimes called) are also presented. The experimental data were obtained for non-Darcy flow in porous and fractured media. The results clearly indicate the dependence of the non-Darcy flow coefficient upon liquid saturation. Where appropriate, comparisons are made against data available in the open literature. In addition, a theoretical development of a correlation to predict non-Darcy flow coefficients as a function of effective gas permeability, liquid saturation, and porosity is presented. The results presented in this report will provide scientists and engineers tools to investigate well performance data and production trends for wells completed in lenticular, naturally fractured formations producing under non-Darcy, multiphase conditions. 65 refs., 57 figs., 15 tabs.
Simulation of transient effects in the heavy ion fusion injectors
NASA Astrophysics Data System (ADS)
Chen, Yu-Jiuan; Hewett, D. W.
1993-05-01
We have used the 2-D PIC code, GYMNOS, to study the transient behaviors in the Heavy Ion Fusion (HIF) injectors. GYMNOS simulations accurately provide the steady state Child-Langmuir current and the beam transient behavior within a planar diode. The simulations of the LBL HIF ESAC injector experiments agree well with the experimental data and EGUN steady state results. Simulations of the nominal HIF injectors have revealed the need to design the accelerating electrodes carefully to control the ion beam current, particularly the ion loss at the end of the bunch as the extraction voltage is reduced.
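For reference, the steady-state space-charge-limited current that such simulations reproduce is the standard Child-Langmuir result for a planar diode of gap $$d$$ and applied voltage $$V$$ (textbook formula, not quoted from the abstract):

$$J = \frac{4\epsilon_0}{9}\sqrt{\frac{2q}{m}}\,\frac{V^{3/2}}{d^{2}},$$

where $$q/m$$ is the charge-to-mass ratio of the emitted ions.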
Statistics of velocity gradients in two-dimensional Navier-Stokes and ocean turbulence.
Schorghofer, Norbert; Gille, Sarah T
2002-02-01
Probability density functions and conditional averages of velocity gradients derived from upper ocean observations are compared with results from forced simulations of the two-dimensional Navier-Stokes equations. Ocean data are derived from TOPEX satellite altimeter measurements. The simulations use rapid forcing on large scales, characteristic of surface winds. The probability distributions of transverse velocity derivatives from the ocean observations agree with the forced simulations, although they differ from unforced simulations reported elsewhere. The distribution and cross correlation of velocity derivatives provide clear evidence that large coherent eddies play only a minor role in generating the observed statistics.
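A minimal sketch of how such single-point velocity-gradient statistics can be formed from a gridded two-dimensional velocity field; the field below is synthetic and every variable name is illustrative, not the authors' processing chain:

```python
import numpy as np

# Synthetic 2-D velocity field on a periodic grid (placeholder for model output
# or gridded altimeter-derived velocities); all names are illustrative.
n, L = 256, 2 * np.pi
x = np.linspace(0.0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.sin(X) * np.cos(Y) + 0.1 * np.random.default_rng(1).standard_normal((n, n))
v = -np.cos(X) * np.sin(Y)

dx = L / n
du_dx = np.gradient(u, dx, axis=0)   # longitudinal velocity derivative
dv_dx = np.gradient(v, dx, axis=0)   # transverse velocity derivative

# Normalized probability density of the transverse velocity derivative
pdf, edges = np.histogram(dv_dx / dv_dx.std(), bins=100, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(centers[np.argmax(pdf)])       # location of the PDF peak
```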
Efficient evaluation of wireless real-time control networks.
Horvath, Peter; Yampolskiy, Mark; Koutsoukos, Xenofon
2015-02-11
In this paper, we present a system simulation framework for the design and performance evaluation of complex wireless cyber-physical systems. We describe the simulator architecture and the specific developments that are required to simulate cyber-physical systems relying on multi-channel, multihop mesh networks. We introduce realistic and efficient physical layer models and a system simulation methodology which provides statistically significant performance evaluation results with low computational complexity. The capabilities of the proposed framework are illustrated using the example of WirelessHART, a centralized, real-time, multi-hop mesh network designed for industrial control and monitoring applications.
Baum, K. G.; Menezes, G.; Helguera, M.
2011-01-01
Medical imaging system simulators are tools that provide a means to evaluate system architecture and create artificial image sets that are appropriate for specific applications. We have modified SIMRI, a Bloch equation-based magnetic resonance image simulator, in order to successfully generate high-resolution 3D MR images of the Montreal brain phantom using Blue Gene/L systems. Results show that redistribution of the workload allows an anatomically accurate 256³ voxel spin-echo simulation in less than 5 hours when executed on an 8192-node partition of a Blue Gene/L system.
Computer simulation of space charge
NASA Astrophysics Data System (ADS)
Yu, K. W.; Chung, W. K.; Mak, S. S.
1991-05-01
Using the particle-mesh (PM) method, a one-dimensional simulation of the well-known Langmuir-Child law is performed on an INTEL 80386-based personal computer system. The program is coded in Turbo Basic (trademark of Borland International, Inc.). The numerical results obtained were in excellent agreement with theoretical predictions, and the computational time required was quite modest. This simulation exercise demonstrates that simple particle-based computer simulations can be implemented successfully on the PCs available today, which will hopefully provide the necessary incentive for newcomers to the field who wish to acquire a flavor of the elementary aspects of the practice.
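To make the particle-mesh idea concrete, here is a sketch (in Python rather than Turbo Basic, and not the original program) of the two grid operations at the heart of one PM cycle: cloud-in-cell charge deposition and a one-dimensional finite-difference Poisson solve with grounded electrodes:

```python
import numpy as np

# One field-solve step of a 1-D particle-mesh scheme (illustrative only):
# cloud-in-cell (CIC) deposition of particle charge onto the grid, then a
# finite-difference Poisson solve, phi'' = -rho/eps0, with phi = 0 at both walls.
eps0 = 8.854e-12
n_cells, d = 64, 1.0e-2                                   # grid cells, diode gap [m]
dx = d / n_cells
x_p = np.random.default_rng(2).uniform(0.0, d, 5000)      # macro-particle positions
q_p = -1.0e-12                                            # charge per macro-particle

# CIC deposition: split each particle's charge density between its two nearest nodes
rho = np.zeros(n_cells + 1)
idx = np.minimum((x_p / dx).astype(int), n_cells - 1)
w = x_p / dx - idx
np.add.at(rho, idx,     q_p * (1.0 - w) / dx)
np.add.at(rho, idx + 1, q_p * w / dx)

# Poisson solve on interior nodes: (phi[i-1] - 2 phi[i] + phi[i+1]) / dx^2 = -rho/eps0
n_int = n_cells - 1
A = (np.diag(-2.0 * np.ones(n_int)) +
     np.diag(np.ones(n_int - 1), 1) +
     np.diag(np.ones(n_int - 1), -1)) / dx**2
phi = np.zeros(n_cells + 1)
phi[1:-1] = np.linalg.solve(A, -rho[1:-1] / eps0)
E = -np.gradient(phi, dx)        # field to be interpolated back to the particles
print(E[:3])
```

A full diode simulation would interpolate the resulting field back to the particles, advance them, and inject new particles at the cathode on each step.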
NASA Technical Reports Server (NTRS)
Chung, Christopher A.; Marwaha, Shweta
2005-01-01
This paper describes an interactive multimedia simulator for air transportation bomb threat training. The objective of this project is to improve the air transportation sector's capability to respond to bomb threats received by commercial airports and aircraft. The simulator provides realistic training on receiving and responding to a variety of bomb threats that might not otherwise be possible due to time, cost, or operational constraints. Validation analysis indicates that the use of the simulator resulted in statistically significant increases in individual ability to respond to these types of bomb threats.
Dissipative particle dynamics simulations of polymersomes.
Ortiz, Vanessa; Nielsen, Steven O; Discher, Dennis E; Klein, Michael L; Lipowsky, Reinhard; Shillcock, Julian
2005-09-22
A DPD model of PEO-based block copolymer vesicles in water is developed by introducing a new density-based coarse graining and by using experimental data for the interfacial tension. Simulated as a membrane patch, the DPD model is in excellent agreement with experimental data for both the area expansion modulus and the scaling of hydrophobic core thickness with molecular weight. Rupture simulations of polymer vesicles, or "polymersomes", are presented to illustrate the system sizes feasible with DPD. The results should provide guidance for theoretical derivations of scaling laws and also illustrate how spherical polymer vesicles might be studied in simulation.
NASA Technical Reports Server (NTRS)
Houck, J. A.
1980-01-01
This paper describes the work being done at the National Aeronautics and Space Administration's Langley Research Center on the development of a mission simulator for use in the Terminal Configured Vehicle Program. A brief description of the goals and objectives of the Terminal Configured Vehicle Program is presented. A more detailed description of the Mission Simulator, in its present configuration, and its components is provided. Finally, a description of the first research study conducted in the Mission Simulator is presented along with a discussion of some preliminary results from this study.
NASA Technical Reports Server (NTRS)
Kubat, Gregory
2016-01-01
This report provides a description and performance characterization of the large-scale, relay-architecture UAS communications simulation capability developed for the NASA GRC UAS in the NAS Project. The system uses a validated model of the GRC Gen5 CNPC flight-test radio. Contained in the report are a description of the simulation system and its model components, recent changes made to the system to improve performance, descriptions and objectives of sample simulations used for test and verification, and a sampling of results and performance data with observations.
Cosmic reionization on computers: The faint end of the galaxy luminosity function
Gnedin, Nickolay Y.
2016-07-01
Using numerical cosmological simulations completed under the “Cosmic Reionization On Computers” project, I explore theoretical predictions for the faint end of the galaxy UV luminosity functions at $$z \gtrsim 6$$. A commonly used Schechter function approximation with the magnitude cut at $$M_{\rm cut} \sim -13$$ provides a reasonable fit to the actual luminosity function of simulated galaxies. When the Schechter functional form is forced on the luminosity functions from the simulations, the magnitude cut $$M_{\rm cut}$$ is found to vary between -12 and -14 with a mild redshift dependence. Here, an analytical model of reionization from Madau et al., as used by Robertson et al., provides a good description of the simulated results, which can be improved even further by adding two physically motivated modifications to the original Madau et al. equation.
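For context, the Schechter parameterization referred to above, truncated at the faint-end magnitude cut, has the standard form (not quoted from the paper):

$$\phi(M)\,dM = 0.4\ln(10)\,\phi_{*}\,10^{-0.4(M-M_{*})(\alpha+1)}\exp\!\left[-10^{-0.4(M-M_{*})}\right]dM,\qquad M \le M_{\rm cut},$$

with $$\phi(M)=0$$ for galaxies fainter than $$M_{\rm cut}$$.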
Simulation of Smart Home Activity Datasets
Synnott, Jonathan; Nugent, Chris; Jeffers, Paul
2015-01-01
A globally ageing population is resulting in an increased prevalence of chronic conditions which affect older adults. Such conditions require long-term care and management to maximize quality of life, placing an increasing strain on healthcare resources. Intelligent environments such as smart homes facilitate long-term monitoring of activities in the home through the use of sensor technology. Access to sensor datasets is necessary for the development of novel activity monitoring and recognition approaches. Access to such datasets is limited due to issues such as sensor cost, availability and deployment time. The use of simulated environments and sensors may address these issues and facilitate the generation of comprehensive datasets. This paper provides a review of existing approaches for the generation of simulated smart home activity datasets, including model-based approaches and interactive approaches which implement virtual sensors, environments and avatars. The paper also provides recommendations for future work in intelligent environment simulation. PMID:26087371
Mesoscale Polymer Dissolution Probed by Raman Spectroscopy and Molecular Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Tsun-Mei; Xantheas, Sotiris S.; Vasdekis, Andreas E.
2016-10-13
The diffusion of various solvents into a polystyrene (PS) matrix was probed experimentally by monitoring the temporal profiles of the Raman spectra and theoretically from molecular dynamics (MD) simulations of the binary system. The simulation results assist in providing a fundamental, molecular-level connection between the mixing/dissolution processes and the difference Δδ = δ_solvent – δ_PS in the values of the Hildebrand parameter (δ) between the two components of the binary systems: solvents having values of δ similar to that of PS (small Δδ) exhibit fast diffusion into the polymer matrix, whereas the diffusion slows down considerably when the δ's are different (large Δδ). To this end, the Hildebrand parameter was identified as a useful descriptor that governs the process of mixing in polymer-solvent binary systems. The experiments also provide insight into further refinements of the models specific to non-Fickian diffusion phenomena that need to be used in the simulations.
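For reference, the Hildebrand solubility parameter invoked above is defined (standard definition, not from the abstract) as

$$\delta=\sqrt{\frac{\Delta H_{\rm vap}-RT}{V_m}},\qquad \Delta\delta=\delta_{\rm solvent}-\delta_{\rm PS},$$

where $$\Delta H_{\rm vap}$$ is the molar enthalpy of vaporization and $$V_m$$ the molar volume; a small $$|\Delta\delta|$$ indicates similar cohesive energy densities and hence favorable mixing.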
Xayaphoummine, A.; Bucher, T.; Isambert, H.
2005-01-01
The Kinefold web server provides a web interface for stochastic folding simulations of nucleic acids on second-to-minute molecular time scales. Renaturation or co-transcriptional folding paths are simulated at the level of helix formation and dissociation in agreement with seminal experimental results. Pseudoknots and topologically ‘entangled’ helices (i.e. knots) are efficiently predicted taking into account simple geometrical and topological constraints. To encourage interactivity, simulations launched as immediate jobs are automatically stopped after a few seconds and return adapted recommendations. Users can then choose to continue incomplete simulations using the batch queuing system or go back and modify suggested options in their initial query. Detailed output provides (i) a series of low free energy structures, (ii) an online animated folding path and (iii) a programmable trajectory plot focusing on a few helices of interest to each user. The service can be accessed at . PMID:15980546
Comprehending Sentences With the Body: Action Compatibility in British Sign Language?
Vinson, David; Perniss, Pamela; Fox, Neil; Vigliocco, Gabriella
2017-05-01
Previous studies show that reading sentences about actions leads to specific motor activity associated with actually performing those actions. We investigate how sign language input may modulate motor activation, using British Sign Language (BSL) sentences, some of which explicitly encode direction of motion, versus written English, where motion is only implied. We find no evidence of action simulation in BSL comprehension (Experiments 1-3), but we find effects of action simulation in comprehension of written English sentences by deaf native BSL signers (Experiment 4). These results provide constraints on the nature of mental simulations involved in comprehending action sentences referring to transfer events, suggesting that the richer contextual information provided by BSL sentences versus written or spoken English may reduce the need for action simulation in comprehension, at least when the event described does not map completely onto the signer's own body. Copyright © 2016 Cognitive Science Society, Inc.
Flipped Learning With Simulation in Undergraduate Nursing Education.
Kim, HeaRan; Jang, YounKyoung
2017-06-01
Flipped learning has proliferated in various educational environments. This study aimed to verify the effects of flipped learning on the academic achievement, teamwork skills, and satisfaction levels of undergraduate nursing students. For the flipped learning group, simulation-based education via the flipped learning method was provided, whereas traditional, simulation-based education was provided for the control group. After completion of the program, academic achievement, teamwork skills, and satisfaction levels were assessed and analyzed. The flipped learning group received higher scores on academic achievement, teamwork skills, and satisfaction levels than the control group, including the areas of content knowledge and clinical nursing practice competency. In addition, this difference gradually increased between the two groups throughout the trial. The results of this study demonstrated the positive, statistically significant effects of the flipped learning method on simulation-based nursing education. [J Nurs Educ. 2017;56(6):329-336.]. Copyright 2017, SLACK Incorporated.
The design and development of a triaxial wear-testing joint simulator.
Green, A S; O'Connell, M K; Lyons, A S; James, S P
1999-01-01
Most of the existing wear testers created to wear test total hip replacements, specifically the acetabular component, are designed to exert only an axial force and provide rotation in a close approximation of the actual femoral movement. The Rocky Mountain Joint Simulator (RMJS) was designed to exert three orthogonal forces and provide rotations about the X-, Y- and Z-axes to more closely simulate the physiological forces and motions found in the human gait cycle. The RMJS was also designed with adaptability for other joints, such as knees or canine hips, through the use of hydraulics and a computer-programmable control system. Such adaptability and functionality allow the researcher to model a gait cycle more closely, thereby obtaining wear patterns that resemble those found in retrieved implants more closely than existing simulators do. Research is ongoing into the tuning and evaluation of the machine, and preliminary acetabular component wear test results will be presented at the conference.
Reconstructing a Large-Scale Population for Social Simulation
NASA Astrophysics Data System (ADS)
Fan, Zongchen; Meng, Rongqing; Ge, Yuanzheng; Qiu, Xiaogang
The advent of social simulation has provided an opportunity to conduct research on social systems, and more and more researchers tend to describe the components of social systems at a finer level of detail. Any simulation needs the support of population data to initialize and run the simulation system. However, it is impossible to obtain data that provide full information about individuals and households. We propose a two-step method to reconstruct a large-scale population for a Chinese city in accordance with Chinese culture. First, a baseline population is generated by gathering individuals into households one by one; second, social relationships such as friendship are assigned to the baseline population. Through a case study, a population of 3,112,559 individuals gathered in 1,133,835 households is reconstructed for Urumqi city, and the results show that the generated data agree with the real data quite well. The generated data can be applied to support the modeling of social phenomena.
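A minimal sketch of the first step, gathering synthetic individuals into households one by one; the household-size distribution and attributes below are illustrative placeholders, not the authors' rules derived from census data:

```python
import random

# Step 1 of a synthetic-population build (illustrative only): draw household
# sizes from an assumed distribution and fill each household with individuals.
random.seed(0)
SIZE_DIST = [(1, 0.15), (2, 0.25), (3, 0.35), (4, 0.15), (5, 0.10)]  # assumed

def draw_household_size():
    r, acc = random.random(), 0.0
    for size, p in SIZE_DIST:
        acc += p
        if r < acc:
            return size
    return SIZE_DIST[-1][0]

def make_individual(hh_id):
    return {"household": hh_id,
            "age": random.randint(0, 90),
            "sex": random.choice("MF")}

target = 10000                      # number of individuals to generate
population, hh_id = [], 0
while len(population) < target:
    for _ in range(draw_household_size()):
        population.append(make_individual(hh_id))
    hh_id += 1

print(len(population), "individuals in", hh_id, "households")
```

The second step would then wire friendship and kinship links between members of the generated households.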
Real simulation tools in introductory courses: packaging and repurposing our research code.
NASA Astrophysics Data System (ADS)
Heagy, L. J.; Cockett, R.; Kang, S.; Oldenburg, D.
2015-12-01
Numerical simulations are an important tool for scientific research and applications in industry. They provide a means to experiment with physics in a tangible, visual way, often providing insights into the problem. Over the last two years, we have been developing course and laboratory materials for an undergraduate geophysics course primarily taken by non-geophysics majors, including engineers and geologists. Our aim is to provide the students with resources to build intuition about geophysical techniques, promote curiosity-driven exploration, and help them develop the skills necessary to communicate across disciplines. Using open-source resources and our existing research code, we have built modules around simulations, with supporting content that gives students interactive tools for exploring the impacts of input parameters and visualizing the resulting fields, fluxes and data for a variety of problems in applied geophysics, including magnetics, seismic, electromagnetics, and direct current resistivity. The content provides context for the problems, along with exercises that are aimed at getting students to experiment and ask 'what if...?' questions. In this presentation, we will discuss our approach for designing the structure of the simulation-based modules, the resources we have used, the challenges we have encountered, general feedback from students and instructors, as well as our goals and roadmap for future improvement. We hope that our experiences and approach will be beneficial to other instructors who aim to put simulation tools in the hands of students.
WholeCellSimDB: a hybrid relational/HDF database for whole-cell model predictions
Karr, Jonathan R.; Phillips, Nolan C.; Covert, Markus W.
2014-01-01
Mechanistic ‘whole-cell’ models are needed to develop a complete understanding of cell physiology. However, extracting biological insights from whole-cell models requires running and analyzing large numbers of simulations. We developed WholeCellSimDB, a database for organizing whole-cell simulations. WholeCellSimDB was designed to enable researchers to search simulation metadata to identify simulations for further analysis, and quickly slice and aggregate simulation results data. In addition, WholeCellSimDB enables users to share simulations with the broader research community. The database uses a hybrid relational/hierarchical data format architecture to efficiently store and retrieve both simulation setup metadata and results data. WholeCellSimDB provides a graphical Web-based interface to search, browse, plot and export simulations; a JavaScript Object Notation (JSON) Web service to retrieve data for Web-based visualizations; a command-line interface to deposit simulations; and a Python API to retrieve data for advanced analysis. Overall, we believe WholeCellSimDB will help researchers use whole-cell models to advance basic biological science and bioengineering. Database URL: http://www.wholecellsimdb.org Source code repository URL: http://github.com/CovertLab/WholeCellSimDB PMID:25231498
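A minimal sketch of the general hybrid pattern described above, with searchable metadata in a relational store and bulky results arrays in HDF5; the table layout and dataset paths are illustrative and are not the actual WholeCellSimDB schema or API:

```python
import sqlite3
import numpy as np
import h5py

# Illustrative hybrid storage: simulation metadata in SQLite (searchable),
# large results arrays in HDF5.  Not the actual WholeCellSimDB schema or API.
meta = sqlite3.connect("simulations.sqlite")
meta.execute("""CREATE TABLE IF NOT EXISTS simulation
                (id INTEGER PRIMARY KEY, label TEXT, length_s REAL, hdf5_path TEXT)""")

sim_id, label, length_s = 1, "wildtype_run_001", 30000.0
h5_path = f"sim_{sim_id}.h5"
growth = np.random.default_rng(0).random(1000)      # placeholder results series

with h5py.File(h5_path, "w") as f:
    f.create_dataset("states/growth", data=growth, compression="gzip")

meta.execute("INSERT OR REPLACE INTO simulation VALUES (?, ?, ?, ?)",
             (sim_id, label, length_s, h5_path))
meta.commit()

# Later: search the metadata, then pull only the needed array from HDF5
row = meta.execute("SELECT hdf5_path FROM simulation WHERE length_s > 1e4").fetchone()
with h5py.File(row[0], "r") as f:
    print(f["states/growth"][:10])
```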
Statistical variances of diffusional properties from ab initio molecular dynamics simulations
NASA Astrophysics Data System (ADS)
He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei
2018-12-01
Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
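For context, the quantities being estimated are typically the tracer diffusivity from the mean-squared displacement and the ionic conductivity from the Nernst-Einstein relation (standard expressions, not quoted from the paper):

$$D=\lim_{t\to\infty}\frac{\bigl\langle|\mathbf{r}(t)-\mathbf{r}(0)|^{2}\bigr\rangle}{2\,d\,t},\qquad \sigma=\frac{Nq^{2}}{Vk_{B}T}\,D,$$

where $$d$$ is the dimensionality, $$N/V$$ the carrier density and $$q$$ the carrier charge; the statistical variance of both quantities shrinks as more discrete hopping events are sampled during the AIMD run.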
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Liu, Nan-Suey
2009-01-01
Very large eddy simulation (VLES) of the nonreacting turbulent flow in a single-element lean direct injection (LDI) combustor has been successfully performed via the approach known as the partially resolved numerical simulation (PRNS/VLES) using a nonlinear subscale model. The grid is the same as the one used in a previous RANS simulation, which was considered as too coarse for a traditional LES simulation. In this study, we first carry out a steady RANS simulation to provide the initial flow field for the subsequent PRNS/VLES simulation. We have also carried out an unsteady RANS (URANS) simulation for the purpose of comparing its results with that of the PRNS/VLES simulation. In addition, these calculated results are compared with the experimental data. The present effort has demonstrated that the PRNS/VLES approach, while using a RANS type of grid, is able to reveal the dynamically important, unsteady large-scale turbulent structures occurring in the flow field of a single-element LDI combustor. The interactions of these coherent structures play a critical role in the dispersion of the fuel, hence, the mixing between the fuel and the oxidizer in a combustor.
Ganesan, Prasanth; Shillieto, Kristina E.; Ghoraani, Behnaz
2018-01-01
Cardiac simulations play an important role in studies aimed at understanding and investigating the mechanisms of cardiac arrhythmias. Today, studies of arrhythmogenesis and maintenance are largely performed by creating simulations of a particular arrhythmia with accuracy comparable to the results of clinical experiments. Atrial fibrillation (AF), the most common arrhythmia in the United States and many other parts of the world, is one of the major fields where simulation and modeling are widely used. AF simulations not only assist in understanding its mechanisms but also help to develop, evaluate and improve the computer algorithms used in electrophysiology (EP) systems for ablation therapies. In this paper, we begin with a brief overview of some common techniques used to simulate two major AF mechanisms – spiral waves (or rotors) and point (or focal) sources. We particularly focus on 2D simulations using Nygren et al.’s mathematical model of the human atrial cell. Then, we elucidate an application of the developed AF simulation to an algorithm designed for localizing AF rotors to improve current AF ablation therapies. Our simulation methods and results, along with the other discussions presented in this paper, are aimed at providing engineers and professionals with a working knowledge of application-specific simulations of spirals and foci. PMID:29629398
Mixed reality ventriculostomy simulation: experience in neurosurgical residency.
Hooten, Kristopher G; Lister, J Richard; Lombard, Gwen; Lizdas, David E; Lampotang, Samsun; Rajon, Didier A; Bova, Frank; Murad, Gregory J A
2014-12-01
Medicine and surgery are turning toward simulation to improve on limited patient interaction during residency training. Many simulators today use virtual reality with augmented haptic feedback and little to no physical elements. In a collaborative effort, the University of Florida Department of Neurosurgery and the Center for Safety, Simulation & Advanced Learning Technologies created a novel "mixed" physical and virtual simulator to mimic the ventriculostomy procedure. The simulator contains all the physical components encountered during the procedure, with superimposed 3-D virtual elements for the neuroanatomical structures. The objective was to introduce the ventriculostomy simulator and validate it as a necessary training tool in neurosurgical residency. We tested the simulator with more than 260 residents. An algorithm combining time and accuracy was used to grade performance. Voluntary postperformance surveys were used to evaluate the experience. Results demonstrate that more experienced residents had statistically significantly better scores and completed the procedure in less time than inexperienced residents. Survey results revealed that most residents agreed that practice on the simulator would help with future ventriculostomies. This mixed reality simulator provides a real-life experience and will be an instrumental tool in training the next generation of neurosurgeons. We have now implemented a standard whereby incoming residents must prove efficiency and skill on the simulator before their first interaction with a patient.
Comparing Macroscale and Microscale Simulations of Porous Battery Electrodes
Higa, Kenneth; Wu, Shao-Ling; Parkinson, Dilworth Y.; ...
2017-06-22
This article describes a vertically-integrated exploration of NMC electrode rate limitations, combining experiments with corresponding macroscale (macro-homogeneous) and microscale models. Parameters common to both models were obtained from experiments or based on published results. Positive electrode tortuosity was the sole fitting parameter used in the macroscale model, while the microscale model used no fitting parameters, instead relying on microstructural domains generated from X-ray microtomography of pristine electrode material held under compression while immersed in electrolyte solution (additionally providing novel observations of electrode wetting). Macroscale simulations showed that the capacity decrease observed at higher rates resulted primarily from solution-phase diffusion resistance. This ability to provide such qualitative insights at low computational costs is a strength of macroscale models, made possible by neglecting electrode spatial details. To explore the consequences of such simplification, the corresponding, computationally-expensive microscale model was constructed. This was found to have limitations preventing quantitatively accurate predictions, for reasons that are discussed in the hope of guiding future work. Nevertheless, the microscale simulation results complement those of the macroscale model by providing a reality check based on microstructural information; in particular, this novel comparison of the two approaches suggests a reexamination of salt diffusivity measurements.
Voss, Clifford I.; Simmons, Craig T.; Robinson, Neville I.
2010-01-01
This benchmark for three-dimensional (3D) numerical simulators of variable-density groundwater flow and solute or energy transport consists of matching simulation results with the semi-analytical solution for the transition from one steady-state convective mode to another in a porous box. Previous experimental and analytical studies of natural convective flow in an inclined porous layer have shown that there are a variety of convective modes possible depending on system parameters, geometry and inclination. In particular, there is a well-defined transition from the helicoidal mode consisting of downslope longitudinal rolls superimposed upon an upslope unicellular roll to a mode consisting of purely an upslope unicellular roll. Three-dimensional benchmarks for variable-density simulators are currently (2009) lacking and comparison of simulation results with this transition locus provides an unambiguous means to test the ability of such simulators to represent steady-state unstable 3D variable-density physics.
Experiments and FEM simulations of fracture behaviors for ADC12 aluminum alloy under impact load
NASA Astrophysics Data System (ADS)
Hu, Yumei; Xiao, Yue; Jin, Xiaoqing; Zheng, Haoran; Zhou, Yinge; Shao, Jinhua
2016-11-01
Using a combination of experiments and simulation, the fracture behavior of the brittle ADC12 aluminum alloy was studied. Five typical experiments were carried out on this material, with corresponding data collected for different stress states and dynamic strain rates. Fractographs revealed that the morphologies of the fractured specimens differed among strain rates, indicating that the fracture was predominantly brittle in nature. Simulations of the fracture processes of those specimens were conducted by the finite element method, and consistency was observed between simulations and experiments. In the simulations, the Johnson-Cook model was chosen to describe the damage development and to predict failure, using parameters determined from the experimental data. Subsequently, an ADC12 engine mount bracket crash simulation was conducted, and the results indicated good agreement with the experiments. This agreement shows that our research can provide an accurate description of the deformation and fracture processes of the studied alloy.
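For reference, the Johnson-Cook damage formulation mentioned above expresses the equivalent strain at failure in the standard form (the calibrated parameter values are those determined in the paper and are not reproduced here):

$$\varepsilon_{f}=\bigl[D_{1}+D_{2}\exp(D_{3}\sigma^{*})\bigr]\bigl[1+D_{4}\ln\dot{\varepsilon}^{*}\bigr]\bigl[1+D_{5}T^{*}\bigr],$$

where $$\sigma^{*}$$ is the stress triaxiality, $$\dot{\varepsilon}^{*}$$ the normalized plastic strain rate and $$T^{*}$$ the homologous temperature; damage accumulates as $$D=\sum\Delta\varepsilon_{p}/\varepsilon_{f}$$ and failure is predicted at $$D=1$$.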
Numerical simulations of crystal growth in a transdermal drug delivery system
NASA Astrophysics Data System (ADS)
Zeng, Jianming; Jacob, Karl I.; Tikare, Veena
2004-02-01
Grain growth by precipitation and Ostwald ripening in an unstressed matrix of a dissolved crystallizable component was simulated using a kinetic Monte Carlo model. This model was used previously to study Ostwald ripening in the regime of high crystallizable-component content and was shown to correctly simulate solution, diffusion and precipitation. In this study, the same model, with modifications, was applied to the low-content regime of interest to the transdermal drug delivery system (TDS) community. We demonstrate the model's utility by simulating precipitation and grain growth during isothermal storage at different supersaturation conditions. The simulation results provide a first approximation for the crystallization occurring in TDS. It has been reported that, at relatively high temperatures, growth of drug crystals in TDS occurs only in the middle third of the polymer layer. The simulation results support these findings: for cluster growth at relatively high temperature, crystal growth is limited to the middle third of the region, where the availability of crystallizable components is highest.
Numerical simulation of the effect of regular and sub-caliber projectiles on military bunkers
NASA Astrophysics Data System (ADS)
Jiricek, Pavel; Foglar, Marek
2015-09-01
One of the most demanding topics in blast and impact engineering is the modelling of projectile impact. To introduce this topic, a set of numerical simulations was undertaken. The simulations study the impact of regular and sub-calibre projectiles on Czech pre-WW2 military bunkers. The penetrations of these military objects are well documented and can be used for comparison. The numerical model consists of part of a wall of a military object. The concrete block is subjected to the impact of a regular and a sub-calibre projectile. The model is divided into layers to simplify the evaluation of the results. The simulations are performed with the ANSYS AUTODYN software. A nonlinear material model with damage and an incorporated strain-rate effect was used. The results of the numerical simulations are evaluated in terms of the damage of the concrete block. The progress of the damage is described versus time. The numerical simulations show good agreement with the documented penetrations.
Science yield modeling with the Exoplanet Open-Source Imaging Mission Simulator (EXOSIMS)
NASA Astrophysics Data System (ADS)
Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Morgan, Rhonda
2016-08-01
We report on our ongoing development of EXOSIMS and mission simulation results for WFIRST. We present the interface control and the modular structure of the software, along with corresponding prototypes and class definitions for some of the software modules. More specifically, we focus on describing the main steps of our high-fidelity mission simulator EXOSIMS, i.e., the definition of the completeness, optical system, and zodiacal light modules; the filtering of the target list module; and the creation of a planet population within our simulated universe module. For the latter, we introduce the integration of a recent mass-radius model from the FORECASTER software. We also provide custom modules dedicated to WFIRST using both the Hybrid Lyot Coronagraph (HLC) and the Shaped Pupil Coronagraph (SPC) for detection and characterization, respectively. In that context, we show and discuss the results of some preliminary WFIRST simulations, focusing on comparing different methods of integration time calculation through ensembles (large numbers) of survey simulations.
Numerical simulations in the development of propellant management devices
NASA Astrophysics Data System (ADS)
Gaulke, Diana; Winkelmann, Yvonne; Dreyer, Michael
Propellant management devices (PMDs) are used for positioning the propellant at the propellant port. It is important to provide propellant without gas bubbles. Gas bubbles can induce cavitation and may lead to system failures in the worst case. Therefore, the reliable operation of such devices must be guaranteed. Testing these complex systems is a very intricate process. Furthermore, in most cases only tests with downscaled geometries are possible. Numerical simulations are used here as an aid to optimize the tests and to predict certain results. Based on these simulations, parameters can be determined in advance and parts of the equipment can be adjusted in order to minimize the number of experiments. In return, the simulations are validated against the test results. Furthermore, if the accuracy of the numerical prediction is verified, then numerical simulations can be used for validating the scaling of the experiments. This presentation demonstrates selected numerical simulations for the development of PMDs at ZARM.
Dynamics Modeling and Simulation of Large Transport Airplanes in Upset Conditions
NASA Technical Reports Server (NTRS)
Foster, John V.; Cunningham, Kevin; Fremaux, Charles M.; Shah, Gautam H.; Stewart, Eric C.; Rivers, Robert A.; Wilborn, James E.; Gato, William
2005-01-01
As part of NASA's Aviation Safety and Security Program, research has been in progress to develop aerodynamic modeling methods for simulations that accurately predict the flight dynamics characteristics of large transport airplanes in upset conditions. The motivation for this research stems from the recognition that simulation is a vital tool for addressing loss-of-control accidents, including applications to pilot training, accident reconstruction, and advanced control system analysis. The ultimate goal of this effort is to contribute to the reduction of the fatal accident rate due to loss-of-control. Research activities have involved accident analyses, wind tunnel testing, and piloted simulation. Results have shown that significant improvements in simulation fidelity for upset conditions, compared to current training simulations, can be achieved using state-of-the-art wind tunnel testing and aerodynamic modeling methods. This paper provides a summary of research completed to date and includes discussion on key technical results, lessons learned, and future research needs.
Modeling unstable alcohol flooding of DNAPL-contaminated columns
NASA Astrophysics Data System (ADS)
Roeder, Eberhard; Falta, Ronald W.
Alcohol flooding, consisting of injection of a mixture of alcohol and water, is one source-removal technology for dense non-aqueous phase liquids (DNAPLs) currently under investigation. An existing compositional multiphase flow simulator (UTCHEM) was adapted to accurately represent the equilibrium phase behavior of ternary and quaternary alcohol/DNAPL systems. Simulator predictions were compared to laboratory column experiments and the results are presented here. It was found that several experiments involved unstable displacements of the NAPL bank by the alcohol flood or of the alcohol flood by the following water flood. Unstable displacement led to additional mixing compared to ideal displacement. This mixing was approximated by a large dispersion in one-dimensional simulations and/or by including permeability heterogeneities on a very small scale in three-dimensional simulations. Three-dimensional simulations provided the best match. Simulations of unstable displacements require either high-resolution grids, or need to consider the mixing of fluids in a different manner to capture the resulting effects on NAPL recovery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-11-01
Massive first-principles simulation provides insight into flame anchoring in a hydrogen-rich jet in cross-flow. When gas turbine designers want to use gasified biomass for stationary power generation, they are faced with a challenge: bio-derived syngas typically contains significant amounts of hydrogen, which is far more reactive than the methane that is the traditional gas turbine fuel. This reactivity leads to a safety design issue, because with hydrogen-rich fuels a flame may anchor in the fuel injection section of the combustor instead of at the downstream design point. In collaboration with Jacqueline Chen of Sandia National Laboratories and Andrea Gruber of SINTEF, a Norwegian energy think tank, the National Renewable Energy Laboratory (NREL) is carrying out fundamental simulations to provide new insight into the physics of flame anchoring in canonical 'jet in cross-flow' configurations using hydrogen-rich fuels. To deal with the large amount and complexity of the data, the combustion scientists also teamed up with computer scientists from across the U.S. Department of Energy's laboratories to develop novel ways to analyze the data. These simulations have shown that fine-scale turbulence structures formed at the jet boundary provide particularly intense mixing between the fuel and air, which then enters a quiescent region formed downstream of the jet in a separate, larger turbulent structure. This insight explains the observation that reducing the wall-normal velocity of the fuel jet causes the flame to blow off; with the aid of the simulation, this counterintuitive result is now understood: reducing the wall-normal velocity reduces the intensity of the mixing and moves the quiescent region farther downstream. NREL and its research partners are conducting simulations that provide new insight into the physics of flame anchoring in canonical 'jet in cross-flow' configurations using hydrogen-rich fuels. Simulation results explain the mechanism behind flame blow-off occurring when a component in the cross-flow direction is progressively added to the jet velocity vector, thereby reducing the relative impact of its wall-normal velocity component. Understanding the mechanism for flame anchoring aids the design of fuel injection nozzles that meet safety requirements when using hydrogen-rich fuels.
TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenwood, Michael S; Cetiner, Mustafa S; Fugate, David L
Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary functions to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica programming language based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.
Cluster-Expansion Model for Complex Quinary Alloys: Application to Alnico Permanent Magnets
NASA Astrophysics Data System (ADS)
Nguyen, Manh Cuong; Zhou, Lin; Tang, Wei; Kramer, Matthew J.; Anderson, Iver E.; Wang, Cai-Zhuang; Ho, Kai-Ming
2017-11-01
An accurate and transferable cluster-expansion model for complex quinary alloys is developed. Lattice Monte Carlo simulation enabled by this cluster-expansion model is used to investigate the temperature-dependent atomic structure of alnico alloys, which are considered promising high-performance non-rare-earth permanent-magnet materials for high-temperature applications. The results of the Monte Carlo simulations are consistent with available experimental data and provide useful insights into phase decomposition, selection, and chemical ordering in alnico. The simulations also reveal a previously unrecognized D03 alloy phase. This phase is very rich in Ni and exhibits very weak magnetization. Manipulating the size and location of this phase provides a possible route to improve the magnetic properties of alnico, especially the coercivity.
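For context, the generic cluster-expansion form underlying such a model writes the configurational energy as (standard expression, not the specific effective interactions fitted in the paper):

$$E(\boldsymbol{\sigma})=\sum_{\alpha}m_{\alpha}J_{\alpha}\,\bigl\langle\Phi_{\alpha}(\boldsymbol{\sigma})\bigr\rangle,$$

where $$\boldsymbol{\sigma}$$ encodes the occupation of each lattice site by one of the five elements, $$\Phi_{\alpha}$$ are cluster correlation functions, $$J_{\alpha}$$ the effective cluster interactions, and $$m_{\alpha}$$ the cluster multiplicities; this inexpensive energy model is what makes lattice Monte Carlo sampling of the quinary composition space tractable.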
A fully coupled flow simulation around spacecraft in low earth orbit
NASA Technical Reports Server (NTRS)
Justiz, C. R.; Sega, R. M.
1991-01-01
The primary objective of this investigation is to provide a full flow simulation of a spacecraft in low earth orbit (LEO). Due to the nature of the environment, the simulation includes the highly coupled effects of neutral particle flow, free stream plasma flow, nonequilibrium gas dynamics effects, spacecraft charging and electromagnetic field effects. Emphasis is placed on the near wake phenomenon and will be verified in space by the Wake Shield Facility (WSF) and developed for application to Space Station conditions as well as for other spacecraft. The WSF is a metallic disk-type structure that will provide a controlled space platform for highly accurate measurements. Preliminary results are presented for a full flow around a metallic disk.
Tomographic inversion of satellite photometry. II
NASA Technical Reports Server (NTRS)
Solomon, S. C.; Hays, P. B.; Abreu, V. J.
1985-01-01
A method for combining nadir observations of emission features in the upper atmosphere with the result of a tomographic inversion of limb brightness measurements is presented. Simulated and actual results are provided, and error sensitivity is investigated.
ERIC Educational Resources Information Center
Jackett, Dwane
1990-01-01
Described is a science activity which illustrates the principle of uncertainty using a computer simulation of bacterial reproduction. Procedures and results are discussed. Several illustrations of results are provided. The availability of a computer program is noted. (CW)
Global Magnetosphere Modeling With Kinetic Treatment of Magnetic Reconnection
NASA Astrophysics Data System (ADS)
Toth, G.; Chen, Y.; Gombosi, T. I.; Cassak, P.; Markidis, S.; Peng, B.; Henderson, M. G.
2017-12-01
Global magnetosphere simulations with a kinetic treatment of magnetic reconnection are very challenging because of the large separation of global and kinetic scales. We have developed two algorithms that can overcome these difficulties: 1) the two-way coupling of the global magnetohydrodynamic code with an embedded particle-in-cell model (MHD-EPIC) and 2) the artificial increase of the ion and electron kinetic scales. Both of these techniques improve the efficiency of the simulations by many orders of magnitude. We will describe the techniques and show that they provide correct and meaningful results. Using the coupled model and the increased kinetic scales, we will present global magnetosphere simulations with the PIC domains covering the dayside and/or tail reconnection sites. The simulation results will be compared to and validated with MMS observations.
Numerical simulation of plasma processes driven by transverse ion heating
NASA Technical Reports Server (NTRS)
Singh, Nagendra; Chan, C. B.
1993-01-01
The plasma processes driven by transverse ion heating in a diverging flux tube are investigated with numerical simulation. The heating is found to drive a host of plasma processes, in addition to the well-known phenomenon of ion conics. The downward electric field near the reverse shock generates a double-streaming situation consisting of two upflowing ion populations with different average flow velocities. The electric field in the reverse shock region is modulated by the ion-ion instability driven by the multistreaming ions. The oscillating fields in this region have the possibility of heating electrons. These results from the simulations are compared with results from a previous study based on a hydrodynamical model. Effects of the spatial resolution provided by the simulations on the evolution of the plasma are discussed.
Waterhammer Transient Simulation and Model Anchoring for the Robotic Lunar Lander Propulsion System
NASA Technical Reports Server (NTRS)
Stein, William B.; Trinh, Huu P.; Reynolds, Michael E.; Sharp, David J.
2011-01-01
Waterhammer transients have the potential to adversely impact propulsion system design if not properly addressed. Waterhammer can potentially lead to system plumbing and component damage. Multi-thruster propulsion systems also develop constructive/destructive wave interference, which becomes difficult to predict without detailed models. Therefore, it is important to sufficiently characterize propulsion system waterhammer in order to develop a robust design with minimal impact to other systems. A risk reduction activity was performed at Marshall Space Flight Center to develop a tool for estimating waterhammer through the use of anchored simulation for the Robotic Lunar Lander (RLL) propulsion system design. Testing was performed to simulate waterhammer surges due to rapid valve closure and consisted of twenty-two series of waterhammer tests, resulting in more than 300 valve actuations. These tests were performed using different valve actuation schemes and three system pressures. Data from the valve characterization tests were used to anchor the models, which employed MSCSoftware.EASY5 v.2010 to model transient fluid phenomena using transient forms of mass and energy conservation. The anchoring process was performed by comparing initial model results to experimental data and then iterating the model input to match the simulation results with the experimental data. The models provide good correlation with experimental results, supporting the use of EASY5 as a tool to model fluid transients and providing a baseline for future RLL system modeling. This paper addresses tasks performed during the waterhammer risk reduction activity for the RLL propulsion system. The problem of waterhammer simulation anchoring as applied to the RLL system is discussed, with results from the corresponding experimental valve tests. Important factors for waterhammer mitigation are discussed, along with potential design impacts to the RLL propulsion system.
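For rough scoping, the classical Joukowsky relation gives the surge pressure from an effectively instantaneous valve closure (textbook estimate, not the anchored EASY5 transient model used in the study):

$$\Delta P=\rho\,a\,\Delta v,$$

where $$\rho$$ is the propellant density, $$a$$ the acoustic wave speed in the line, and $$\Delta v$$ the change in flow velocity; closures slower than the pipe period $$2L/a$$ produce correspondingly smaller surges.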
Speed Control Law for Precision Terminal Area In-Trail Self Spacing
NASA Technical Reports Server (NTRS)
Abbott, Terence S.
2002-01-01
This document describes a speed control law for precision in-trail airborne self-spacing during final approach. This control law was designed to provide an operationally viable means to obtain a desired runway threshold crossing time or minimum distance, one aircraft relative to another. The control law compensates for dissimilar final approach speeds between aircraft pairs and provides guidance for a stable final approach. This algorithm has been extensively tested in Monte Carlo simulation and has been evaluated in piloted simulation, with preliminary results indicating acceptability from operational and workload standpoints.
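As a purely illustrative sketch of the generic idea (a speed command driven by the spacing-time error, subject to approach-speed limits), and emphatically not the control law documented in the report:

```python
# Illustrative proportional spacing-control step, NOT the control law in the
# report: command a speed correction proportional to the spacing-time error.
def commanded_speed(own_speed, lead_ttg, own_ttg, desired_interval,
                    gain=0.1, v_min=110.0, v_max=170.0):
    """own_speed [kt]; times-to-go and desired_interval [s]; limits assumed."""
    spacing_error = (own_ttg - lead_ttg) - desired_interval   # >0 means too far behind
    cmd = own_speed + gain * spacing_error                    # speed up if behind
    return max(v_min, min(v_max, cmd))                        # respect approach limits

print(commanded_speed(own_speed=140.0, lead_ttg=300.0, own_ttg=420.0,
                      desired_interval=100.0))
```

The actual law additionally compensates for dissimilar final approach speeds between the aircraft pair and shapes the command for a stable final approach, as described above.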
Mesoscale energy deposition footprint model for kiloelectronvolt cluster bombardment of solids.
Russo, Michael F; Garrison, Barbara J
2006-10-15
Molecular dynamics simulations have been performed to model 5-keV C60 and Au3 projectile bombardment of an amorphous water substrate. The goal is to obtain detailed insights into the dynamics of motion in order to develop a straightforward and less computationally demanding model of the process of ejection. The molecular dynamics results provide the basis for the mesoscale energy deposition footprint model. This model provides a method for predicting relative yields based on information from less than 1 ps of simulation time.
Design for progressive fracture in composite shell structures
NASA Technical Reports Server (NTRS)
Minnetyan, Levon; Murthy, Pappu L. N.
1992-01-01
The load carrying capability and structural behavior of composite shell structures and stiffened curved panels are investigated to provide accurate early design loads. An integrated computer code is utilized for the computational simulation of composite structural degradation under practical loading for realistic design. Damage initiation, growth, accumulation, and propagation to structural fracture are included in the simulation. Progressive fracture investigations providing design insight for several classes of composite shells are presented. Results demonstrate the significance of local defects, interfacial regions, and stress concentrations on the structural durability of composite shells.
Simulation of urban land surface temperature based on sub-pixel land cover in a coastal city
NASA Astrophysics Data System (ADS)
Zhao, Xiaofeng; Deng, Lei; Feng, Huihui; Zhao, Yanchuang
2014-11-01
Sub-pixel urban land cover has been shown to correlate strongly with land surface temperature (LST), yet these relationships have seldom been used to simulate LST. In this study we provide a new approach to urban LST simulation based on sub-pixel land cover modeling. Landsat TM/ETM+ images of Xiamen city, China, from January 2002 and January 2007 were used to derive land cover and then extract the transformation rule using logistic regression. The transformation probability, after normalization, was taken as the class percentage within each pixel. Cellular automata were then used to produce simulated sub-pixel land cover for 2007 and 2017. In parallel, the correlations between the retrieved LST and the sub-pixel land cover obtained by spectral mixture analysis in 2002 were examined and a regression model was built. This regression model was applied to the simulated 2007 land cover to model the LST of 2007. Finally, the LST of 2017 was simulated for use in urban planning and management. The results show that our method is useful for LST simulation. Although the simulation accuracy is not yet fully satisfactory, the approach provides a promising starting point for the modeling of urban LST.
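A minimal sketch of the final regression step (fitting retrieved LST against sub-pixel land-cover fractions, then applying the fit to simulated fractions); all arrays below are synthetic placeholders rather than the Xiamen data:

```python
import numpy as np

# Illustrative regression of LST on sub-pixel land-cover fractions and its
# application to simulated fractions for a later date.  Arrays are placeholders
# for per-pixel values derived from Landsat TM/ETM+ imagery.
rng = np.random.default_rng(3)
n_pix = 5000
frac_2002 = rng.dirichlet([2.0, 2.0, 2.0], n_pix)          # impervious, vegetation, water
lst_2002 = 300 + 8*frac_2002[:, 0] - 5*frac_2002[:, 1] - 3*frac_2002[:, 2] \
           + rng.normal(0.0, 0.5, n_pix)                    # retrieved LST [K] (synthetic)

X = np.column_stack([np.ones(n_pix), frac_2002])            # intercept + fractions
coef, *_ = np.linalg.lstsq(X, lst_2002, rcond=None)         # fit LST ~ fractions

frac_2007_sim = rng.dirichlet([3.0, 1.5, 1.5], n_pix)       # CA-simulated fractions
lst_2007_sim = np.column_stack([np.ones(n_pix), frac_2007_sim]) @ coef
print(lst_2007_sim.mean())
```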
User's guide to resin infusion simulation program in the FORTRAN language
NASA Technical Reports Server (NTRS)
Weideman, Mark H.; Hammond, Vince H.; Loos, Alfred C.
1992-01-01
RTMCL is a user-friendly computer code which simulates the manufacture of fabric composites by the resin infusion process. The computer code is based on the process simulation model described in reference 1. Included in the user's guide is a detailed step-by-step description of how to run the program and enter and modify the input data set. Sample input and output files are included along with an explanation of the results. Finally, a complete listing of the program is provided.
2015-08-27
applied reverse voltage [8], [9]. In this report, the experimental results of a varactor diode NLTL built with 30 sections are presented. In addition, Spice ...capacitive line (NLCL) using commercial BT and PZT ceramic capacitors. A corresponding NLCL Spice simulation is provided for comparison with experimental...the output pulse. In particular, for PZT, a Spice simulation of a line with the respective linear capacitors illustrates its weak nonlinearity as the
Aeroacoustic and Performance Simulations of a Test Scale Open Rotor
NASA Technical Reports Server (NTRS)
Claus, Russell W.
2013-01-01
This paper explores a comparison between experimental data and numerical simulations of the historical baseline F31/A31 open rotor geometry. The experimental data were obtained at the NASA Glenn Research Center's Aeroacoustic facility and include performance and noise information for a variety of flow speeds (matching take-off and cruise). The numerical simulations provide both performance and aeroacoustic results using NUMECA's Fine-Turbo analysis code. A non-linear harmonic method is used to capture the rotor/rotor interaction.
Investigation of Asymmetric Thrust Detection with Demonstration in a Real-Time Simulation Testbed
NASA Technical Reports Server (NTRS)
Chicatelli, Amy; Rinehart, Aidan W.; Sowers, T. Shane; Simon, Donald L.
2015-01-01
The purpose of this effort is to develop, demonstrate, and evaluate three asymmetric thrust detection approaches to aid in the reduction of asymmetric thrust-induced aviation accidents. This paper presents the results from that effort and their evaluation in simulation studies, including those from a real-time flight simulation testbed. Asymmetric thrust is recognized as a contributing factor in several Propulsion System Malfunction plus Inappropriate Crew Response (PSM+ICR) aviation accidents. As an improvement over the state-of-the-art, providing annunciation of asymmetric thrust to alert the crew may hold safety benefits. For this, the reliable detection and confirmation of asymmetric thrust conditions is required. For this work, three asymmetric thrust detection methods are presented along with their results obtained through simulation studies. Representative asymmetric thrust conditions are modeled in simulation based on failure scenarios similar to those reported in aviation incident and accident descriptions. These simulated asymmetric thrust scenarios, combined with actual aircraft operational flight data, are then used to conduct a sensitivity study regarding the detection capabilities of the three methods. Additional evaluation results are presented based on pilot-in-the-loop simulation studies conducted in the NASA Glenn Research Center (GRC) flight simulation testbed. Data obtained from this flight simulation facility are used to further evaluate the effectiveness and accuracy of the asymmetric thrust detection approaches. Generally, the asymmetric thrust conditions are correctly detected and confirmed.
NASA Astrophysics Data System (ADS)
Núñez, M.; Robie, T.; Vlachos, D. G.
2017-10-01
Kinetic Monte Carlo (KMC) simulation provides insights into catalytic reactions unobtainable with either experiments or mean-field microkinetic models. Sensitivity analysis of KMC models assesses the robustness of the predictions to parametric perturbations and identifies rate determining steps in a chemical reaction network. Stiffness in the chemical reaction network, a ubiquitous feature, demands lengthy run times for KMC models and renders efficient sensitivity analysis based on the likelihood ratio method unusable. We address the challenge of efficiently conducting KMC simulations and performing accurate sensitivity analysis in systems with unknown time scales by employing two acceleration techniques: rate constant rescaling and parallel processing. We develop statistical criteria that ensure sufficient sampling of non-equilibrium steady state conditions. Our approach provides the twofold benefit of accelerating the simulation itself and enabling likelihood ratio sensitivity analysis, which provides further speedup relative to finite difference sensitivity analysis. As a result, the likelihood ratio method can be applied to real chemistry. We apply our methodology to the water-gas shift reaction on Pt(111).
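To make the likelihood-ratio idea above concrete, here is a minimal sketch assuming a toy A -> B reaction with a single rate constant (not the accelerated, rescaled network of the abstract): the sensitivity of an expected observable to a rate constant is estimated by multiplying the observable by the trajectory score function from a plain Gillespie simulation.

```python
# Minimal sketch (not the paper's accelerated method): likelihood-ratio (LR)
# sensitivity of a Gillespie/KMC simulation of A -> B with rate constant k.
# The LR estimator of dE[B(T)]/dk multiplies the observable by the score
#   d(log L)/dk = N_fired / k - integral of A(t) dt.
import numpy as np

rng = np.random.default_rng(0)
A0, k, T, n_traj = 20, 0.5, 2.0, 20000

def one_trajectory():
    a_count, t = A0, 0.0
    n_fired, int_A = 0, 0.0          # reaction firings, and the integral of A(t) dt
    while a_count > 0:
        rate = k * a_count
        dt = rng.exponential(1.0 / rate)
        if t + dt > T:               # truncate the last waiting interval at T
            int_A += a_count * (T - t)
            break
        int_A += a_count * dt
        t += dt
        a_count -= 1                 # one A converts to B
        n_fired += 1
    b_final = A0 - a_count
    score = n_fired / k - int_A      # d(log-likelihood)/dk for this trajectory
    return b_final, score

b, s = np.array([one_trajectory() for _ in range(n_traj)]).T
print("E[B(T)]  MC: %.3f   exact: %.3f" % (b.mean(), A0 * (1 - np.exp(-k * T))))
print("dE/dk    LR: %.3f   exact: %.3f" % (np.mean(b * s), A0 * T * np.exp(-k * T)))
```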
NASA Technical Reports Server (NTRS)
Christhilf, David M.
2014-01-01
It has long been recognized that frequency and phasing of structural modes in the presence of airflow play a fundamental role in the occurrence of flutter. Animation of simulation results for the long, slender Semi-Span Super-Sonic Transport (S4T) wind-tunnel model demonstrates that, for the case of mass-ballasted nacelles, the flutter mode can be described as a traveling wave propagating downstream. Such a characterization provides certain insights, such as (1) describing the means by which energy is transferred from the airflow to the structure, (2) identifying airspeed as an upper limit for speed of wave propagation, (3) providing an interpretation for a companion mode that coalesces in frequency with the flutter mode but becomes very well damped, (4) providing an explanation for bursts of response to uniform turbulence, and (5) providing an explanation for loss of low frequency (lead) phase margin with increases in dynamic pressure (at constant Mach number) for feedback systems that use sensors located upstream from active control surfaces. Results from simulation animation, simplified modeling, and wind-tunnel testing are presented for comparison. The simulation animation was generated using double time-integration in Simulink of vertical accelerometer signals distributed over wing and fuselage, along with time histories for actuated control surfaces. Crossing points for a zero-elevation reference plane were tracked along a network of lines connecting the accelerometer locations. Accelerometer signals were used in preference to modal displacement state variables in anticipation that the technique could be used to animate motion of the actual wind-tunnel model using data acquired during testing. Double integration of wind-tunnel accelerometer signals introduced severe drift even with removal of both position and rate biases such that the technique does not currently work. Using wind-tunnel data to drive a Kalman filter based upon fitting coefficients to analytical mode shapes might provide a better means to animate the wind tunnel data.
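As an illustration of the drift problem described above, the sketch below double-integrates a synthetic accelerometer signal (an invented 8 Hz mode plus a small bias and noise, not the actual S4T measurements); even after removing position and rate biases, the recovered displacement drifts well beyond the expected modal amplitude.

```python
# Sketch of double time-integration of a biased, noisy accelerometer signal.
# Synthetic data only -- no connection to the actual S4T wind-tunnel signals.
import numpy as np

fs, T = 1000.0, 30.0                      # sample rate [Hz], record length [s]
t = np.arange(0.0, T, 1.0 / fs)
acc_true = 0.5 * np.sin(2 * np.pi * 8.0 * t)       # 8 Hz structural mode
acc_meas = acc_true + 2e-3 + 5e-3 * np.random.default_rng(1).standard_normal(t.size)

def integrate(y, dt):
    # cumulative trapezoid rule, starting from zero
    return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * dt)))

dt = 1.0 / fs
vel = integrate(acc_meas - acc_meas.mean(), dt)    # remove the constant acceleration bias
vel -= np.polyval(np.polyfit(t, vel, 1), t)        # remove residual linear velocity trend
disp = integrate(vel, dt)
disp -= np.polyval(np.polyfit(t, disp, 1), t)      # remove position and rate trends

modal_amp = 0.5 / (2 * np.pi * 8.0) ** 2           # expected modal displacement amplitude
print("expected modal amplitude ~%.2e m, residual drift/peak ~%.2e m"
      % (modal_amp, np.abs(disp).max()))
```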
Simulation in pediatric anesthesiology.
Fehr, James J; Honkanen, Anita; Murray, David J
2012-10-01
Simulation-based training, research and quality initiatives are expanding in pediatric anesthesiology just as in other medical specialties. Various modalities are available, from task trainers to standardized patients, and from computer-based simulations to mannequins. Computer-controlled mannequins can simulate pediatric vital signs with reasonable reliability; however, the fidelity of skin temperature and color change, airway reflexes and breath and heart sounds remains rudimentary. Current pediatric mannequins are utilized in simulation centers, throughout hospitals in-situ, at national meetings for continuing medical education and in research into individual and team performance. Ongoing efforts by pediatric anesthesiologists dedicated to using simulation to improve patient care and educational delivery will result in further dissemination of this technology. Health care professionals who provide complex, subspecialty care to children require a curriculum supported by an active learning environment where skills directly relevant to pediatric care can be developed. The approach is not only the most effective method to educate adult learners, but meets calls for education reform and offers the potential to guide efforts toward evaluating competence. Simulation addresses patient safety imperatives by providing a method for trainees to develop skills and experience in various management strategies, without risk to the health and life of a child. A curriculum that provides pediatric anesthesiologists with the range of skills required in clinical practice settings must include a relatively broad range of task-training devices and electromechanical mannequins. Challenges remain in defining the best integration of this modality into training and clinical practice to meet the needs of pediatric patients. © 2012 Blackwell Publishing Ltd.
NASA Astrophysics Data System (ADS)
Tang, Qiuyan; Wang, Jing; Lv, Pin; Sun, Quan
2015-10-01
The propagation method and the choice of mesh grid are both critical for obtaining correct results in wave-optics simulation. A new angular spectrum propagation method with an alterable mesh grid, based on the traditional angular spectrum method and the direct FFT method, is introduced. With this method, the sampling space after propagation is no longer constrained by the propagation method but is freely alterable. Because the mesh grid chosen on the target plane directly affects the validity of the simulation results, an adaptive mesh-selection method based on the wave characteristics is proposed to accompany the new propagation method, allowing appropriate mesh grids on the target plane to be calculated so that satisfactory results are obtained. For complex initial wave fields or propagation through inhomogeneous media, the mesh grid can likewise be calculated and set rationally with this method. Comparison with theoretical results shows that simulations using the proposed method coincide with theory, and comparison with the traditional angular spectrum method and the direct FFT method shows that the proposed method adapts to a wider range of Fresnel numbers; that is, it can simulate propagation efficiently and correctly for propagation distances from almost zero to infinity. It can therefore better support wave propagation applications such as atmospheric optics and laser propagation.
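For reference, a minimal sketch of the classical fixed-mesh angular spectrum method (the baseline this paper generalizes; the alterable-grid and adaptive mesh-selection steps are not reproduced here):

```python
# Classical angular spectrum propagation: filter the field's spatial-frequency
# spectrum with H = exp(i*k*z*sqrt(1 - (lam*fx)^2 - (lam*fy)^2)).
import numpy as np

def angular_spectrum_propagate(u0, wavelength, dx, z):
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg > 0, np.exp(1j * kz * z), 0.0)   # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(u0) * H)

# Example: 1 mm square aperture, 633 nm plane-wave illumination, propagated 0.1 m.
n, dx = 512, 10e-6
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)
u0 = ((np.abs(X) < 0.5e-3) & (np.abs(Y) < 0.5e-3)).astype(complex)
u1 = angular_spectrum_propagate(u0, 633e-9, dx, 0.1)
print("on-axis intensity after 0.1 m:", np.abs(u1[n // 2, n // 2]) ** 2)
```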
Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0
NASA Technical Reports Server (NTRS)
Knox, J. C.
1996-01-01
The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connect the components via flow streams and define their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users have the capability to control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.
Ribeiro de Oliveira, Marcelo Magaldi; Nicolato, Arthur; Santos, Marcilea; Godinho, Joao Victor; Brito, Rafael; Alvarenga, Alexandre; Martins, Ana Luiza Valle; Prosdocimi, André; Trivelato, Felipe Padovani; Sabbagh, Abdulrahman J; Reis, Augusto Barbosa; Maestro, Rolando Del
2016-05-01
OBJECT The development of neurointerventional treatments of central nervous system disorders has resulted in the need for adequate training environments for novice interventionalists. Virtual simulators offer anatomical definition but lack adequate tactile feedback. Animal models, which provide more lifelike training, require an appropriate infrastructure base. The authors describe a training model for neurointerventional procedures using the human placenta (HP), which affords haptic training with significantly fewer resource requirements, and discuss its validation. METHODS Twelve HPs were prepared for simulated endovascular procedures. Training exercises performed by interventional neuroradiologists and novice fellows were placental angiography, stent placement, aneurysm coiling, and intravascular liquid embolic agent injection. RESULTS The endovascular training exercises proposed can be easily reproduced in the HP. Face, content, and construct validity were assessed by 6 neurointerventional radiologists and 6 novice fellows in interventional radiology. CONCLUSIONS The use of HP provides an inexpensive training model for the training of neurointerventionalists. Preliminary validation results show that this simulation model has face and content validity and has demonstrated construct validity for the interventions assessed in this study.
Compressible Turbulent Channel Flows: DNS Results and Modeling
NASA Technical Reports Server (NTRS)
Huang, P. G.; Coleman, G. N.; Bradshaw, P.; Rai, Man Mohan (Technical Monitor)
1994-01-01
The present paper addresses some topical issues in modeling compressible turbulent shear flows. The work is based on direct numerical simulation of two supersonic fully developed channel flows between very cold isothermal walls. Detailed decomposition and analysis of terms appearing in the momentum and energy equations are presented. The simulation results are used to provide insights into differences between conventional time- and Favre-averaging of the mean-flow and turbulent quantities. Study of the turbulence energy budget for the two cases shows that the compressibility effects due to turbulent density and pressure fluctuations are insignificant. In particular, the dilatational dissipation and the mean product of the pressure and dilatation fluctuations are very small, contrary to the results of simulations for sheared homogeneous compressible turbulence and to recent proposals for models for general compressible turbulent flows. This provides a possible explanation of why the Van Driest density-weighted transformation is so successful in correlating compressible boundary layer data. Finally, it is found that the DNS data do not support the strong Reynolds analogy. A more general representation of the analogy is analysed and shown to match the DNS data very well.
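For readers unfamiliar with the averaging conventions mentioned above, the standard definitions (written out here, not reproduced from the paper itself) are:

```latex
% Favre (density-weighted) averaging and the Van Driest velocity transformation.
\begin{align}
  \tilde{f} &= \frac{\overline{\rho f}}{\bar{\rho}}, \qquad
  f = \tilde{f} + f'', \qquad \overline{\rho f''} = 0, \\[4pt]
  u^{+}_{VD} &= \int_{0}^{\bar{u}^{+}} \sqrt{\frac{\bar{\rho}}{\rho_w}}\, d\bar{u}^{+},
\end{align}
% where rho_w is the wall density; plotted against y+, u+_{VD} collapses
% compressible wall-bounded data onto the incompressible log law.
```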
eVolv2k: A new ice core-based volcanic forcing reconstruction for the past 2000 years
NASA Astrophysics Data System (ADS)
Toohey, Matthew; Sigl, Michael
2016-04-01
Radiative forcing resulting from stratospheric aerosols produced by major volcanic eruptions is a dominant driver of climate variability in the Earth's past. The ability of climate model simulations to accurately recreate past climate is tied directly to the accuracy of the volcanic forcing timeseries used in the simulations. We present here a new volcanic forcing reconstruction, based on newly updated ice core composites from Antarctica and Greenland. Ice core records are translated into stratospheric aerosol properties for use in climate models through the Easy Volcanic Aerosol (EVA) module, which provides an analytic representation of volcanic stratospheric aerosol forcing based on available observations and aerosol model results, prescribing the aerosol's radiative properties and primary modes of spatial and temporal variability. The eVolv2k volcanic forcing dataset covers the past 2000 years, and has been provided for use in the Paleo-Modeling Intercomparison Project (PMIP) and VolMIP experiments within CMIP6. Here, we describe the construction of the eVolv2k data set, compare with prior forcing sets, and show initial simulation results.
Simulation of ICESat-2 canopy height retrievals for different ecosystems
NASA Astrophysics Data System (ADS)
Neuenschwander, A. L.
2016-12-01
Slated for launch in late 2017 (or early 2018), the ICESat-2 satellite will provide a global distribution of geodetic measurements from a space-based laser altimeter of both the terrain surface and relative canopy heights, providing a significant benefit to society through a variety of applications ranging from improved global digital terrain models to the distribution of above-ground vegetation structure. The ATLAS instrument designed for ICESat-2 will utilize a different technology than what is found on most laser mapping systems. The photon counting technology of the ATLAS instrument onboard ICESat-2 will record the arrival time associated with a single photon detection. That detection can occur anywhere within the vertical distribution of the reflected signal, that is, anywhere within the vertical distribution of the canopy. This uncertainty of where the photon will be returned from within the vegetation layer is referred to as the vertical sampling error. Preliminary simulation studies to estimate vertical sampling error have been conducted for several ecosystems including woodland savanna, montane conifers, temperate hardwoods, tropical forest, and boreal forest. The results from these simulations indicate that the canopy heights reported on the ATL08 data product will underestimate the top canopy height in the range of 1-4 m. Although simulation results indicate the ICESat-2 will underestimate top canopy height, there is, however, a strong correlation between ICESat-2 heights and relative canopy height metrics (e.g., RH75, RH90). In tropical forest, simulation results indicate the ICESat-2 height correlates strongly with RH90. Similarly, in temperate broadleaf forest, the simulated ICESat-2 heights were also strongly correlated with RH90. In boreal forest, the simulated ICESat-2 heights are strongly correlated with RH75 heights. It is hypothesized that the correlations between simulated ICESat-2 heights and canopy height metrics are a function of both canopy cover and vegetation physiology (e.g., leaf size/shape), which contribute to the horizontal and vertical structure of the vegetation.
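As a simple illustration of the relative canopy height metrics referred to above (RH75, RH90, etc.), they can be computed as percentiles of canopy photon heights above the terrain surface; the sketch below uses invented photon heights and is not the ATL08 processing algorithm.

```python
# Toy relative-height (RH) canopy metrics from a set of canopy photon heights.
# Generic percentile calculation only -- not the ATL08 processing algorithm.
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical photon heights (m) above the terrain for one along-track segment.
canopy_photon_heights = rng.triangular(left=0.0, mode=18.0, right=25.0, size=400)

rh = {p: np.percentile(canopy_photon_heights, p) for p in (50, 75, 90, 98, 100)}
for p, h in rh.items():
    print(f"RH{p}: {h:5.1f} m")
# Comparing, e.g., RH90 against an independent top-of-canopy estimate shows how
# photon sampling tends to sit below the true maximum canopy height.
```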
CLVTOPS Liftoff and Separation Analysis Validation Using Ares I-X Flight Data
NASA Technical Reports Server (NTRS)
Burger, Ben; Schwarz, Kristina; Kim, Young
2011-01-01
CLVTOPS is a multi-body time domain flight dynamics simulation tool developed by NASA's Marshall Space Flight Center (MSFC) for a space launch vehicle and is based on the TREETOPS simulation tool. CLVTOPS is currently used to simulate the flight dynamics and separation/jettison events of the Ares I launch vehicle including liftoff and staging separation. In order for CLVTOPS to become an accredited tool, validation against other independent simulations and real world data is needed. The launch of the Ares I-X vehicle (first Ares I test flight) on October 28, 2009 presented a great opportunity to provide validation evidence for CLVTOPS. In order to simulate the Ares I-X flight, specific models were implemented into CLVTOPS. These models include the flight day environment, reconstructed thrust, reconstructed mass properties, aerodynamics, and the Ares I-X guidance, navigation and control models. The resulting simulation output was compared to Ares I-X flight data. During the liftoff region of flight, trajectory states from the simulation and flight data were compared. The CLVTOPS results were used to make a semi-transparent animation of the vehicle that was overlaid directly on top of the flight video to provide a qualitative measure of the agreement between the simulation and the actual flight. During ascent, the trajectory states of the vehicle were compared with flight data. For the stage separation event, the trajectory states of the two stages were compared to available flight data. Since no quantitative rotational state data for the upper stage was available, the CLVTOPS results were used to make an animation of the two stages to show a side-by-side comparison with flight video. All of the comparisons between CLVTOPS and the flight data show good agreement. This paper documents comparisons between CLVTOPS and Ares I-X flight data which serve as validation evidence for the eventual accreditation of CLVTOPS.
Demonstration of a High-Order Mode Input Coupler for a 220-GHz Confocal Gyrotron Traveling Wave Tube
NASA Astrophysics Data System (ADS)
Guan, Xiaotong; Fu, Wenjie; Yan, Yang
2018-02-01
A high-order mode input coupler for a 220-GHz confocal gyrotron travelling wave tube is proposed, simulated, and demonstrated by experimental tests. The input coupler is designed to excite the confocal TE06 mode from the rectangular waveguide TE10 mode over a broad frequency range. Simulation results predict an optimized conversion loss of about 2.72 dB with a mode purity in excess of 99%. Based on gyrotron interaction theory, an effective bandwidth of 5 GHz is obtained, within which the beam-wave coupling efficiency is higher than half its maximum. The low-power field pattern demonstrates that the TE06 mode is successfully excited in the confocal waveguide at 220 GHz. Cold-test results from the vector network analyzer show good agreement with the simulations. Both simulation and experimental results show that the reflection at the input port (S11) is sensitive to the perpendicular separation of the two mirrors, which offers a practical way to estimate the assembly precision.
Mehl, S.; Hill, M.C.
2001-01-01
Five common numerical techniques for solving the advection-dispersion equation (finite difference, predictor corrector, total variation diminishing, method of characteristics, and modified method of characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using discrete, randomly distributed, homogeneous blocks of five sand types. This experimental model provides an opportunity to compare the solution techniques: the heterogeneous hydraulic-conductivity distribution of known structure can be accurately represented by a numerical model, and detailed measurements can be compared with simulated concentrations and total flow through the tank. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation given the different methods of simulating solute transport. The breakthrough curves show that simulated peak concentrations, even at very fine grid spacings, varied between the techniques because of different amounts of numerical dispersion. Sensitivity-analysis results revealed: (1) a high correlation between hydraulic conductivity and porosity given the concentration and flow observations used, so that both could not be estimated; and (2) that the breakthrough curve data did not provide enough information to estimate individual values of dispersivity for the five sands. This study demonstrates that the choice of assigned dispersivity and the amount of numerical dispersion present in the solution technique influence estimated hydraulic conductivity values to a surprising degree.
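The numerical-dispersion effect at the heart of this comparison can be illustrated with a one-dimensional explicit finite-difference solver (a generic sketch, not the sand-tank model): a first-order upwind treatment of advection contributes an artificial dispersion of roughly v*dx/2 on top of the assigned dispersion coefficient.

```python
# 1-D advection-dispersion, dC/dt + v dC/dx = D d2C/dx2, solved with explicit
# upwind advection and central-difference diffusion.  Illustrative only.
import numpy as np

L, nx, v, D, T = 1.0, 200, 1.0e-2, 1.0e-5, 40.0
dx = L / nx
dt = 0.4 * min(dx / v, dx**2 / (2 * D))          # respect advective and diffusive limits
c = np.zeros(nx); c[:20] = 1.0                   # sharp initial front

for _ in range(int(T / dt)):
    adv = -v * (c - np.roll(c, 1)) / dx          # first-order upwind
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    c = c + dt * (adv + dif)
    c[0], c[-1] = 1.0, c[-2]                     # inflow / free-outflow boundaries

print("numerical dispersion estimate ~", v * dx / 2, "vs assigned D =", D)
```

With these placeholder numbers the artificial dispersion (2.5e-5) exceeds the assigned D (1e-5), which is the kind of scheme-dependent smearing the breakthrough-curve comparison above is probing.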
Discrete-event simulation of a wide-area health care network.
McDaniel, J G
1995-01-01
OBJECTIVE: Predict the behavior and estimate the telecommunication cost of a wide-area message store-and-forward network for health care providers that uses the telephone system. DESIGN: A tool with which to perform large-scale discrete-event simulations was developed. Network models for star and mesh topologies were constructed to analyze the differences in performances and telecommunication costs. The distribution of nodes in the network models approximates the distribution of physicians, hospitals, medical labs, and insurers in the Province of Saskatchewan, Canada. Modeling parameters were based on measurements taken from a prototype telephone network and a survey conducted at two medical clinics. Simulation studies were conducted for both topologies. RESULTS: For either topology, the telecommunication cost of a network in Saskatchewan is projected to be less than $100 (Canadian) per month per node. The estimated telecommunication cost of the star topology is approximately half that of the mesh. Simulations predict that a mean end-to-end message delivery time of two hours or less is achievable at this cost. A doubling of the data volume results in an increase of less than 50% in the mean end-to-end message transfer time. CONCLUSION: The simulation models provided an estimate of network performance and telecommunication cost in a specific Canadian province. At the expected operating point, network performance appeared to be relatively insensitive to increases in data volume. Similar results might be anticipated in other rural states and provinces in North America where a telephone-based network is desired. PMID:7583646
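A minimal discrete-event sketch of a store-and-forward star network is shown below; all rates, polling intervals, and transfer times are invented placeholders rather than the surveyed Saskatchewan parameters, and only the clinic-to-hub hop is modeled.

```python
# Event-driven store-and-forward sketch: clinics queue messages, a central hub
# polls each clinic and collects its queue.  Placeholder parameters only.
import heapq, random

random.seed(3)
N_NODES, SIM_HOURS = 50, 24.0
POLL_INTERVAL = 1.0                 # hub polls each node hourly (assumption)
XFER_PER_MSG = 0.005                # hours to transfer one message (assumption)

events, delays = [], []             # priority queue of (time, kind, node)
for node in range(N_NODES):
    heapq.heappush(events, (random.expovariate(2.0), "msg", node))   # ~2 msgs/node/hour
    heapq.heappush(events, (random.uniform(0, POLL_INTERVAL), "poll", node))

queues = {n: [] for n in range(N_NODES)}
while events:
    t, kind, node = heapq.heappop(events)
    if t > SIM_HOURS:
        break
    if kind == "msg":               # a new message is queued at the clinic
        queues[node].append(t)
        heapq.heappush(events, (t + random.expovariate(2.0), "msg", node))
    else:                           # hub polls the node and collects its queue
        for created in queues[node]:
            delays.append(t + XFER_PER_MSG - created)
        queues[node] = []
        heapq.heappush(events, (t + POLL_INTERVAL, "poll", node))

print("messages delivered:", len(delays),
      " mean delivery delay: %.2f h" % (sum(delays) / len(delays)))
```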
NASA Astrophysics Data System (ADS)
Diaconescu, Emilia Paula; Mailhot, Alain; Brown, Ross; Chaumont, Diane
2018-03-01
This study focuses on the evaluation of daily precipitation and temperature climate indices and extremes simulated by an ensemble of 12 Regional Climate Model (RCM) simulations from the ARCTIC-CORDEX experiment with surface observations in the Canadian Arctic from the Adjusted Historical Canadian Climate Dataset. Five global reanalysis products (ERA-Interim, JRA55, MERRA, CFSR and GMFD) are also included in the evaluation to assess their potential for RCM evaluation in data-sparse regions. The study evaluated the means and annual anomaly distributions of indices over the 1980-2004 dataset overlap period. The results showed that RCM and reanalysis performance varied with the climate variables being evaluated. Most RCMs and reanalyses simulated climate indices related to mean air temperature and hot extremes well over most of the Canadian Arctic, with the exception of the Yukon region, where models displayed the largest biases related to topographic effects. Overall performance was generally poor for indices related to cold extremes. Likewise, only a few RCM simulations and reanalyses were able to provide realistic simulations of precipitation extreme indicators. The multi-reanalysis ensemble provided superior results to individual datasets for climate indicators related to mean air temperature and hot extremes, but not for other indicators. These results support the use of reanalyses as reference datasets for the evaluation of RCM mean air temperature and hot extremes over northern Canada, but not for cold extremes and precipitation indices.
An NCME Instructional Module on Subscores
ERIC Educational Resources Information Center
Sinharay, Sandip; Puhan, Gautam; Haberman, Shelby J.
2011-01-01
The purpose of this ITEMS module is to provide an introduction to subscores. First, examples of subscores from an operational test are provided. Then, a review of methods that can be used to examine if subscores have adequate psychometric quality is provided. It is demonstrated, using results from operational and simulated data, that subscores…
NASA Technical Reports Server (NTRS)
Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David; Kapernick, Richard
2007-01-01
Non-nuclear testing can be a valuable tool in the development of a space nuclear power system, providing system characterization data and allowing one to work through various fabrication, assembly and integration issues without the cost and time associated with a full ground nuclear test. In a non-nuclear test bed, electric heaters are used to simulate the heat from nuclear fuel. Testing with non-optimized heater elements allows one to assess thermal, heat-transfer, and stress-related attributes of a given system, but fails to demonstrate the dynamic response that would be present in an integrated, fueled reactor system. High fidelity thermal simulators that match both the static and the dynamic fuel pin performance that would be observed in an operating, fueled nuclear reactor can vastly increase the value of non-nuclear test results. With optimized simulators, the integration of thermal hydraulic hardware tests with simulated neutronic response provides a bridge between electrically heated testing and fueled nuclear testing. By implementing a neutronic response model to simulate the dynamic response that would be expected in a fueled reactor system, one can better understand system integration issues, characterize integrated system response times and response characteristics and assess potential design improvements at relatively small fiscal investment. Initial conceptual thermal simulator designs are determined by simple one-dimensional analysis at a single axial location and at steady state conditions; feasible concepts are then input into a detailed three-dimensional model for comparison to expected fuel pin performance. Static and dynamic fuel pin performance for a proposed reactor design is determined using SINDA/FLUINT thermal analysis software, and comparison is made between the expected nuclear performance and the performance of conceptual thermal simulator designs. Through a series of iterative analyses, a conceptual high-fidelity design is developed; this is followed by engineering design, fabrication, and testing to validate the overall design process. Test results presented in this paper correspond to a "first cut" simulator design for a potential liquid metal (NaK) cooled reactor design that could be applied for Lunar surface power. Proposed refinements to this simulator design are also presented.
A Process for Comparing Dynamics of Distributed Space Systems Simulations
NASA Technical Reports Server (NTRS)
Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.
2009-01-01
The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.
Mihaljevic, Susan E; Howard, Valerie M
2016-01-01
Improving resident safety and quality of care by maximizing interdisciplinary communication among long-term care providers is essential in meeting the goals of the United States' Federal Health care reform. The new Triple Aim goals focus on improved patient outcomes, increasing patient satisfaction, and decreased health care costs, thus providing consumers with quality, efficient patient-focused care. Within the United States, sepsis is the 10th leading cause of death with a 28.6% mortality rate in the elderly, increasing to 40% to 60% in septic shock. As a result of the Affordable Care Act, the Centers for Medicare & Medicaid services supported the Interventions to Reduce Acute Care Transfers 3.0 program to improve health care quality and prevent avoidable rehospitalization by improving assessment, documentation, and communication among health care providers. The Interventions to Reduce Acute Care Transfers 3.0 tools were incorporated in interprofessional sepsis simulations throughout 19 long-term care facilities to encourage the early recognition of sepsis symptoms and prompt communication of sepsis symptoms among interdisciplinary teams. As a result of this simulation training, many long-term care organizations have adopted the STOP and WATCH and SBAR tools as a venue to communicate resident condition changes.
Condor-COPASI: high-throughput computing for biochemical networks
2012-01-01
Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
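The split-and-merge pattern described above can be illustrated generically as follows, using Python's multiprocessing as a stand-in for a Condor pool; none of this uses the actual Condor or COPASI APIs, and the "model" is a placeholder.

```python
# Generic split-and-merge: many independent stochastic repeats are chunked,
# executed in parallel, and the results merged.  Illustrative only.
import multiprocessing as mp
import random

def run_chunk(args):
    seed, n_repeats = args
    rng = random.Random(seed)
    # Placeholder "model": each repeat returns one simulated steady-state value.
    return [rng.gauss(10.0, 2.0) for _ in range(n_repeats)]

if __name__ == "__main__":
    total_repeats, chunk_size = 10000, 500
    chunks = [(seed, chunk_size) for seed in range(total_repeats // chunk_size)]
    with mp.Pool() as pool:
        results = [x for chunk in pool.map(run_chunk, chunks) for x in chunk]
    print(len(results), "repeats, mean =", sum(results) / len(results))
```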
Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diakov, Victor; Cole, Wesley; Sullivan, Patrick
2015-11-01
Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating long-term capacity expansion possibilities, it is important to maintain the viability of power system operation on short-term (daily, hourly, and sub-hourly) scales. Production-cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission, and generation fleet data, minimizing production costs while meeting reliability requirements. When based on CEM 'predictions' about generating unit retirements and buildup, PCM provide more detailed simulation of short-term system operation and, consequently, may confirm the validity of the capacity expansion predictions. Further, production cost model simulations of a system based on a capacity expansion model solution are 'evolutionarily' sound: the generator mix is the result of a logical sequence of unit retirements and buildup resulting from policy and incentives. This has motivated us to bridge CEM with PCM by building a capacity expansion-to-production cost model Linking Tool (CEPCoLT). The Linking Tool maps capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally defined ReEDS scenarios.
Wind Shear/Turbulence Inputs to Flight Simulation and Systems Certification
NASA Technical Reports Server (NTRS)
Bowles, Roland L. (Editor); Frost, Walter (Editor)
1987-01-01
The purpose of the workshop was to provide a forum for industry, universities, and government to assess current status and likely future requirements for application of flight simulators to aviation safety concerns and system certification issues associated with wind shear and atmospheric turbulence. Research findings presented included characterization of wind shear and turbulence hazards based on modeling efforts and quantitative results obtained from field measurement programs. Future research thrusts needed to maximally exploit flight simulators for aviation safety application involving wind shear and turbulence were identified. The conference contained sessions on: Existing wind shear data and simulator implementation initiatives; Invited papers regarding wind shear and turbulence simulation requirements; and Committee working session reports.
The Application of Neutron Transport Green's Functions to Threat Scenario Simulation
NASA Astrophysics Data System (ADS)
Thoreson, Gregory G.; Schneider, Erich A.; Armstrong, Hirotatsu; van der Hoeven, Christopher A.
2015-02-01
Radiation detectors provide deterrence and defense against nuclear smuggling attempts by scanning vehicles, ships, and pedestrians for radioactive material. Understanding detector performance is crucial to developing novel technologies, architectures, and alarm algorithms. Detection can be modeled through radiation transport simulations; however, modeling a spanning set of threat scenarios over the full transport phase-space is computationally challenging. Previous research has demonstrated Green's functions can simulate photon detector signals by decomposing the scenario space into independently simulated submodels. This paper presents decomposition methods for neutron and time-dependent transport. As a result, neutron detector signals produced from full forward transport simulations can be efficiently reconstructed by sequential application of submodel response functions.
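A toy version of the response-function decomposition described above: each submodel (e.g., cargo, shielding, detector) is represented by a precomputed response over discrete energy groups, and the detector signal is reconstructed by applying the responses in sequence. All matrices and numbers below are invented placeholders, not results of actual transport runs.

```python
# Sequential application of submodel response functions over energy groups.
import numpy as np

rng = np.random.default_rng(4)
n_groups = 8
source = np.zeros(n_groups); source[-1] = 1.0e6        # monoenergetic neutron source

# Placeholder submodel responses (rows: outgoing group, cols: incoming group);
# upper-triangular so particles only scatter down in energy.
cargo  = np.triu(rng.random((n_groups, n_groups)) * 0.2) + np.eye(n_groups) * 0.5
shield = np.triu(rng.random((n_groups, n_groups)) * 0.1) + np.eye(n_groups) * 0.3
detector_eff = rng.random(n_groups) * 0.05             # group-wise detection efficiency

flux_at_detector = shield @ (cargo @ source)           # apply responses in sequence
counts = detector_eff @ flux_at_detector
print("predicted detector counts: %.3e" % counts)
```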
Simulating Terrestrial Gamma-ray Flashes using SWORD (Invited)
NASA Astrophysics Data System (ADS)
Gwon, C.; Grove, J.; Dwyer, J. R.; Mattson, K.; Polaski, D.; Jackson, L.
2013-12-01
We report on simulations of the relativistic feedback discharges involved in the production of terrestrial gamma-ray flashes (TGFs). The simulations were conducted with Geant4 using the SoftWare for the Optimization of Radiation Detectors (SWORD) framework. SWORD provides a graphical interface for setting up simulations in select high-energy radiation transport engines. Using Geant4, we determine avalanche length, the energy spectrum of the electrons and gamma-rays as they leave the field region, and the feedback factor describing the degree to which the production of energetic particles is self-sustaining. We validate our simulations against previous work in order to determine the reliability of our results. This work is funded by the Office of Naval Research.
NASA Astrophysics Data System (ADS)
Ji, Pengfei; Zhang, Yuwen
2016-03-01
Based on ab initio quantum mechanics (QM) calculations, the obtained electron heat capacity is implemented in the energy equation of the electron subsystem in the two-temperature model (TTM). Upon laser irradiation of the copper film, energy transfer from the electron subsystem to the lattice subsystem is modeled by including the electron-phonon coupling factor in the coupled molecular dynamics (MD) and TTM simulation. The results show differences in temperature and thermal melting between the QM-MD-TTM integrated simulation and the pure MD-TTM coupled simulation. The successful construction of the QM-MD-TTM integrated simulation provides a general approach that is applicable to other metals under laser heating.
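For orientation, a zero-dimensional two-temperature model with constant, order-of-magnitude coefficients (not the ab initio electron heat capacity or the MD coupling of the paper) looks like this:

```python
# 0-D two-temperature model: electron and lattice subsystems coupled by an
# electron-phonon coupling factor G, driven by a Gaussian laser source term.
# Coefficients are typical order-of-magnitude values for copper, not QM-derived.
import numpy as np

G   = 1.0e17        # electron-phonon coupling [W m^-3 K^-1]
Ce0 = 97.0          # electron heat capacity coefficient: C_e = Ce0 * T_e [J m^-3 K^-2]
Cl  = 3.4e6         # lattice heat capacity [J m^-3 K^-1]
S0, t0, tau = 4.0e21, 0.5e-12, 0.1e-12     # laser source peak, center, width

Te, Tl, dt = 300.0, 300.0, 1.0e-16
for step in range(60000):                  # 6 ps of explicit Euler integration
    t = step * dt
    S = S0 * np.exp(-((t - t0) / tau) ** 2)
    dTe = (-G * (Te - Tl) + S) / (Ce0 * Te)
    dTl = ( G * (Te - Tl)) / Cl
    Te += dt * dTe
    Tl += dt * dTl
print("after 6 ps: T_e = %.0f K, T_l = %.0f K" % (Te, Tl))
```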
LOS selective fading and AN/FRC-170(V) radio hybrid computer simulation phase A report
NASA Astrophysics Data System (ADS)
Klukis, M. K.; Lyon, T. I.; Walker, R.
1981-09-01
This report documents results of the first phase of modeling, simulation and study of the dual diversity AN/FRC-170(V) radio and frequency selective fading line of sight channel. Both hybrid computer and circuit technologies were used to develop a fast, accurate and flexible simulation tool to investigate changes and proposed improvements to the design of the AN/FRC-170(V) radio. In addition to the simulation study, a remote hybrid computer terminal was provided to DCEC for interactive study of the modeled radio and channel. Simulated performance of the radio for Rayleigh and line-of-sight two-ray channels, and for additive noise, is included in the report.
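The frequency-selective character of a two-ray line-of-sight channel can be sketched directly from its transfer function H(f) = 1 + a*exp(-j2*pi*f*tau); the parameters below are generic placeholders, not those of the AN/FRC-170(V) model.

```python
# Two-ray channel frequency response: direct ray plus one delayed, attenuated
# reflection produces periodic notches across the radio bandwidth.
import numpy as np

a, tau = 0.8, 20e-9                  # reflection amplitude and excess delay [s]
f = np.linspace(-35e6, 35e6, 1401)   # offset from carrier, +/-35 MHz
H = 1.0 + a * np.exp(-2j * np.pi * f * tau)
H_dB = 20 * np.log10(np.abs(H))

print("notch spacing = 1/tau = %.1f MHz" % (1e-6 / tau))
print("deepest fade in band: %.1f dB" % H_dB.min())
```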
Reliability of regional climate simulations
NASA Astrophysics Data System (ADS)
Ahrens, W.; Block, A.; Böhm, U.; Hauffe, D.; Keuler, K.; Kücken, M.; Nocke, Th.
2003-04-01
Quantification of uncertainty is becoming more and more a key issue for assessing the trustworthiness of future climate scenarios. In addition to the mean conditions, climate impact modelers focus in particular on extremes. Before generating such scenarios using e.g. dynamic regional climate models, a careful validation of present-day simulations should be performed to determine the range of errors for the quantities of interest under recent conditions as a raw estimate of their uncertainty in the future. Often, multiple aspects shall be covered together, and the required simulation accuracy depends on the user's demand. In our approach, a massively parallel regional climate model shall be used on the one hand to generate "long-term" high-resolution climate scenarios for several decades, and on the other hand to provide very high-resolution ensemble simulations of future dry spells or heavy rainfall events. To diagnose the model's performance for present-day simulations, we have recently developed and tested a first version of a validation and visualization chain for this model. It is, however, applicable in a much more general sense and could be used as a common test bed for any regional climate model aiming at this type of simulations. Depending on the user's interest, integrated quality measures can be derived for near-surface parameters using multivariate techniques and multidimensional distance measures in a first step. At this point, advanced visualization techniques have been developed and included to allow for visual data mining and to qualitatively identify dominating aspects and regularities. Univariate techniques that are especially designed to assess climatic aspects in terms of statistical properties can then be used to quantitatively diagnose the error contributions of the individual parameters used. Finally, a comprehensive in-depth diagnosis tool allows one to investigate why the model produces the obtained near-surface results, answering the question of whether the model performs well from the modeler's point of view. Examples will be presented for results obtained using this approach for assessing the risk of potential total agricultural yield loss under drought conditions in Northeast Brazil and for evaluating simulation results for a 10-year period for Europe. To support multi-run simulations and result evaluation, the model will be embedded in one of the next steps into an already existing simulation environment that provides further postprocessing tools for sensitivity studies, behavioral analysis and Monte-Carlo simulations, but also for ensemble scenario analysis.
Glick, Joshua; Lehman, Erik; Terndrup, Thomas
2014-03-01
Coordination of the tasks of performing chest compressions and defibrillation can lead to communication challenges that may prolong time spent off the chest. The purpose of this study was to determine whether defibrillation provided by the provider performing chest compressions led to a decrease in peri-shock pauses as compared to defibrillation administered by a second provider, in a simulated cardiac arrest scenario. This was a randomized, controlled study measuring pauses in chest compressions for defibrillation in a simulated cardiac arrest model. We approached hospital providers with current CPR certification for participation between July, 2011 and October, 2011. Volunteers were randomized to control (facilitator-administered defibrillation) or experimental (compressor-administered defibrillation) groups. All participants completed one minute of chest compressions on a mannequin in a shockable rhythm prior to administration of defibrillation. We measured and compared pauses for defibrillation in both groups. Out of 200 total participants, we analyzed data from 197 defibrillations. Compressor-initiated defibrillation resulted in a significantly lower pre-shock hands-off time (0.57 s; 95% CI: 0.47-0.67) compared to facilitator-initiated defibrillation (1.49 s; 95% CI: 1.35-1.64). Furthermore, compressor-initiated defibrillation resulted in a significantly lower peri-shock hands-off time (2.77 s; 95% CI: 2.58-2.95) compared to facilitator-initiated defibrillation (4.25 s; 95% CI: 4.08-4.43). Assigning the responsibility for shock delivery to the provider performing compressions encourages continuous compressions throughout the charging period and decreases total time spent off the chest. However, as this was a simulation-based study, clinical implementation is necessary to further evaluate these potential benefits.
Anesthetics mechanism on a DMPC lipid membrane model: Insights from molecular dynamics simulations.
Saeedi, Marzieh; Lyubartsev, Alexander P; Jalili, Seifollah
2017-07-01
To provide insight into the molecular mechanisms of local anesthetic action, we have carried out an extensive investigation of two amide-type local anesthetics, lidocaine and articaine, in both charged and uncharged forms, interacting with a DMPC lipid membrane. We have applied both standard molecular dynamics simulations and metadynamics simulations to provide a detailed description of the free energy landscape of anesthetics embedded in the lipid bilayer. The global minimum of the free energy surface (the equilibrium position of anesthetics in the lipid membrane) occurred around 1 nm from the bilayer center. The uncharged anesthetics show more affinity to bind to this region compared to the charged drugs. The binding free energy of uncharged lidocaine in the membrane (-30.3 kJ/mol) is more favorable than that of uncharged articaine (-24.0 kJ/mol), in good agreement with the higher lipid solubility of lidocaine relative to articaine. The octanol/water partition coefficient of the uncharged drugs was also investigated using expanded ensemble simulations. In addition, complementary standard MD simulations were carried out to study the partitioning behavior of multiple anesthetics inside the lipid bilayer. The results obtained here are in line with previously reported simulations and suggest that the different forms of anesthetics induce different structural modifications in the lipid bilayer, which can provide new insights into their complex membrane translocation phenomena. Copyright © 2017 Elsevier B.V. All rights reserved.
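As a back-of-the-envelope check, the quoted binding free energies can be converted into equilibrium partition constants under a simple two-state partitioning assumption, K = exp(-dG/RT), at body temperature:

```python
# Convert the quoted membrane binding free energies to equilibrium partition
# constants, assuming a simple two-state water/membrane partitioning model.
import math

R, T = 8.314e-3, 310.0           # kJ mol^-1 K^-1, K
dG = {"lidocaine (uncharged)": -30.3, "articaine (uncharged)": -24.0}   # kJ/mol

for drug, g in dG.items():
    K = math.exp(-g / (R * T))
    print(f"{drug}: dG = {g:6.1f} kJ/mol  ->  K ~ {K:.1e}")
# The ~6 kJ/mol difference corresponds to roughly an order of magnitude larger
# partition constant for lidocaine, consistent with its higher lipid solubility.
```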
NASA Astrophysics Data System (ADS)
Gonczi, Amanda L.; Chiu, Jennifer L.; Maeng, Jennifer L.; Bell, Randy L.
2016-07-01
This investigation sought to identify patterns in elementary science teachers' computer simulation use, particularly implementation structures and instructional supports commonly employed by teachers. Data included video-recorded science lessons of 96 elementary teachers who used computer simulations in one or more science lessons. Results indicated teachers used a one-to-one student-to-computer ratio most often either during class-wide individual computer use or during a rotating station structure. Worksheets, general support, and peer collaboration were the most common forms of instructional support. The least common instructional support forms included lesson pacing, initial play, and a closure discussion. Students' simulation use was supported in the fewest ways during a rotating station structure. Results suggest that simulation professional development with elementary teachers needs to explicitly focus on implementation structures and instructional support to enhance participants' pedagogical knowledge and improve instructional simulation use. In addition, research is needed to provide theoretical explanations for the observed patterns that should subsequently be addressed in supporting teachers' instructional simulation use during professional development or in teacher preparation programs.
Spectrum simulation in DTSA-II.
Ritchie, Nicholas W M
2009-10-01
Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy-to-use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.
Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design
NASA Technical Reports Server (NTRS)
Schutte, Paul C.; Trujillo, Anna; Pritchett, Amy R.
2000-01-01
While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plug-in' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design
NASA Technical Reports Server (NTRS)
Pritchett, Amy R.
2002-01-01
While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
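The plug-in pattern these two reports describe can be sketched minimally as a component interface, a registry, and a core loop that steps components and routes shared data; the class and method names below are illustrative only and do not come from the actual RFS code.

```python
# Minimal plug-in simulation pattern: a common component interface, a registry
# that initializes components, and a core loop that steps them over a shared bus.
from typing import Dict

class SimComponent:
    def initialize(self, bus: Dict[str, float]) -> None: ...
    def step(self, dt: float, bus: Dict[str, float]) -> None: ...

class Atmosphere(SimComponent):
    def step(self, dt, bus):
        # crude density falloff with altitude (placeholder model)
        bus["air_density"] = 1.225 * (1.0 - 2.2e-5 * bus.get("altitude", 0.0))

class PointMassAircraft(SimComponent):
    def initialize(self, bus):
        bus.update(altitude=1000.0, vertical_speed=-2.0)
    def step(self, dt, bus):
        bus["altitude"] += bus["vertical_speed"] * dt

class SimulatorCore:
    def __init__(self):
        self.components, self.bus = [], {}
    def register(self, component: SimComponent):
        self.components.append(component)
        component.initialize(self.bus)
    def run(self, dt: float, steps: int):
        for _ in range(steps):
            for c in self.components:
                c.step(dt, self.bus)

core = SimulatorCore()
core.register(PointMassAircraft())
core.register(Atmosphere())
core.run(dt=0.1, steps=100)
print(core.bus)
```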
Mental simulation of routes during navigation involves adaptive temporal compression
Arnold, Aiden E.G.F.; Iaria, Giuseppe; Ekstrom, Arne D.
2016-01-01
Mental simulation is a hallmark feature of human cognition, allowing features from memories to be flexibly used during prospection. While past studies demonstrate the preservation of real-world features such as size and distance during mental simulation, their temporal dynamics remains unknown. Here, we compare mental simulations to navigation of routes in a large-scale spatial environment to test the hypothesis that such simulations are temporally compressed in an adaptive manner. Our results show that simulations occurred at 2.39x the speed it took to navigate a route, increasing in compression (3.57x) for slower movement speeds. Participant self-reports of vividness and spatial coherence of simulations also correlated strongly with simulation duration, providing an important link between subjective experiences of simulated events and how spatial representations are combined during prospection. These findings suggest that simulation of spatial events involve adaptive temporal mechanisms, mediated partly by the fidelity of memories used to generate the simulation. PMID:27568586
Durham extremely large telescope adaptive optics simulation platform.
Basden, Alastair; Butterley, Timothy; Myers, Richard; Wilson, Richard
2007-03-01
Adaptive optics systems are essential on all large telescopes for which image quality is important. These are complex systems with many design parameters requiring optimization before good performance can be achieved. The simulation of adaptive optics systems is therefore necessary to categorize the expected performance. We describe an adaptive optics simulation platform, developed at Durham University, which can be used to simulate adaptive optics systems on the largest proposed future extremely large telescopes as well as on current systems. This platform is modular, object oriented, and has the benefit of hardware application acceleration that can be used to improve the simulation performance, essential for ensuring that the run time of a given simulation is acceptable. The simulation platform described here can be highly parallelized using parallelization techniques suited for adaptive optics simulation, while still offering the user complete control while the simulation is running. The results from the simulation of a ground layer adaptive optics system are provided as an example to demonstrate the flexibility of this simulation platform.
Open-source framework for power system transmission and distribution dynamics co-simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Renke; Fan, Rui; Daily, Jeff
The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open source co-simulation framework “Framework for Network Co-Simulation” (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Setiani, Tia Dwi, E-mail: tiadwisetiani@gmail.com; Suprijadi; Nuclear Physics and Biophysics Research Division, Faculty of Mathematics and Natural Sciences, Institut Teknologi Bandung, Jalan Ganesha 10, Bandung 40132
Monte Carlo (MC) is one of the most powerful techniques for simulation in x-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to model radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andreu Badal. This study investigated the computation time of x-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and the comparison of image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial, and on two GPUs with 384 and 2304 cores. In the GPU simulations, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that simulation times on the GPU were significantly shorter than on the CPU. Simulations on the 2304-core GPU ran about 64-114 times faster than on the CPU, while simulations on the 384-core GPU ran about 20-31 times faster than on a single CPU core. Optimum image quality was obtained with at least 10^8 photon histories and photon energies from 60 keV to 90 keV. Statistical analysis shows that the quality of the GPU and CPU images is essentially the same.
Evaluation of the Inertial Response of Variable-Speed Wind Turbines Using Advanced Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholbrock, Andrew K; Muljadi, Eduard; Gevorgian, Vahan
In this paper, we focus on the temporary frequency support provided by wind turbine generators (WTGs) through their inertial response. With the implemented inertial control methods, the WTG can increase its active power output by releasing part of its stored kinetic energy when a frequency excursion occurs. The active power can be boosted temporarily above the maximum power point, but the rotor then decelerates, and an active power deficit occurs while the rotor kinetic energy is restored. We evaluate and compare the inertial response induced by two distinct inertial control methods using advanced simulation. In the first stage, the proposed inertial control methods are analyzed in offline simulation. Using an advanced wind turbine simulation program, FAST with TurbSim, the response of the studied wind turbine is comprehensively evaluated under turbulent wind conditions, and the impact on the turbine's mechanical components is assessed. In the second stage, the inertial control is deployed on a real 600-kW wind turbine, the three-bladed Controls Advanced Research Turbine (CART3), which further verifies the inertial control through hardware-in-the-loop (HIL) simulation. Various inertial control methods can be effectively evaluated with the proposed two-stage simulation platform, which combines offline simulation and real-time HIL simulation. The simulation results also provide insight into designing inertial control for WTGs.
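A simplified single-mass rotor model, sketched below, illustrates the mechanism described above: raising electrical power above the mechanical input drains rotor kinetic energy and slows the rotor, followed by a power deficit during recovery. The inertia, speeds, and power levels are placeholder assumptions, not CART3 or FAST model parameters.

```python
# Simplified inertial-response sketch (hypothetical single-mass model, not the
# controllers evaluated in the paper): during a frequency event the turbine
# temporarily raises electrical power above the mechanical input, so the rotor
# decelerates as kinetic energy E = 0.5 * J * w^2 is released.
J = 4.0e6          # rotor inertia (kg m^2), assumed
w = 1.8            # rotor speed (rad/s), assumed
p_mech = 1.5e6     # aerodynamic input power (W), held constant for simplicity
p_boost = 0.3e6    # temporary electrical power boost (W), assumed
dt = 0.1

for k in range(100):                     # 10 s event window
    t = k * dt
    p_elec = p_mech + (p_boost if t < 5.0 else -0.5 * p_boost)  # boost, then recovery deficit
    torque_imbalance = (p_mech - p_elec) / w
    w += (torque_imbalance / J) * dt     # torque balance: J * dw/dt = (P_mech - P_elec) / w
    if k % 20 == 0:
        print(f"t={t:4.1f}s  rotor speed={w:.4f} rad/s  P_elec={p_elec/1e6:.2f} MW")
```

The printout shows the rotor slowing during the boost and recovering while electrical output is held below the mechanical input, which is the trade-off the two-stage evaluation above is designed to quantify.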
Conducting multicenter research in healthcare simulation: Lessons learned from the INSPIRE network.
Cheng, Adam; Kessler, David; Mackinnon, Ralph; Chang, Todd P; Nadkarni, Vinay M; Hunt, Elizabeth A; Duval-Arnould, Jordan; Lin, Yiqun; Pusic, Martin; Auerbach, Marc
2017-01-01
Simulation-based research has grown substantially over the past two decades; however, relatively few published simulation studies are multicenter in nature. Multicenter research confers many distinct advantages over single-center studies, including larger sample sizes for more generalizable findings, sharing resources amongst collaborative sites, and promoting networking. Well-executed multicenter studies are more likely to improve provider performance and/or have a positive impact on patient outcomes. In this manuscript, we offer a step-by-step guide to conducting multicenter, simulation-based research based upon our collective experience with the International Network for Simulation-based Pediatric Innovation, Research and Education (INSPIRE). Like multicenter clinical research, simulation-based multicenter research can be divided into four distinct phases. Each phase has specific differences when applied to simulation research: (1) Planning phase, to define the research question, systematically review the literature, identify outcome measures, and conduct pilot studies to ensure feasibility and estimate power; (2) Project Development phase, when the primary investigator identifies collaborators, develops the protocol and research operations manual, prepares grant applications, obtains ethical approval and executes subsite contracts, registers the study in a clinical trial registry, forms a manuscript oversight committee, and conducts feasibility testing and data validation at each site; (3) Study Execution phase, involving recruitment and enrollment of subjects, clear communication and decision-making, quality assurance measures, and data abstraction, validation, and analysis; and (4) Dissemination phase, where the research team shares results via conference presentations, publications, traditional media, and social media, and implements strategies for translating results to practice. With this manuscript, we provide a guide to conducting quantitative multicenter research with a focus on simulation-specific issues.
Small Wind Research Turbine: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corbus, D.; Meadors, M.
2005-10-01
The Small Wind Research Turbine (SWRT) project was initiated to provide reliable test data for model validation of furling wind turbines and to help understand small wind turbine loads. This report will familiarize the user with the scope of the SWRT test and support the use of these data. In addition to describing all the testing details and results, the report presents an analysis of the test data and compares the SWRT test data to simulation results from the FAST aeroelastic simulation model.
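Model validation of this kind typically reduces to aligning the measured and simulated time series and computing simple error metrics. The sketch below uses synthetic stand-in data and arbitrary signal shapes, not the actual SWRT measurements or FAST output.

```python
# Sketch of a measured-vs-simulated comparison for model validation
# (hypothetical data, not the SWRT data set or FAST results).
import numpy as np

t = np.arange(0.0, 60.0, 0.05)                       # 20 Hz, 1-minute record (assumed)
measured = 5.0 + 0.8 * np.sin(0.5 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
simulated = 5.1 + 0.75 * np.sin(0.5 * t + 0.05)      # stand-in for simulation output

bias = np.mean(simulated - measured)                  # systematic offset
rmse = np.sqrt(np.mean((simulated - measured) ** 2))  # overall error
corr = np.corrcoef(measured, simulated)[0, 1]         # tracking of the dynamics
print(f"bias={bias:.3f}  RMSE={rmse:.3f}  r={corr:.3f}")
```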
Baseline performance of solar collectors for NASA Langley solar building test facility
NASA Technical Reports Server (NTRS)
Knoll, R. H.; Johnson, S. M.
1977-01-01
The solar collector field contains seven collector designs. Before operation in the field, the experimental performances (thermal efficiencies) of the seven collector designs were measured in an indoor solar simulator. The resulting data provided a baseline for later comparison with actual field test data. The simulator test results are presented for the collectors as received, and after several weeks of outdoor exposure with no coolant (dry operation). Six of the seven collector designs tested showed substantial reductions in thermal efficiency after dry operation.
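Instantaneous collector thermal efficiency is conventionally computed from an energy balance, eta = m_dot * c_p * (T_out - T_in) / (G * A). The sketch below applies that standard definition with placeholder numbers; none of the values are Langley test results.

```python
# Hedged sketch of a standard collector thermal-efficiency calculation
# (energy-balance definition; all numbers are illustrative assumptions).
m_dot = 0.02      # coolant mass flow rate (kg/s), assumed
c_p = 4186.0      # specific heat of water (J/kg/K)
t_in, t_out = 40.0, 55.0   # inlet/outlet temperatures (deg C), assumed
G = 950.0         # irradiance on the collector plane (W/m^2), assumed
A = 2.0           # collector aperture area (m^2), assumed

useful_gain = m_dot * c_p * (t_out - t_in)   # W absorbed by the coolant
eta = useful_gain / (G * A)                  # fraction of incident solar power collected
print(f"useful gain = {useful_gain:.0f} W, thermal efficiency = {eta:.2f}")
```

Comparing this efficiency before and after dry operation is the kind of calculation behind the reported degradation in six of the seven designs.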