Sample records for simulated high level

  1. Vectorized algorithms for spiking neural network simulation.

    PubMed

    Brette, Romain; Goodman, Dan F M

    2011-06-01

    High-level languages (Matlab, Python) are popular in neuroscience because they are flexible and accelerate development. However, for simulating spiking neural networks, the cost of interpretation is a bottleneck. We describe a set of algorithms to simulate large spiking neural networks efficiently with high-level languages using vector-based operations. These algorithms constitute the core of Brian, a spiking neural network simulator written in the Python language. Vectorized simulation makes it possible to combine the flexibility of high-level languages with the computational efficiency usually associated with compiled languages.
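
    The vectorization idea described above can be sketched in a few lines of NumPy: update the state of all neurons with whole-array operations instead of a per-neuron Python loop. This is only an illustrative sketch, not Brian's actual implementation; the model (leaky integrate-and-fire), parameter names, and values are invented for the example.

```python
import numpy as np

# Illustrative sketch only (not Brian's code): a vectorized leaky
# integrate-and-fire update in the spirit of the abstract above.
# All parameter names and values are invented for this example.
N = 10_000                      # number of neurons
dt = 0.1e-3                     # time step [s]
tau = 20e-3                     # membrane time constant [s]
v_rest, v_reset, v_th = -70e-3, -60e-3, -50e-3

v = np.full(N, v_rest)                      # membrane potential of every neuron
drive = np.random.uniform(0.0, 2.0, N)      # per-neuron input drive [V/s]
spike_counts = np.zeros(N, dtype=int)

for _ in range(10_000):                     # 1 s of simulated time
    # a single array expression advances all N neurons at once
    v += dt * ((v_rest - v) / tau + drive)
    fired = v >= v_th                       # boolean mask of spiking neurons
    spike_counts[fired] += 1
    v[fired] = v_reset                      # vectorized reset

print("mean firing rate [Hz]:", spike_counts.mean())
```

    The key design choice is that the state update, thresholding, and reset are all whole-array operations, so the per-step cost is dominated by compiled array kernels rather than interpreter overhead.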

  2. Low Thrust Orbital Maneuvers Using Ion Propulsion

    NASA Astrophysics Data System (ADS)

    Ramesh, Eric

    2011-10-01

Low-thrust maneuver options, such as electric propulsion, offer specific challenges within mission-level Modeling, Simulation, and Analysis (MS&A) tools. This project seeks to transition techniques for simulating low-thrust maneuvers from detailed engineering-level simulations such as AGI's Satellite ToolKit (STK) Astrogator to mission-level simulations such as the System Effectiveness Analysis Simulation (SEAS). Our project goals are as follows: A) Assess different low-thrust options to achieve various orbital changes; B) Compare such approaches to more conventional, high-thrust profiles; C) Compare computational cost and accuracy of various approaches to calculate and simulate low-thrust maneuvers; D) Recommend methods for implementing low-thrust maneuvers in high-level mission simulations; E) Prototype recommended solutions.

  3. Testing the high turbulence level breakdown of low-frequency gyrokinetics against high-frequency cyclokinetic simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Zhao, E-mail: zhao.deng@foxmail.com; Waltz, R. E.

    2015-05-15

This paper presents numerical simulations of the nonlinear cyclokinetic equations in the cyclotron harmonic representation [R. E. Waltz and Zhao Deng, Phys. Plasmas 20, 012507 (2013)]. Simulations are done with a local flux-tube geometry and with the parallel motion and variation suppressed using a newly developed rCYCLO code. Cyclokinetic simulations dynamically follow the high-frequency ion gyro-phase motion, which is nonlinearly coupled into the low-frequency drift-waves, possibly interrupting and suppressing gyro-averaging and increasing the transport over gyrokinetic levels. By comparing the more fundamental cyclokinetic simulations with the corresponding gyrokinetic simulations, the breakdown of gyrokinetics at high turbulence levels is quantitatively tested over a range of relative ion cyclotron frequency 10 < Ω* < 100, where Ω* = 1/ρ* and ρ* is the relative ion gyroradius. The gyrokinetic linear mode rates closely match the cyclokinetic low-frequency rates for Ω* > 5. Gyrokinetic transport recovers cyclokinetic transport at high relative ion cyclotron frequency (Ω* ≥ 50) and low turbulence level, as required. Cyclokinetic transport is found to be lower than gyrokinetic transport at high turbulence levels and low Ω* values with stable ion cyclotron (IC) modes. The gyrokinetic approximation is found to break down when the density perturbations exceed 20%. For cyclokinetic simulations with sufficiently unstable IC modes and sufficiently low Ω* ∼ 10, the high-frequency component of the cyclokinetic transport level can exceed the gyrokinetic transport level. However, the low-frequency component of the cyclokinetic transport and turbulence level does not exceed that of gyrokinetics. At higher and more physically relevant Ω* ≥ 50 values and physically realistic IC driving rates, the low-frequency component of the cyclokinetic transport and turbulence level is still smaller than that of gyrokinetics. Thus, the cyclokinetic simulations do not account for the so-called “L-mode near edge short fall” seen in some low-frequency gyrokinetic transport and turbulence simulations.

  4. Evaluation of high-level clouds in cloud resolving model simulations with ARM and KWAJEX observations

    DOE PAGES

    Liu, Zheng; Muhlbauer, Andreas; Ackerman, Thomas

    2015-11-05

In this paper, we evaluate high-level clouds in a cloud resolving model during two convective cases, ARM9707 and KWAJEX. The simulated joint histograms of cloud occurrence and radar reflectivity compare well with cloud radar and satellite observations when using a two-moment microphysics scheme. However, simulations performed with a single-moment microphysics scheme exhibit low biases of approximately 20 dB. During convective events, the two-moment microphysics overestimates the amount of high-level cloud, while the one-moment microphysics precipitates too readily and underestimates the amount and height of high-level cloud. For ARM9707, persistent large positive biases in high-level cloud are found, which are not sensitive to changes in ice particle fall velocity and ice nuclei number concentration in the two-moment microphysics. These biases are caused by biases in large-scale forcing and maintained by the periodic lateral boundary conditions. The combined effects include significant biases in high-level cloud amount, radiation, and high sensitivity of cloud amount to nudging time scale in both convective cases. The high sensitivity of high-level cloud amount to the thermodynamic nudging time scale suggests that thermodynamic nudging can be a powerful "tuning" parameter for the simulated cloud and radiation but should be applied with caution. The role of the periodic lateral boundary conditions in reinforcing the biases in cloud and radiation suggests that reducing the uncertainty in the large-scale forcing at high levels is important for similar convective cases and has far-reaching implications for simulating high-level clouds in super-parameterized global climate models such as the multiscale modeling framework.

  5. SEMICONDUCTOR INTEGRATED CIRCUITS: A quasi-3-dimensional simulation method for a high-voltage level-shifting circuit structure

    NASA Astrophysics Data System (ADS)

    Jizhi, Liu; Xingbi, Chen

    2009-12-01

A new quasi-three-dimensional (quasi-3D) numerical simulation method for a high-voltage level-shifting circuit structure is proposed. The performances of the 3D structure are analyzed by combining several 2D device structures; the 2D devices lie in two planes perpendicular to each other and to the surface of the semiconductor. In comparison with Davinci, a full 3D device simulation tool, the quasi-3D simulation method can give results for the potential and current distribution of the 3D high-voltage level-shifting circuit structure with appropriate accuracy, and the total CPU time for simulation is significantly reduced. The quasi-3D simulation technique can be used in many cases, with advantages such as saving computing time, making no demands on high-end computer hardware, and being easy to operate.

  6. YASS: A System Simulator for Operating System and Computer Architecture Teaching and Learning

    ERIC Educational Resources Information Center

    Mustafa, Besim

    2013-01-01

    A highly interactive, integrated and multi-level simulator has been developed specifically to support both the teachers and the learners of modern computer technologies at undergraduate level. The simulator provides a highly visual and user configurable environment with many pedagogical features aimed at facilitating deep understanding of concepts…

  7. Simulation of fiber optic liquid level sensor demodulation system

    NASA Astrophysics Data System (ADS)

    Yi, Cong-qin; Luo, Yun; Zhang, Zheng-ping

Measuring liquid level with high accuracy is an urgent requirement. This paper mainly focuses on the demodulation system of a fiber-optic liquid level sensor based on a Fabry-Perot cavity; the demodulation system is designed and simulated using single-chip simulation software.

  8. Simulations Build Efficacy: Empirical Results from a Four-Week Congressional Simulation

    ERIC Educational Resources Information Center

    Mariani, Mack; Glenn, Brian J.

    2014-01-01

    This article describes a four-week congressional committee simulation implemented in upper level courses on Congress and the Legislative process at two liberal arts colleges. We find that the students participating in the simulation possessed high levels of political knowledge and confidence in their political skills prior to the simulation. An…

  9. High fidelity simulations of infrared imagery with animated characters

    NASA Astrophysics Data System (ADS)

    Näsström, F.; Persson, A.; Bergström, D.; Berggren, J.; Hedström, J.; Allvar, J.; Karlsson, M.

    2012-06-01

High fidelity simulations of IR signatures and imagery tend to be slow and do not have effective support for animation of characters. Simplified rendering methods based on computer graphics methods can be used to overcome these limitations. This paper presents a method to combine these tools and produce simulated high fidelity thermal IR data of animated people in terrain. Infrared signatures for human characters have been calculated using RadThermIR. To handle multiple character models, these calculations use a simplified material model for the anatomy and clothing. Weather and temperature conditions match the IR-texture used in the terrain model. The calculated signatures are applied to the animated 3D characters that, together with the terrain model, are used to produce high fidelity IR imagery of people or crowds. For high-level animation control and crowd simulations, HLAS (High Level Animation System) has been developed. There are tools available to create and visualize skeleton-based animations, but tools that allow control of the animated characters on a higher level, e.g. for crowd simulation, are usually expensive and closed source. We need the flexibility of HLAS to add animation into an HLA-enabled sensor system simulation framework.

  10. A simulation framework for the CMS Track Trigger electronics

    NASA Astrophysics Data System (ADS)

    Amstutz, C.; Magazzù, G.; Weber, M.; Palla, F.

    2015-03-01

    A simulation framework has been developed to test and characterize algorithms, architectures and hardware implementations of the vastly complex CMS Track Trigger for the high luminosity upgrade of the CMS experiment at the Large Hadron Collider in Geneva. High-level SystemC models of all system components have been developed to simulate a portion of the track trigger. The simulation of the system components together with input data from physics simulations allows evaluating figures of merit, like delays or bandwidths, under realistic conditions. The use of SystemC for high-level modelling allows co-simulation with models developed in Hardware Description Languages, e.g. VHDL or Verilog. Therefore, the simulation framework can also be used as a test bench for digital modules developed for the final system.
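
    The framework described above is written in SystemC; as a rough, language-agnostic illustration of the same idea, the sketch below uses a tiny Python discrete-event loop to push data words through two high-level component models and report an end-to-end latency figure of merit. The component names and latencies are invented assumptions, not values from the CMS Track Trigger.

```python
import heapq
import random

# Hedged toy only: the real framework uses SystemC models, not Python.
# This sketch just shows how high-level component models can be used to
# estimate figures of merit such as end-to-end latency.
random.seed(1)

LINK_DELAY_NS = 25          # serial link latency (assumed)
PROC_DELAY_NS = 40          # processing latency per data word (assumed)

events = []                 # (time_ns, kind, injection_time) min-heap
latencies = []

# inject 100 data words at random times within a 1 microsecond window
for _ in range(100):
    t0 = random.uniform(0, 1000)
    heapq.heappush(events, (t0, "inject", t0))

while events:
    t, kind, t0 = heapq.heappop(events)
    if kind == "inject":                       # word leaves the front-end model
        heapq.heappush(events, (t + LINK_DELAY_NS, "arrive", t0))
    elif kind == "arrive":                     # word reaches the processing model
        heapq.heappush(events, (t + PROC_DELAY_NS, "done", t0))
    else:                                      # processing finished
        latencies.append(t - t0)

print(f"mean latency: {sum(latencies) / len(latencies):.1f} ns")
```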

  11. The Validity and Incremental Validity of Knowledge Tests, Low-Fidelity Simulations, and High-Fidelity Simulations for Predicting Job Performance in Advanced-Level High-Stakes Selection

    ERIC Educational Resources Information Center

    Lievens, Filip; Patterson, Fiona

    2011-01-01

    In high-stakes selection among candidates with considerable domain-specific knowledge and experience, investigations of whether high-fidelity simulations (assessment centers; ACs) have incremental validity over low-fidelity simulations (situational judgment tests; SJTs) are lacking. Therefore, this article integrates research on the validity of…

  12. High Fidelity Simulation Experience in Emergency settings: doctors and nurses satisfaction levels.

    PubMed

    Calamassi, Diletta; Nannelli, Tiziana; Guazzini, Andrea; Rasero, Laura; Bambi, Stefano

    2016-11-22

Many studies describe High Fidelity Simulation (HFS) as an experience that is well accepted by learners. This study explored doctors' and nurses' satisfaction levels during HFS sessions, searching for associations with the setting of the simulation events (simulation center or in-the-field simulation). Moreover, we studied the correlation between HFS experience satisfaction levels and the socio-demographic features of the participants. Mixed-method study, using the Satisfaction of High-Fidelity Simulation Experience (SESAF) questionnaire through an online survey. SESAF was administered to doctors and nurses who had previously taken part in HFS sessions in a simulation center or in the field. Quantitative data were analyzed through descriptive and inferential statistical methods; qualitative data were analyzed using the Giorgi method. 143 doctors and 94 nurses completed the questionnaire. The satisfaction level was high: on a 10-point scale, the mean score was 8.17 (SD ±1.924). There was no significant difference between doctors' and nurses' satisfaction levels in almost all the SESAF factors. We did not find any correlation between gender and HFS experience satisfaction levels. Knowledge of the theoretical aspects of the simulated case before the HFS experience was related to higher general satisfaction (r=0.166, p=0.05), higher effectiveness of debriefing (r=0.143, p=0.05), and higher professional impact (r=0.143, p=0.05). The respondents who performed an HFS in the field were more satisfied than the others and experienced a higher "professional impact", "clinical reasoning and self-efficacy", and "team dynamics" (p<0.01). Narrative data suggest that HFS facilitators should improve their behaviors during the debriefing. Healthcare managers should extend HFS to all kinds of healthcare workers in real clinical settings. There is a need to improve and implement the communication competences of HFS facilitators.

  13. NASA Simulation Capabilities

    NASA Technical Reports Server (NTRS)

    Grabbe, Shon R.

    2017-01-01

    This presentation provides a high-level overview of NASA's Future ATM Concepts Evaluation Tool (FACET) with a high-level description of the system's inputs and outputs. This presentation is designed to support the joint simulations that NASA and the Chinese Aeronautical Establishment (CAE) will conduct under an existing Memorandum of Understanding.

  14. Enhancing the Simulation Speed of Sensor Network Applications by Asynchronization of Interrupt Service Routines

    PubMed Central

    Joe, Hyunwoo; Woo, Duk-Kyun; Kim, Hyungshin

    2013-01-01

Sensor network simulations require high fidelity and timing accuracy to be used as an implementation and evaluation tool. The cycle-accurate and instruction-level simulator is the known solution for these purposes. However, this type of simulation incurs a high computation cost since it has to model not only the instruction-level behavior but also the synchronization between multiple sensors for their causality. This paper presents a novel technique that exploits asynchronous simulation of interrupt service routines (ISRs). We can avoid the synchronization overheads when the interrupt service routines are simulated without preemption. If causality errors occur, we devise a rollback procedure to restore the original synchronized simulation. This concept can be extended to any instruction-level sensor network simulator. Evaluation results show our method can enhance the simulation speed by up to 52% in our experiments. For applications with longer interrupt service routines and a smaller number of preemptions, the speedup becomes greater. In addition, our simulator is 2 to 11 times faster than the well-known sensor network simulator. PMID:23966200
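
    The checkpoint-and-rollback idea sketched in the abstract can be illustrated with a deliberately simplified toy: run an ISR optimistically without synchronizing, detect a causality error when an external event lands in the interval that was skipped, and roll back to a saved checkpoint before re-simulating. This is not the authors' algorithm; the class, event format, and numbers below are invented.

```python
import copy

# Toy sketch of the checkpoint/rollback idea; not the paper's implementation.
class NodeState:
    def __init__(self):
        self.clock = 0          # local simulated time (cycles)
        self.counter = 0        # some piece of application state

    def run_isr(self, cycles):
        # execute an ISR asynchronously, i.e. without per-cycle synchronization
        self.clock += cycles
        self.counter += 1


def simulate(isr_lengths, external_events):
    """external_events: list of (timestamp, value) that may arrive 'late'."""
    state = NodeState()
    pending = sorted(external_events)
    for length in isr_lengths:
        checkpoint = copy.deepcopy(state)      # save state before the optimistic run
        state.run_isr(length)
        # causality check: did an external event land in the skipped interval?
        late = [e for e in pending if checkpoint.clock < e[0] <= state.clock]
        if late:
            state = checkpoint                  # roll back to the checkpoint
            for ts, value in late:              # replay the events in order
                state.clock = ts
                state.counter += value
            state.run_isr(length)               # then redo the ISR
            pending = [e for e in pending if e not in late]
    return state


final = simulate(isr_lengths=[100, 80, 120], external_events=[(150, 5)])
print(final.clock, final.counter)
```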

  15. A Facility and Architecture for Autonomy Research

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Clancy, Daniel (Technical Monitor)

    2002-01-01

Autonomy is a key enabling factor in the advancement of remote robotic exploration. There is currently a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The Mission Simulation Facility (MSF) will bridge this gap by providing a simulation framework and a suite of simulation tools to support research in autonomy for remote exploration. This system will allow developers of autonomy software to test their models in a high-fidelity simulation and evaluate their system's performance against a set of integrated, standardized simulations. The Mission Simulation Toolkit (MST) uses a distributed architecture with a communication layer that is built on top of the standardized High Level Architecture (HLA). This architecture enables the use of existing high-fidelity models, allows mixing simulation components from various computing platforms, and enforces the use of a standardized high-level interface among components. The components needed to achieve a realistic simulation can be grouped into four categories: environment generation (terrain, environmental features), robotic platform behavior (robot dynamics), instrument models (camera/spectrometer/etc.), and data analysis. The MST will provide basic components in these areas but allows users to easily plug in any refined model by means of a communication protocol. Finally, a description file defines the robot and environment parameters for easy configuration and ensures that all the simulation models share the same information.

  16. Numerical simulation of the processes in the normal incidence tube for high acoustic pressure levels

    NASA Astrophysics Data System (ADS)

    Fedotov, E. S.; Khramtsov, I. V.; Kustov, O. Yu.

    2016-10-01

Numerical simulation of the acoustic processes in an impedance tube at high acoustic pressure levels is one way to address the problem of noise suppression by liners. These studies used a liner specimen consisting of a single cylindrical Helmholtz resonator. The real and imaginary parts of the liner acoustic impedance and the sound absorption coefficient were evaluated for sound pressure levels of 130, 140, and 150 dB. The numerical simulation used experimental data obtained on the impedance tube with normal-incidence waves. At the first stage of the numerical simulation, the linearized Navier-Stokes equations were used, which describe the imaginary part of the liner impedance well regardless of the sound pressure level. These equations were solved by the finite element method in the COMSOL Multiphysics program in an axisymmetric formulation. At the second stage, the complete Navier-Stokes equations were solved by direct numerical simulation in ANSYS CFX in an axisymmetric formulation. As a result, acceptable agreement between numerical simulation and experiment was obtained.

  17. Paleo-environment Simulation using GIS based on Shell Mounds

    NASA Astrophysics Data System (ADS)

    Uchiyama, T.; Asanuma, I.; Harada, E.

    2016-02-01

Paleo-coastlines are simulated using a geographic information system (GIS) based on shell mounds as indicators of the paleo-environment in the Tsubaki-no-umi (Ocean of Camellia in Japanese), a paleo-ocean in Japan. The shell mounds, which are introduced in paleo-environment studies in junior and senior high school history classes, are used to estimate the paleo-coastlines. The paleo-coastlines are simulated as a function of sea level relative to the current sea level for 6000 to 3000 BP on the digital elevation map of the GIS. The polygon for a simulated sea level height of 10 m extracted the shell mounds dated 6000 to 5500 BP as the result of the spatial operation and was consistent with previous studies. The simulated sea level height of 5.5 m reproduced the paleo-coastline during 3600 to 3220 BP, when the Tsubaki-no-umi turned into a brackish water lake, partly isolated from the ocean. The simulation of sea levels with GIS could be implemented in junior and senior high school classes with minimal effort by teachers using available computer and software environments.
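
    The core spatial operation described above, thresholding a digital elevation model at an assumed relative sea level and checking which shell-mound sites fall on dry land, can be sketched with NumPy as follows. The DEM, the site coordinates, and the grid resolution here are synthetic assumptions used only for illustration.

```python
import numpy as np

# Hedged sketch of the spatial operation described above: threshold a DEM at
# an assumed paleo sea level and test which shell-mound sites sit above it.
# The DEM and the site list are synthetic.
rng = np.random.default_rng(0)
dem = rng.uniform(0.0, 30.0, size=(200, 200))     # elevation grid [m above present sea level]

shell_mounds = [(12, 40), (55, 120), (180, 33)]   # (row, col) cells of hypothetical sites

for sea_level in (10.0, 5.5):                     # relative sea levels considered in the study
    land = dem > sea_level                        # boolean "paleo land" mask
    above = [rc for rc in shell_mounds if land[rc]]
    print(f"sea level +{sea_level} m: {int(land.sum())} land cells, "
          f"{len(above)}/{len(shell_mounds)} shell mounds on land")
```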

  18. Workshop on data acquisition and trigger system simulations for high energy physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1992-12-31

This report discusses the following topics: DAQSIM: A data acquisition system simulation tool; Front End and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit & The Design of a Queue for this Circuit; Fast Data Compression & Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in Modsim; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ & Online Processing at the SSC; Planned Enhancements to MODSIM II & SIMOBJECT -- an Overview; DAGAR -- A Synthesis System; Proposed Silicon Compiler for Physics Applications; Timed-LOTOS in a PROLOG Environment: an Algebraic Language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies.

  19. A simple mass-conserved level set method for simulation of multiphase flows

    NASA Astrophysics Data System (ADS)

    Yuan, H.-Z.; Shu, C.; Wang, Y.; Shu, S.

    2018-04-01

In this paper, a modified level set method is proposed for simulation of multiphase flows with large density ratio and high Reynolds number. The present method simply introduces a source or sink term into the level set equation to compensate for the mass loss or offset the mass increase. The source or sink term is derived analytically by applying the mass conservation principle to the level set equation and the continuity equation of the flow field. Since only a source term is introduced, the application of the present method is as simple as the original level set method, but it can guarantee overall mass conservation. To validate the present method, the vortex flow problem is first considered. The simulation results are compared with those from the original level set method, which demonstrates that the modified level set method has the capability of accurately capturing the interface and keeping the mass conservation. Then, the proposed method is further validated by simulating the Laplace law, the merging of two bubbles, a bubble rising with high density ratio, and Rayleigh-Taylor instability with high Reynolds number. Numerical results show that the mass is well conserved by the present method.
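
    The mass-compensation idea can be stated schematically as below: add a term S to the level set transport equation and choose it so that the volume of the tracked phase returns to its initial value each step. This is a hedged schematic under the stated assumptions (divergence-free velocity, no flux through the boundary), and the uniform correction shown is one plausible choice, not necessarily the paper's exact formula.

```latex
% Schematic statement of the idea above, not necessarily the paper's exact
% derivation. \phi is the level set function, \mathbf{u} the (divergence-free)
% velocity, H the Heaviside function, \delta_\epsilon a smoothed delta,
% V_0 the initial phase volume and V^n the volume at time step n.
\begin{align}
  \frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi &= S,\\
  \frac{d}{dt}\int_\Omega H(\phi)\,dV
    &= \int_\Omega \delta(\phi)\big(S-\mathbf{u}\cdot\nabla\phi\big)\,dV
     = \int_\Omega \delta(\phi)\,S\,dV \quad (\nabla\cdot\mathbf{u}=0),\\
  \text{e.g. a uniform correction:}\quad
  S^{\,n} &= \frac{V_0 - V^{\,n}}{\Delta t \int_\Omega \delta_\epsilon(\phi^{\,n})\,dV}.
\end{align}
```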

  20. Piloted Simulation Study of the Effects of High-Lift Aerodynamics on the Takeoff Noise of a Representative High-Speed Civil Transport

    NASA Technical Reports Server (NTRS)

    Glaab, Louis J.; Riley, Donald R.; Brandon, Jay M.; Person, Lee H., Jr.; Glaab, Patricia C.

    1999-01-01

    As part of an effort between NASA and private industry to reduce airport-community noise for high-speed civil transport (HSCT) concepts, a piloted simulation study was initiated for the purpose of predicting the noise reduction benefits that could result from improved low-speed high-lift aerodynamic performance for a typical HSCT configuration during takeoff and initial climb. Flight profile and engine information from the piloted simulation were coupled with the NASA Langley Aircraft Noise Prediction Program (ANOPP) to estimate jet engine noise and to propagate the resulting source noise to ground observer stations. A baseline aircraft configuration, which also incorporated different levels of projected improvements in low-speed high-lift aerodynamic performance, was simulated to investigate effects of increased lift and lift-to-drag ratio on takeoff noise levels. Simulated takeoff flights were performed with the pilots following a specified procedure in which either a single thrust cutback was performed at selected altitudes ranging from 400 to 2000 ft, or a multiple-cutback procedure was performed where thrust was reduced by a two-step process. Results show that improved low-speed high-lift aerodynamic performance provides at least a 4 to 6 dB reduction in effective perceived noise level at the FAA downrange flyover measurement station for either cutback procedure. However, improved low-speed high-lift aerodynamic performance reduced maximum sideline noise levels only when using the multiple-cutback procedures.

  1. Using high hydraulic conductivity nodes to simulate seepage lakes

    USGS Publications Warehouse

    Anderson, Mary P.; Hunt, Randall J.; Krohelski, James T.; Chung, Kuopo

    2002-01-01

In a typical ground water flow model, lakes are represented by specified head nodes, requiring that lake levels be known a priori. To remove this limitation, previous researchers assigned high hydraulic conductivity (K) values to nodes that represent a lake, under the assumption that the simulated head at the nodes in the high-K zone accurately reflects lake level. The solution should also produce a constant water level across the lake. We developed a model of a simple hypothetical ground water/lake system to test whether solutions using high-K lake nodes are sensitive to the value of K selected to represent the lake. Results show that the larger the contrast between the K of the aquifer and the K of the lake nodes, the smaller the error tolerance required for the solution to converge. For our test problem, a contrast of three orders of magnitude produced a head difference across the lake of 0.005 m under a regional gradient of the order of 10^-3 m/m, while a contrast of four orders of magnitude produced a head difference of 0.001 m. The high-K method was then used to simulate lake levels in Pretty Lake, Wisconsin. Results for both the hypothetical system and the application to Pretty Lake compared favorably with results using a lake package developed for MODFLOW (Merritt and Konikow 2000). While our results demonstrate that the high-K method accurately simulates lake levels, this method has more cumbersome postprocessing and longer run times than the same problem simulated using the lake package.
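
    The sensitivity test described above can be illustrated with a small 1D finite-difference sketch: give a block of cells a hydraulic conductivity far larger than the aquifer's to represent the "lake", solve steady-state flow with fixed heads at the ends, and observe how the head difference across the lake block shrinks as the contrast grows. This is a toy, not MODFLOW and not the authors' model; the grid size, conductivities, and boundary heads are assumptions.

```python
import numpy as np

# Toy 1D steady-state groundwater model illustrating the high-K lake-node idea:
# as the conductivity contrast grows, the simulated head becomes nearly flat
# across the "lake" cells. Not MODFLOW and not the authors' model.
def solve_heads(K, h_left, h_right, dx=100.0):
    n = len(K)
    A = np.zeros((n, n))
    b = np.zeros(n)
    # harmonic-mean intercell conductances
    C = 2 * K[:-1] * K[1:] / (K[:-1] + K[1:]) / dx
    for i in range(n):
        if i > 0:
            A[i, i] += C[i - 1]
            A[i, i - 1] -= C[i - 1]
        else:                                   # fixed head at the left boundary face
            A[i, i] += 2 * K[0] / dx
            b[i] += 2 * K[0] / dx * h_left
        if i < n - 1:
            A[i, i] += C[i]
            A[i, i + 1] -= C[i]
        else:                                   # fixed head at the right boundary face
            A[i, i] += 2 * K[-1] / dx
            b[i] += 2 * K[-1] / dx * h_right
    return np.linalg.solve(A, b)

n, K_aquifer = 50, 1.0                 # 50 cells, assumed aquifer K = 1 m/d
lake = slice(20, 30)                   # cells representing the lake
for contrast in (1e2, 1e3, 1e4):
    K = np.full(n, K_aquifer)
    K[lake] = K_aquifer * contrast     # high-K lake nodes
    h = solve_heads(K, h_left=10.0, h_right=9.0)
    print(f"contrast 1e{int(np.log10(contrast))}: "
          f"head drop across lake = {h[lake][0] - h[lake][-1]:.2e} m")
```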

  2. The effect of role assignment in high fidelity patient simulation on nursing students: An experimental research study.

    PubMed

    Weiler, Dustin T; Gibson, Andrea L; Saleem, Jason J

    2018-04-01

Previous studies have evaluated the effectiveness of high fidelity patient simulators (HFPS) on nursing training; however, a gap exists regarding the effects of role assignment on critical thinking, self-efficacy, and situation awareness skills in team-based simulation scenarios. This study aims to determine whether role assignment and the involvement level related to the roles yield significant effects and differences in critical thinking, situation awareness and self-efficacy scores in team-based high-fidelity simulation scenarios. A single factorial design with five levels and random assignment was utilized. A public university-sponsored simulation center in the United States of America. A convenience sample of 69 junior-level baccalaureate nursing students was recruited for participation. Participants were randomly assigned one of five possible roles and completed pre-simulation critical thinking and self-efficacy assessments prior to the simulation beginning. Playing within their assigned roles, participants experienced a post-partum hemorrhage scenario using an HFPS. After completing the simulation, participants completed a situation awareness assessment and a post-simulation critical thinking and self-efficacy assessment. Role assignment was found to have a statistically significant effect on critical thinking skills, and a statistically significant difference in various areas of self-efficacy was also noted. However, no statistically significant difference in situation awareness abilities was found. Results support the notion that certain roles required the participant to be more involved with the simulation scenario, which may have yielded higher critical thinking and self-efficacy scores than roles that required a lesser level of involvement. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. A highly efficient 3D level-set grain growth algorithm tailored for ccNUMA architecture

    NASA Astrophysics Data System (ADS)

    Mießen, C.; Velinov, N.; Gottstein, G.; Barrales-Mora, L. A.

    2017-12-01

    A highly efficient simulation model for 2D and 3D grain growth was developed based on the level-set method. The model introduces modern computational concepts to achieve excellent performance on parallel computer architectures. Strong scalability was measured on cache-coherent non-uniform memory access (ccNUMA) architectures. To achieve this, the proposed approach considers the application of local level-set functions at the grain level. Ideal and non-ideal grain growth was simulated in 3D with the objective to study the evolution of statistical representative volume elements in polycrystals. In addition, microstructure evolution in an anisotropic magnetic material affected by an external magnetic field was simulated.

  4. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  5. Transfection of the IHH gene into rabbit BMSCs in a simulated microgravity environment promotes chondrogenic differentiation and inhibits cartilage aging.

    PubMed

    Liu, Peng-Cheng; Liu, Kuan; Liu, Jun-Feng; Xia, Kuo; Chen, Li-Yang; Wu, Xing

    2016-09-27

The effect of overexpressing the Indian hedgehog (IHH) gene on the chondrogenic differentiation of rabbit bone marrow-derived mesenchymal stem cells (BMSCs) was investigated in a simulated microgravity environment. An adenovirus plasmid encoding the rabbit IHH gene was constructed in vitro and transfected into rabbit BMSCs. Two main groups were used: a conventional cell culture and induction model group and a simulated microgravity environment group. Each main group was further divided into a blank control group, a GFP transfection group, and an IHH transfection group. During differentiation induction, the expression levels of cartilage-related and cartilage hypertrophy-related genes and proteins in each group were determined. In the conventional model, the IHH transfection group expressed high levels of cartilage-related factors (Coll2 and ANCN) at the early stage of differentiation induction and expressed high levels of cartilage hypertrophy-related factors (Coll10, annexin 5, and ALP) at the late stage. Under the simulated microgravity environment, the IHH transfection group expressed high levels of cartilage-related factors and low levels of cartilage hypertrophy-related factors at all stages of differentiation induction. Under the simulated microgravity environment, transfection of the IHH gene into BMSCs effectively promoted the generation of cartilage and inhibited cartilage aging and osteogenesis. Therefore, this technique is suitable for cartilage tissue engineering.

  6. Argonne Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-01-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distribu...

  7. Nursing students' perceptions of high- and low-fidelity simulation used as learning methods.

    PubMed

    Tosterud, Randi; Hedelin, Birgitta; Hall-Lord, Marie Louise

    2013-07-01

    Due to the increasing focus on simulation used in nursing education, there is a need to examine how the scenarios and different simulation methods used are perceived by students. The aim of this study was to examine nursing students' perceptions of scenarios played out in different simulation methods, and whether their educational level influenced their perception. The study had a quantitative, evaluative and comparative design. The sample consisted of baccalaureate nursing students (n = 86) within various educational levels. The students were randomly divided into groups. They solved a patient case adapted to their educational level by using a high-fidelity patient simulator, a static mannequin or a paper/pencil case study. Data were collected by three instruments developed by the National League for Nursing. The results showed that the nursing students reported satisfaction with the implementation of the scenarios regardless of the simulation methods used. The findings indicated that the students who used the paper/pencil case study were the most satisfied. Moreover, educational level did not seem to influence their perceptions. Independent of educational level, the findings indicated that simulation with various degrees of fidelity could be used in nursing education. There is a need for further research to examine more closely the rationale behind the students' perception of the simulation methods. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. High-Fidelity Contrast Reaction Simulation Training: Performance Comparison of Faculty, Fellows, and Residents.

    PubMed

    Pfeifer, Kyle; Staib, Lawrence; Arango, Jennifer; Kirsch, John; Arici, Mel; Kappus, Liana; Pahade, Jay

    2016-01-01

    Reactions to contrast material are uncommon in diagnostic radiology, and vary in clinical presentation from urticaria to life-threatening anaphylaxis. Prior studies have demonstrated a high error rate in contrast reaction management, with smaller studies using simulation demonstrating variable data on effectiveness. We sought to assess the effectiveness of high-fidelity simulation in teaching contrast reaction management for residents, fellows, and attendings. A 20-question multiple-choice test assessing contrast reaction knowledge, with Likert-scale questions assessing subjective comfort levels of management of contrast reactions, was created. Three simulation scenarios that represented a moderate reaction, a severe reaction, and a contrast reaction mimic were completed in a one-hour period in a simulation laboratory. All participants completed a pretest and a posttest at one month. A six-month delayed posttest was given, but was optional for all participants. A total of 150 radiologists participated (residents = 52; fellows = 24; faculty = 74) in the pretest and posttest; and 105 participants completed the delayed posttest (residents = 31; fellows = 17; faculty = 57). A statistically significant increase was found in the one-month posttest (P < .00001) and the six-month posttest scores (P < .00001) and Likert scores (P < .001) assessing comfort level in managing all contrast reactions, compared with the pretest. Test scores and comfort level for moderate and severe reactions significantly decreased at six months, compared with the one-month posttest (P < .05). High-fidelity simulation is an effective learning tool, allowing practice of "high-acuity" situation management in a nonthreatening environment; the simulation training resulted in significant improvement in test scores, as well as an increase in subjective comfort in management of reactions, across all levels of training. A six-month refresher course is suggested, to maintain knowledge and comfort level in contrast reaction management. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  9. Enabling parallel simulation of large-scale HPC network systems

    DOE PAGES

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; ...

    2016-04-07

Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  10. Enabling parallel simulation of large-scale HPC network systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.

Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  11. Lead-iron phosphate glass as a containment medium for the disposal of high-level nuclear wastes

    DOEpatents

    Boatner, L.A.; Sales, B.C.

    1984-04-11

Disclosed are lead-iron phosphate glasses containing a high level of Fe2O3 for use as a storage medium for high-level radioactive nuclear waste. By combining lead-iron phosphate glass with various types of simulated high-level nuclear waste

  12. A randomized controlled study of manikin simulator fidelity on neonatal resuscitation program learning outcomes.

    PubMed

    Curran, Vernon; Fleet, Lisa; White, Susan; Bessell, Clare; Deshpandey, Akhil; Drover, Anne; Hayward, Mark; Valcour, James

    2015-03-01

The neonatal resuscitation program (NRP) has been developed to educate physicians and other health care providers about newborn resuscitation and has been shown to improve neonatal resuscitation skills. Simulation-based training is recommended as an effective modality for instructing neonatal resuscitation, and both low- and high-fidelity manikin simulators are used. There is limited research comparing the effect of low- and high-fidelity manikin simulators on NRP learning outcomes, and more specifically on teamwork performance and confidence. The purpose of this study was to examine the effect of using low- versus high-fidelity manikin simulators in NRP instruction. A randomized posttest-only control group study design was conducted. Third-year undergraduate medical students participated in NRP instruction and were assigned to an experimental group (high-fidelity manikin simulator) or control group (low-fidelity manikin simulator). Integrated skills station (megacode) performance, participant satisfaction, confidence and teamwork behaviour scores were compared between the study groups. Participants in the high-fidelity manikin simulator instructional group reported significantly higher total scores in overall satisfaction (p = 0.001) and confidence (p = 0.001). There were no significant differences in teamwork behaviour scores, as observed by two independent raters, nor differences on mandatory integrated skills station performance items at the p < 0.05 level. Medical students reported greater satisfaction and confidence with high-fidelity manikin simulators, but did not demonstrate significantly improved overall teamwork or integrated skills station performance. Low- and high-fidelity manikin simulators facilitate similar levels of objectively measured NRP outcomes for integrated skills station and teamwork performance.

  13. Comparisons of observed seasonal climate features with a winter and summer numerical simulation produced with the GLAS general circulation model

    NASA Technical Reports Server (NTRS)

    Halem, M.; Shukla, J.; Mintz, Y.; Wu, M. L.; Godbole, R.; Herman, G.; Sud, Y.

    1979-01-01

Results are presented from numerical simulations performed with the general circulation model (GCM) for winter and summer. The monthly mean simulated fields for each integration are compared with observed geographical distributions and zonal averages. In general, the simulated sea level pressure and upper-level geopotential height fields agree well with the observations. Well-simulated features are the winter Aleutian and Icelandic lows, the summer southwestern U.S. low, the summer and winter oceanic subtropical highs in both hemispheres, and the summer upper-level Tibetan high and Atlantic ridge. The surface and upper-air wind fields in the low latitudes are in good agreement with the observations. The geographical distributions of the Earth-atmosphere radiation balance and of the precipitation rates over the oceans are well simulated, but not all of the intensities of these features are correct. Other comparisons are shown for precipitation along the ITCZ, radiation balance, zonally averaged temperatures and zonal winds, and poleward transports of momentum and sensible heat.

  14. A One System Integrated Approach to Simulant Selection for Hanford High Level Waste Mixing and Sampling Tests - 13342

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thien, Mike G.; Barnes, Steve M.

    2013-07-01

The Hanford Tank Operations Contractor (TOC) and the Hanford Waste Treatment and Immobilization Plant (WTP) contractor are both engaged in demonstrating mixing, sampling, and transfer system capabilities using simulated Hanford High-Level Waste (HLW) formulations. This represents one of the largest remaining technical issues with the high-level waste treatment mission at Hanford. Previous testing has focused on very specific TOC or WTP test objectives, and consequently the simulants were narrowly focused on those test needs. A key attribute in the Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2010-2 is to ensure testing is performed with a simulant that represents the broad spectrum of Hanford waste. The One System Integrated Project Team is a new joint TOC and WTP organization intended to ensure technical integration of specific TOC and WTP systems and testing. A new approach to simulant definition has been mutually developed that will meet both TOC and WTP test objectives for the delivery and receipt of HLW. The process used to identify critical simulant characteristics, incorporate lessons learned from previous testing, and identify specific simulant targets that ensure TOC and WTP testing addresses the broad spectrum of Hanford waste characteristics important to mixing, sampling, and transfer performance is described. (authors)

  15. Impact of High-Fidelity Simulation and Pharmacist-Specific Didactic Lectures in Addition to ACLS Provider Certification on Pharmacy Resident ACLS Performance.

    PubMed

    Bartel, Billie J

    2014-08-01

    This pilot study explored the use of multidisciplinary high-fidelity simulation and additional pharmacist-focused training methods in training postgraduate year 1 (PGY1) pharmacy residents to provide Advanced Cardiovascular Life Support (ACLS) care. Pharmacy resident confidence and comfort level were assessed after completing these training requirements. The ACLS training requirements for pharmacy residents were revised to include didactic instruction on ACLS pharmacology and rhythm recognition and participation in multidisciplinary high-fidelity simulation ACLS experiences in addition to ACLS provider certification. Surveys were administered to participating residents to assess the impact of this additional education on resident confidence and comfort level in cardiopulmonary arrest situations. The new ACLS didactic and simulation training requirements resulted in increased resident confidence and comfort level in all assessed functions. Residents felt more confident in all areas except providing recommendations for dosing and administration of medications and rhythm recognition after completing the simulation scenarios than with ACLS certification training and the didactic components alone. All residents felt the addition of lectures and simulation experiences better prepared them to function as a pharmacist in the ACLS team. Additional ACLS training requirements for pharmacy residents increased overall awareness of pharmacist roles and responsibilities and greatly improved resident confidence and comfort level in performing most essential pharmacist functions during ACLS situations. © The Author(s) 2013.

  16. RSRM top hat cover simulator lightning test, volume 1

    NASA Technical Reports Server (NTRS)

    1990-01-01

The test sequence was to measure electric and magnetic fields induced inside a redesigned solid rocket motor case when a simulated lightning discharge strikes an exposed top hat cover simulator. The test sequence was conducted between 21 June and 17 July 1990. Thirty-six high rate-of-rise Marx generator discharges and eight high current bank discharges were injected onto three different test article configurations. Attach points included three locations on the top hat cover simulator and two locations on the mounting bolts. Damage to the top hat cover simulator, mounting bolts, and grain cover was observed. Overall electric field levels were well below 30 kilovolts/meter. Electric field levels ranged from 184.7 to 345.9 volts/meter, and magnetic field levels were calculated to range from 6.921 to 39.73 amperes/meter. It is recommended that the redesigned solid rocket motor top hat cover be used in Configuration 1 or Configuration 2 as an interim lightning protection device until a lightweight cover can be designed.

  17. Process-based modelling to evaluate simulated groundwater levels and frequencies in a Chalk catchment in south-western England

    NASA Astrophysics Data System (ADS)

    Brenner, Simon; Coxon, Gemma; Howden, Nicholas J. K.; Freer, Jim; Hartmann, Andreas

    2018-02-01

Chalk aquifers are an important source of drinking water in the UK. Due to their properties, they are particularly vulnerable to groundwater-related hazards like floods and droughts. Understanding and predicting groundwater levels is therefore important for effective and safe water management. Chalk is known for its high porosity and, due to its dissolvability, is exposed to karstification and strong subsurface heterogeneity. To cope with the karstic heterogeneity and limited data availability, specialised modelling approaches are required that balance model complexity and data availability. In this study, we present a novel approach to evaluate simulated groundwater level frequencies derived from a semi-distributed karst model that represents subsurface heterogeneity by distribution functions. Simulated groundwater storages are transferred into groundwater levels using evidence from different observation wells. Using a percentile approach, we can assess the number of days exceeding or falling below selected groundwater level percentiles. Firstly, we evaluate the performance of the model when simulating groundwater level time series using a split sample test and parameter identifiability analysis. Secondly, we apply a split sample test to the simulated groundwater level percentiles to explore the performance in predicting groundwater level exceedances. We show that the model provides robust simulations of discharge and groundwater levels at three observation wells at a test site in a chalk-dominated catchment in south-western England. The second split sample test also indicates that the percentile approach is able to reliably predict groundwater level exceedances across all considered timescales up to their 75th percentile. However, when looking at the 90th percentile, it only provides acceptable predictions for long time periods, and it fails when the 95th percentile of groundwater exceedance levels is considered. By modifying the historic forcings of our model according to expected future climate changes, we create simple climate scenarios and show that the projected climate changes may lead to generally lower groundwater levels and a reduction of exceedances of high groundwater level percentiles.
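
    The percentile bookkeeping described above can be sketched as follows: derive level percentiles from an observed record, then count the days on which a series exceeds each percentile threshold. The series below are synthetic stand-ins, not the authors' observations or model output.

```python
import numpy as np

# Hedged sketch of the percentile-exceedance approach described above,
# using synthetic observed and simulated groundwater level series.
rng = np.random.default_rng(42)
days = 3650
observed = 50 + np.cumsum(rng.normal(0, 0.05, days))   # synthetic observed levels [m]
simulated = observed + rng.normal(0, 0.3, days)        # synthetic simulated levels [m]

for q in (50, 75, 90, 95):
    threshold = np.percentile(observed, q)             # level percentile from observations
    obs_days = int((observed > threshold).sum())       # exceedance days in the record
    sim_days = int((simulated > threshold).sum())      # exceedance days in the simulation
    print(f"P{q}: threshold {threshold:.2f} m, "
          f"observed exceedance {obs_days} d, simulated exceedance {sim_days} d")
```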

  18. Effects of Parameterized Orographic Drag on Weather Forecasting and Simulated Climatology Over East Asia During Boreal Summer

    NASA Astrophysics Data System (ADS)

    Choi, Hyun-Joo; Choi, Suk-Jin; Koo, Myung-Seo; Kim, Jung-Eun; Kwon, Young Cheol; Hong, Song-You

    2017-10-01

    The impact of subgrid orographic drag on weather forecasting and simulated climatology over East Asia in boreal summer is examined using two parameterization schemes in a global forecast model. The schemes consider gravity wave drag (GWD) with and without lower-level wave breaking drag (LLWD) and flow-blocking drag (FBD). Simulation results from sensitivity experiments verify that the scheme with LLWD and FBD improves the intensity of a summertime continental high over the northern part of the Korean Peninsula, which is exaggerated with GWD only. This is because the enhanced lower tropospheric drag due to the effects of lower-level wave breaking and flow blocking slows down the wind flowing out of the high-pressure system in the lower troposphere. It is found that the decreased lower-level divergence induces a compensating weakening of middle- to upper-level convergence aloft. Extended experiments for medium-range forecasts for July 2013 and seasonal simulations for June to August of 2013-2015 are also conducted. Statistical skill scores for medium-range forecasting are improved not only in low-level winds but also in surface pressure when both LLWD and FBD are considered. A simulated climatology of summertime monsoon circulation in East Asia is also realistically reproduced.

  19. Fast and Accurate Simulation of the Cray XMT Multithreaded Supercomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villa, Oreste; Tumeo, Antonino; Secchi, Simone

Irregular applications, such as data mining and analysis or graph-based computations, show unpredictable memory/network access patterns and control structures. Highly multithreaded architectures with large processor counts, like the Cray MTA-1, MTA-2 and XMT, appear to address their requirements better than commodity clusters. However, research on highly multithreaded systems is currently limited by the lack of adequate architectural simulation infrastructures due to issues such as size of the machines, memory footprint, simulation speed, accuracy and customization. At the same time, shared-memory multiprocessors (SMPs) with multi-core processors have become an attractive platform to simulate large scale machines. In this paper, we introduce a cycle-level simulator of the highly multithreaded Cray XMT supercomputer. The simulator runs unmodified XMT applications. We discuss how we tackled the challenges posed by its development, detailing the techniques introduced to make the simulation as fast as possible while maintaining high accuracy. By mapping XMT processors (ThreadStorm with 128 hardware threads) to host computing cores, the simulation speed remains constant as the number of simulated processors increases, up to the number of available host cores. The simulator supports zero-overhead switching among different accuracy levels at run-time and includes a network model that takes into account contention. On a modern 48-core SMP host, our infrastructure simulates a large set of irregular applications 500 to 2000 times slower than real time when compared to a 128-processor XMT, while remaining within 10% accuracy. Emulation is only 25 to 200 times slower than real time.

  20. High performance cellular level agent-based simulation with FLAME for the GPU.

    PubMed

    Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela

    2010-05-01

    Driven by the availability of experimental data and ability to simulate a biological scale which is of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template driven framework for agent-based modelling (ABM) on parallel architectures ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has reported massive improvement in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.

  1. Scalable and massively parallel Monte Carlo photon transport simulations for heterogeneous computing platforms

    NASA Astrophysics Data System (ADS)

    Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian

    2018-01-01

    We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
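
    The device-level load-balancing strategy mentioned above can be illustrated with a short, hypothetical Python sketch (not the authors' OpenCL implementation): each device receives a share of the total photon count proportional to the throughput it achieved in a brief calibration batch. The device names and rates below are assumptions chosen only for illustration.

```python
# Hypothetical sketch of device-level load balancing for a Monte Carlo
# photon-transport run: photons are split across devices in proportion to
# the throughput each device achieved in a short calibration batch.

def partition_photons(total_photons, throughputs):
    """Return a per-device photon count proportional to measured throughput."""
    total_rate = sum(throughputs)
    shares = [int(total_photons * r / total_rate) for r in throughputs]
    shares[-1] += total_photons - sum(shares)   # hand the rounding remainder to one device
    return shares

if __name__ == "__main__":
    # Example: two GPUs and one CPU, with calibration throughputs in photons/s
    # (made-up numbers, for illustration only).
    devices = ["gpu0", "gpu1", "cpu0"]
    calibrated_rates = [5.0e6, 3.5e6, 0.5e6]
    for name, n in zip(devices, partition_photons(10_000_000, calibrated_rates)):
        print(f"{name}: simulate {n} photons")
```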

  2. Summary of a Modeling and Simulation Framework for High-Fidelity Weapon Models in Joint Semi-Automated Forces (JSAF) and Other Mission-Simulation Software

    DTIC Science & Technology

    2008-05-01

communicate with other weapon models in a mission-level simulation; (3) introduces the four configuration levels of the M&S framework; and (4) presents a cost...

  3. Simulation-based intelligent robotic agent for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Biegl, Csaba A.; Springfield, James F.; Cook, George E.; Fernandez, Kenneth R.

    1990-01-01

    A robot control package is described which utilizes on-line structural simulation of robot manipulators and objects in their workspace. The model-based controller is interfaced with a high level agent-independent planner, which is responsible for the task-level planning of the robot's actions. Commands received from the agent-independent planner are refined and executed in the simulated workspace, and upon successful completion, they are transferred to the real manipulators.

  4. Operationalizing Healthcare Simulation Psychological Safety: A Descriptive Analysis of an Intervention.

    PubMed

    Henricksen, Jared W; Altenburg, Catherine; Reeder, Ron W

    2017-10-01

    Despite efforts to prepare a psychologically safe environment, simulation participants are occasionally psychologically distressed. Instructing simulation educators about participant psychological risks and having a participant psychological distress action plan available to simulation educators may assist them as they seek to keep all participants psychologically safe. A Simulation Participant Psychological Safety Algorithm was designed to aid simulation educators as they debrief simulation participants perceived to have psychological distress and categorize these events as mild (level 1), moderate (level 2), or severe (level 3). A prebrief dedicated to creating a psychologically safe learning environment was held constant. The algorithm was used for 18 months in an active pediatric simulation program. Data collected included level of participant psychological distress as perceived and categorized by the simulation team using the algorithm, type of simulation that participants went through, who debriefed, and timing of when psychological distress was perceived to occur during the simulation session. The Kruskal-Wallis test was used to evaluate the relationship between events and simulation type, events and simulation educator team who debriefed, and timing of event during the simulation session. A total of 3900 participants went through 399 simulation sessions between August 1, 2014, and January 26, 2016. Thirty-four (<1%) simulation participants from 27 sessions (7%) were perceived to have an event. One participant was perceived to have a severe (level 3) psychological distress event. Events occurred more commonly in high-intensity simulations, with novice learners and with specific educator teams. Simulation type and simulation educator team were associated with occurrence of events (P < 0.001). There was no association between event timing and event level. Severe psychological distress as categorized by simulation personnel using the Simulation Participant Psychological Safety Algorithm is rare, with mild and moderate events being more common. The algorithm was used to teach simulation educators how to assist a participant who may be psychologically distressed and document perceived event severity.

  5. Alexander Meets Michotte: A Simulation Tool Based on Pattern Programming and Phenomenology

    ERIC Educational Resources Information Center

    Basawapatna, Ashok

    2016-01-01

    Simulation and modeling activities, a key point of computational thinking, are currently not being integrated into the science classroom. This paper describes a new visual programming tool entitled the Simulation Creation Toolkit. The Simulation Creation Toolkit is a high level pattern-based phenomenological approach to bringing rapid simulation…

  6. Development of a High Level Architecture Federation of Ship Replenishment at Sea

    DTIC Science & Technology

    2011-10-01

use a simulation infrastructure called the High Level Architecture (HLA) in order to provide joint simulation environments...provide a simulation environment that models the interactions among the various components in order to simulate the conditions that lead to the

  7. The Integrated Plasma Simulator: A Flexible Python Framework for Coupled Multiphysics Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, Samantha S; Elwasif, Wael R; Bernholdt, David E

    2011-11-01

High-fidelity coupled multiphysics simulations are an increasingly important aspect of computational science. In many domains, however, there has been very limited experience with simulations of this sort, therefore research in coupled multiphysics often requires computational frameworks with significant flexibility to respond to the changing directions of the physics and mathematics. This paper presents the Integrated Plasma Simulator (IPS), a framework designed for loosely coupled simulations of fusion plasmas. The IPS provides users with a simple component architecture into which a wide range of existing plasma physics codes can be inserted as components. Simulations can take advantage of multiple levels of parallelism supported in the IPS, and can be controlled by a high-level "driver" component, or by other coordination mechanisms, such as an asynchronous event service. We describe the requirements and design of the framework, and how they were implemented in the Python language. We also illustrate the flexibility of the framework by providing examples of different types of simulations that utilize various features of the IPS.

  8. Progress in detailed modelling of low foot and high foot implosion experiments on the National Ignition Facility

    NASA Astrophysics Data System (ADS)

    Clark, D. S.; Weber, C. R.; Eder, D. C.; Haan, S. W.; Hammel, B. A.; Hinkel, D. E.; Jones, O. S.; Kritcher, A. L.; Marinak, M. M.; Milovich, J. L.; Patel, P. K.; Robey, H. F.; Salmonson, J. D.; Sepke, S. M.

    2016-05-01

Several dozen high convergence inertial confinement fusion ignition experiments have now been completed on the National Ignition Facility (NIF). These include both “low foot” experiments from the National Ignition Campaign (NIC) and more recent “high foot” experiments. At the time of the NIC, there were large discrepancies between simulated implosion performance and experimental data. In particular, simulations overpredicted neutron yields by up to an order of magnitude, and some experiments showed clear evidence of mixing of ablator material deep into the hot spot that could not be explained at the time. While the agreement between data and simulation improved for high foot implosion experiments, discrepancies nevertheless remain. This paper describes the state of detailed modelling of both low foot and high foot implosions using 1-D, 2-D, and 3-D radiation hydrodynamics simulations with HYDRA. The simulations include a range of effects, in particular, the impact of the plastic membrane used to support the capsule in the hohlraum, as well as low-mode radiation asymmetries tuned to match radiography measurements. The same simulation methodology is applied to low foot NIC implosion experiments and high foot implosions, and shows a qualitatively similar level of agreement for both types of implosions. While comparison with the experimental data remains imperfect, a reasonable level of agreement is emerging and shows a growing understanding of the high-convergence implosions being performed on NIF.

  9. Numerical Modeling of Thermal-Hydrology in the Near Field of a Generic High-Level Waste Repository

    NASA Astrophysics Data System (ADS)

    Matteo, E. N.; Hadgu, T.; Park, H.

    2016-12-01

Disposal in a deep geologic repository is one of the preferred options for long term isolation of high-level nuclear waste. Coupled thermal-hydrologic processes induced by decay heat from the radioactive waste may impact fluid flow and the associated migration of radionuclides. This study looked at the effects of those processes in simulations of thermal-hydrology for the emplacement of U.S. Department of Energy managed high-level waste and spent nuclear fuel. Most of the high-level waste sources have lower thermal output, which would reduce the impact of thermal propagation. In order to quantify the thermal limits, this study concentrated on the higher thermal output sources and on spent nuclear fuel. The study assumed a generic nuclear waste repository at 500 m depth. For the modeling, a representative domain was selected, representing a portion of the repository layout, in order to conduct a detailed thermal analysis. A highly refined unstructured mesh was utilized with refinements near heat sources and at intersections of different materials. Simulations looked at different values for properties of components of the engineered barrier system (i.e., buffer, disturbed rock zone and the host rock). The simulations also looked at the effects of different durations of surface aging of the waste to reduce thermal perturbations. The PFLOTRAN code (Hammond et al., 2014) was used for the simulations. Modeling results for the different options are reported and include temperature and fluid flow profiles in the near field at different simulation times. References: G. E. Hammond, P.C. Lichtner and R.T. Mills, "Evaluating the Performance of Parallel Subsurface Simulators: An Illustrative Example with PFLOTRAN", Water Resources Research, 50, doi:10.1002/2012WR013483 (2014). Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND2016-7510 A

  10. Numerical Simulation of Rolling-Airframes Using a Multi-Level Cartesian Method

    NASA Technical Reports Server (NTRS)

    Murman, Scott M.; Aftosmis, Michael J.; Berger, Marsha J.; Kwak, Dochan (Technical Monitor)

    2002-01-01

A supersonic rolling missile with two synchronous canard control surfaces is analyzed using an automated, inviscid, Cartesian method. Sequential-static and time-dependent dynamic simulations of the complete motion are computed for canard dither schedules for level flight, pitch, and yaw maneuver. The dynamic simulations are compared directly against both high-resolution viscous simulations and relevant experimental data, and are also utilized to compute dynamic stability derivatives. The results show that both the body roll rate and canard dither motion influence the roll-averaged forces and moments on the body. At the relatively low roll rates analyzed in the current work, these dynamic effects are modest; however, the dynamic computations are effective in predicting the dynamic stability derivatives, which can be significant for highly maneuverable missiles.

  11. Stress levels during emergency care: A comparison between reality and simulated scenarios.

    PubMed

    Daglius Dias, Roger; Scalabrini Neto, Augusto

    2016-06-01

    Medical simulation is fast becoming a standard of health care training throughout undergraduate, postgraduate and continuing medical education. Our aim was to evaluate if simulated scenarios have a high psychological fidelity and induce stress levels similarly to real emergency medical situations. Medical residents had their stress levels measured during emergency care (real-life and simulation) in baseline (T1) and immediately post-emergencies (T2). Parameters measuring acute stress were: heart rate, systolic and diastolic blood pressure, salivary α-amylase, salivary interleukin-1β, and State-Trait Anxiety Inventory score. Twenty-eight internal medicine residents participated in 32 emergency situations (16 real-life and 16 simulated emergencies). In the real-life group, all parameters increased significantly (P < .05) between T1 and T2. In the simulation group, only heart rate and interleukin-1β increased significantly after emergencies. The comparison between groups demonstrates that acute stress response (T2 - T1) and State-Trait Anxiety Inventory score (in T2) did not differ between groups. Acute stress response did not differ between both groups. Our results indicate that emergency medicine simulation may create a high psychological fidelity environment similarly to what is observed in a real emergency room. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Competency: Does High Fidelity Simulation Make a Difference?

    ERIC Educational Resources Information Center

    Valente, Alice M.

    2010-01-01

High fidelity simulation is a well documented adjunctive teaching method in medical and nurse practitioner programs, but few studies of the effectiveness of this technology for the development of competency have emphasized pre-licensure associate degree level programs. This study explored student competency in the application of the nursing process…

  13. Transfer of training from a Full-Flight Simulator vs. a high level flight training device with a dynamic seat

    DOT National Transportation Integrated Search

    2010-08-02

This paper summarizes the most recent study conducted by the Federal Aviation Administration/Volpe Center Flight Simulator Fidelity Requirements Program. For many smaller airlines, access to qualified simulators is limited due to the availabili...

  14. A Model-Based Approach for Bridging Virtual and Physical Sensor Nodes in a Hybrid Simulation Framework

    PubMed Central

    Mozumdar, Mohammad; Song, Zhen Yu; Lavagno, Luciano; Sangiovanni-Vincentelli, Alberto L.

    2014-01-01

The Model Based Design (MBD) approach is a popular trend to speed up application development of embedded systems, which uses high-level abstractions to capture functional requirements in an executable manner, and which automates implementation code generation. Wireless Sensor Networks (WSNs) are an emerging and very promising application area for embedded systems. However, there is a lack of tools in this area, which would allow an application developer to model a WSN application by using high level abstractions, simulate it mapped to a multi-node scenario for functional analysis, and finally use the refined model to automatically generate code for different WSN platforms. Motivated by this idea, in this paper we present a hybrid simulation framework that not only follows the MBD approach for WSN application development, but also interconnects a simulated sub-network with a physical sub-network and then allows one to co-simulate them, which is also known as Hardware-In-the-Loop (HIL) simulation. PMID:24960083

  15. Multi-fidelity methods for uncertainty quantification in transport problems

    NASA Astrophysics Data System (ADS)

    Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.

    2016-12-01

We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, a re-scaled Multi Level Monte Carlo (rMLMC) method. The rMLMC is based on the idea that the statistics of quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods and discuss advantages of each approach.
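
    As a rough illustration of the multilevel idea this abstract builds on (a generic MLMC sketch, not the authors' rMLMC method), the estimator below combines many cheap low-resolution samples with a few expensive high-resolution corrections through the standard telescoping sum; the toy model and sample counts are assumptions.

```python
# Generic multilevel Monte Carlo (MLMC) sketch, not the rMLMC method of the
# abstract. A "level" is a resolution: level_quantity(x, level) becomes more
# accurate (and, in a real solver, more expensive) as the level grows.
import random

def level_quantity(x, level):
    """Toy model output at a given resolution; bias shrinks as the level grows."""
    h = 2.0 ** (-level)            # grid spacing at this level
    return x ** 2 + h * x          # exact answer x**2 plus a resolution-dependent bias

def mlmc_estimate(samples_per_level):
    """Telescoping sum E[Q_L] = E[Q_0] + sum_l E[Q_l - Q_(l-1)]."""
    estimate = 0.0
    for level, n in enumerate(samples_per_level):
        total = 0.0
        for _ in range(n):
            x = random.uniform(0.0, 1.0)      # same random input feeds both levels
            fine = level_quantity(x, level)
            coarse = level_quantity(x, level - 1) if level > 0 else 0.0
            total += fine - coarse
        estimate += total / n
    return estimate

if __name__ == "__main__":
    random.seed(0)
    # Many cheap coarse samples, progressively fewer fine-level corrections.
    # The result estimates E[Q_L] = 1/3 + h_L/2; the bias term shrinks as
    # more (finer) levels are appended to the list.
    print(mlmc_estimate([100000, 10000, 1000]))
```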

  16. SimGen: A General Simulation Method for Large Systems.

    PubMed

    Taylor, William R

    2017-02-03

SimGen is a stand-alone computer program that reads a script of commands to represent complex macromolecules, including proteins and nucleic acids, in a structural hierarchy that can then be viewed using an integral graphical viewer or animated through a high-level application programming interface in C++. Structural levels in the hierarchy range from α-carbon or phosphate backbones through secondary structure to domains, molecules, and multimers, with each level represented in an identical data structure that can be manipulated using the application programming interface. Unlike most coarse-grained simulation approaches, the higher-level objects represented in SimGen can be soft, allowing the lower-level objects that they contain to interact directly. The default motion simulated by SimGen is a Brownian-like diffusion that can be set to occur across all levels of representation in the hierarchy. Links can also be defined between objects, which, when combined with large high-level random movements, result in an effective search strategy for constraint satisfaction, including structure prediction from predicted pairwise distances. The implementation of SimGen makes use of the hierarchic data structure to avoid unnecessary calculation, especially for collision detection, allowing it to be simultaneously run and viewed on a laptop computer while simulating large systems of over 20,000 objects. It has been used previously to model complex molecular interactions including the motion of a myosin-V dimer "walking" on an actin fibre, RNA stem-loop packing, and the simulation of cell motion and aggregation. Several extensions to this original functionality are described. Copyright © 2016 The Francis Crick Institute. Published by Elsevier Ltd. All rights reserved.

  17. The High-Throughput Stochastic Human Exposure and Dose Simulation Model (SHEDS-HT) & The Chemical and Products Database (CPDat)

    EPA Science Inventory

    The Stochastic Human Exposure and Dose Simulation Model – High-Throughput (SHEDS-HT) is a U.S. Environmental Protection Agency research tool for predicting screening-level (low-tier) exposures to chemicals in consumer products. This course will present an overview of this m...

18. Monthly mean simulation experiments with a coarse-mesh global atmospheric model

    NASA Technical Reports Server (NTRS)

    Spar, J.; Klugman, R.; Lutz, R. J.; Notario, J. J.

    1978-01-01

Substitution of observed monthly mean sea-surface temperatures (SSTs) as lower boundary conditions, in place of climatological SSTs, failed to improve the model simulations. While the impact of SST anomalies on the model output is greater at sea level than at upper levels, the impact on the monthly mean simulations is not beneficial at any level. Shifts of one and two days in initialization time produced small, but non-trivial, changes in the model-generated monthly mean synoptic fields. No improvements in the mean simulations resulted from the use of either time-averaged initial data or re-initialization with time-averaged early model output. The noise level of the model, as determined from a multiple initial state perturbation experiment, was found to be generally low, but with a noisier response to initial state errors in high latitudes than in the tropics.

  19. Notes on modeling and simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Redondo, Antonio

    These notes present a high-level overview of how modeling and simulation are carried out by practitioners. The discussion is of a general nature; no specific techniques are examined but the activities associated with all modeling and simulation approaches are briefly addressed. There is also a discussion of validation and verification and, at the end, a section on why modeling and simulation are useful.

  20. Mission Simulation Facility: Simulation Support for Autonomy Development

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Plice, Laura; Neukom, Christian; Flueckiger, Lorenzo; Wagner, Michael

    2003-01-01

The Mission Simulation Facility (MSF) supports research in autonomy technology for planetary exploration vehicles. Using HLA (High Level Architecture) across distributed computers, the MSF connects users' autonomy algorithms with provided or third-party simulations of robotic vehicles and planetary surface environments, including onboard components and scientific instruments. Simulation fidelity is variable to meet changing needs as autonomy technology advances in Technology Readiness Level (TRL). A virtual robot operating in a virtual environment offers numerous advantages over actual hardware, including availability, simplicity, and risk mitigation. The MSF is in use by researchers at NASA Ames Research Center (ARC) and has demonstrated basic functionality. Continuing work will support the needs of a broader user base.

  1. Terascale High-Fidelity Simulations of Turbulent Combustion with Detailed Chemistry: Spray Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rutland, Christopher J.

    2009-04-26

The Terascale High-Fidelity Simulations of Turbulent Combustion (TSTC) project is a multi-university collaborative effort to develop a high-fidelity turbulent reacting flow simulation capability utilizing terascale, massively parallel computer technology. The main paradigm of the approach is direct numerical simulation (DNS) featuring the highest temporal and spatial accuracy, allowing quantitative observations of the fine-scale physics found in turbulent reacting flows as well as providing a useful tool for development of sub-models needed in device-level simulations. Under this component of the TSTC program the simulation code named S3D, developed and shared with coworkers at Sandia National Laboratories, has been enhanced with new numerical algorithms and physical models to provide predictive capabilities for turbulent liquid fuel spray dynamics. Major accomplishments include improved fundamental understanding of mixing and auto-ignition in multi-phase turbulent reactant mixtures and turbulent fuel injection spray jets.

  2. Triple Value Simulation Model Fact Sheet

    EPA Pesticide Factsheets

    The Triple Value Simulation (3VS) is a high-level model that accounts for the complex relationships among economic, social and environmental systems in order to explore scenarios and solutions to improve the health of the Bay.

  3. Learning Density in Vanuatu High School with Computer Simulation: Influence of Different Levels of Guidance

    ERIC Educational Resources Information Center

    Moli, Lemuel; Delserieys, Alice Pedregosa; Impedovo, Maria Antonietta; Castera, Jeremy

    2017-01-01

    This paper presents a study on discovery learning of scientific concepts with the support of computer simulation. In particular, the paper will focus on the effect of the levels of guidance on students with a low degree of experience in informatics and educational technology. The first stage of this study was to identify the common misconceptions…

  4. Simulator technology as a tool for education in cardiac care.

    PubMed

    Hravnak, Marilyn; Beach, Michael; Tuite, Patricia

    2007-01-01

    Assisting nurses in gaining the cognitive and psychomotor skills necessary to safely and effectively care for patients with cardiovascular disease can be challenging for educators. Ideally, nurses would have the opportunity to synthesize and practice these skills in a protected training environment before application in the dynamic clinical setting. Recently, a technology known as high fidelity human simulation was introduced, which permits learners to interact with a simulated patient. The dynamic physiologic parameters and physical assessment capabilities of the simulated patient provide for a realistic learning environment. This article describes the High Fidelity Human Simulation Laboratory at the University of Pittsburgh School of Nursing and presents strategies for using this technology as a tool in teaching complex cardiac nursing care at the basic and advanced practice nursing levels. The advantages and disadvantages of high fidelity human simulation in learning are discussed.

  5. High-Fidelity Three-Dimensional Simulation of the GE90

    NASA Technical Reports Server (NTRS)

Turner, Mark G.; Norris, Andrew; Veres, Joseph P.

    2004-01-01

A full-engine simulation of the three-dimensional flow in the GE90 94B high-bypass ratio turbofan engine has been achieved. Through the exploitation of parallel processing, the simulation would take less than 11 hours of wall clock time if started from scratch. The simulation of the compressor components, the cooled high-pressure turbine, and the low-pressure turbine was performed using the APNASA turbomachinery flow code. The combustor flow and chemistry were simulated using the National Combustor Code (NCC). The engine simulation matches the engine thermodynamic cycle for a sea-level takeoff condition. The simulation is started at the inlet of the fan and progresses downstream. Comparisons with the cycle point are presented. A detailed look at the blockage in the turbomachinery is presented as one measure to assess and view the solution and the multistage interaction effects.

  6. SimSup's Loop: A Control Theory Approach to Spacecraft Operator Training

    NASA Technical Reports Server (NTRS)

    Owens, Brandon Dewain; Crocker, Alan R.

    2015-01-01

    Immersive simulation is a staple of training for many complex system operators, including astronauts and ground operators of spacecraft. However, while much has been written about simulators, simulation facilities, and operator certification programs, the topic of how one develops simulation scenarios to train a spacecraft operator is relatively understated in the literature. In this paper, an approach is presented for using control theory as the basis for developing the immersive simulation scenarios for a spacecraft operator training program. The operator is effectively modeled as a high level controller of lower level hardware and software control loops that affect a select set of system state variables. Simulation scenarios are derived from a STAMP-based hazard analysis of the operator's high and low level control loops. The immersive simulation aspect of the overall training program is characterized by selecting a set of scenarios that expose the operator to the various inadequate control actions that stem from control flaws and inadequate control executions in the different sections of the typical control loop. Results from the application of this approach to the Lunar Atmosphere and Dust Environment Explorer (LADEE) mission are provided through an analysis of the simulation scenarios used for operator training and the actual anomalies that occurred during the mission. The simulation scenarios and inflight anomalies are mapped to specific control flaws and inadequate control executions in the different sections of the typical control loop to illustrate the characteristics of anomalies arising from the different sections of the typical control loop (and why it is important for operators to have exposure to these characteristics). Additionally, similarities between the simulation scenarios and inflight anomalies are highlighted to make the case that the simulation scenarios prepared the operators for the mission.

  7. Response of Flight Nurses in a Simulated Helicopter Environment.

    PubMed

    Kaniecki, David M; Hickman, Ronald L; Alfes, Celeste M; Reimer, Andrew P

    The purpose of this study was to determine if a helicopter flight simulator could provide a useful educational platform by creating experiences similar to those encountered by actual flight nurses. Flight nurse (FN) and non-FN participants completed a simulated emergency scenario in a flight simulator. Physiologic and psychological stress during the simulation was measured using heart rate and perceived stress scores. A questionnaire was then administered to assess the realism of the flight simulator. Subjects reported that the overall experience in the flight simulator was comparable with a real helicopter. Sounds, communications, vibrations, and movements in the simulator most approximated those of a real-life helicopter environment. Perceived stress levels of all participants increased significantly from 27 (on a 0-100 scale) before simulation to 51 at the peak of the simulation and declined thereafter to 28 (P < .001). Perceived stress levels of FNs increased significantly from 25 before simulation to 54 at the peak of the simulation and declined thereafter to 30 (P < .001). Perceived stress levels of non-FNs increased significantly from 31 before simulation to 49 at the peak of the simulation and declined thereafter to 25 (P < .001). There were no significant differences in perceived stress levels between FNs and non-FNs before (P = .58), during (P = .63), or after (P = .55) simulation. FNs' heart rates increased significantly from 77 before simulation to 100 at the peak of the simulation and declined thereafter to 72 (P < .001). The results of this study suggest that simulation of a critical care scenario in a high-fidelity helicopter flight simulator can provide a realistic helicopter transport experience and create physiologic and psychological stress for participants. Copyright © 2017 Air Medical Journal Associates. Published by Elsevier Inc. All rights reserved.

  8. MATLAB/Simulink Pulse-Echo Ultrasound System Simulator Based on Experimentally Validated Models.

    PubMed

    Kim, Taehoon; Shin, Sangmin; Lee, Hyongmin; Lee, Hyunsook; Kim, Heewon; Shin, Eunhee; Kim, Suhwan

    2016-02-01

    A flexible clinical ultrasound system must operate with different transducers, which have characteristic impulse responses and widely varying impedances. The impulse response determines the shape of the high-voltage pulse that is transmitted and the specifications of the front-end electronics that receive the echo; the impedance determines the specification of the matching network through which the transducer is connected. System-level optimization of these subsystems requires accurate modeling of pulse-echo (two-way) response, which in turn demands a unified simulation of the ultrasonics and electronics. In this paper, this is realized by combining MATLAB/Simulink models of the high-voltage transmitter, the transmission interface, the acoustic subsystem which includes wave propagation and reflection, the receiving interface, and the front-end receiver. To demonstrate the effectiveness of our simulator, the models are experimentally validated by comparing the simulation results with the measured data from a commercial ultrasound system. This simulator could be used to quickly provide system-level feedback for an optimized tuning of electronic design parameters.
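
    A minimal sketch of the linear pulse-echo idea described above (not the experimentally validated MATLAB/Simulink models of the paper, and written in Python rather than MATLAB): the received echo is the excitation convolved with the transducer impulse response on transmit and again on receive, and with a train of reflectivity spikes representing scatterers. The sampling rate, center frequency, and scatterer positions are assumed values.

```python
# Toy linear pulse-echo sketch (not the validated Simulink models described
# above): received echo = excitation * h * h * reflectivity, with the assumed
# transducer impulse response h applied on both transmit and receive.
import numpy as np

fs = 50e6                                    # sampling rate, 50 MHz (assumed)
t = np.arange(0, 2e-6, 1 / fs)               # 2 microseconds of support for the pulse

# Assumed 5 MHz Gaussian-windowed transducer impulse response.
f0, sigma = 5e6, 0.2e-6
h = np.exp(-((t - 0.5e-6) ** 2) / (2 * sigma ** 2)) * np.sin(2 * np.pi * f0 * t)

excitation = np.zeros_like(t)
excitation[0] = 1.0                          # idealized impulsive high-voltage pulse

# Two point scatterers at different depths, encoded as delayed reflectivity spikes.
reflectivity = np.zeros(4000)
reflectivity[1000] = 1.0
reflectivity[2500] = 0.5

echo = np.convolve(np.convolve(np.convolve(excitation, h), h), reflectivity)
print("received-echo length:", echo.size, "peak amplitude:", float(np.abs(echo).max()))
```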

  9. High-fidelity nursing simulation: impact on student self-confidence and clinical competence.

    PubMed

    Blum, Cynthia A; Borglund, Susan; Parcells, Dax

    2010-01-01

Development of safe nursing practice in entry-level nursing students requires special consideration from nurse educators. The paucity of data supporting high-fidelity patient simulation effectiveness in this population informed the development of a quasi-experimental, quantitative study of the relationship between simulation and student self-confidence and clinical competence. Moreover, the study reports a novel approach to measuring self-confidence and competence of entry-level nursing students. Fifty-three baccalaureate students, enrolled in either a traditional or simulation-enhanced laboratory, participated during their first clinical rotation. Student self-confidence and faculty perception of student clinical competence were measured using selected scale items of the Lasater Clinical Judgment Rubric. The results indicated an overall improvement in self-confidence and competence across the semester; however, simulation did not significantly enhance these caring attributes. The study highlights the need for further examination of teaching strategies developed to promote the transfer of self-confidence and competence from the laboratory to the clinical setting.

  10. An Investigation of the Impact of Aerodynamic Model Fidelity on Close-In Combat Effectiveness Prediction in Piloted Simulation

    NASA Technical Reports Server (NTRS)

    Persing, T. Ray; Bellish, Christine A.; Brandon, Jay; Kenney, P. Sean; Carzoo, Susan; Buttrill, Catherine; Guenther, Arlene

    2005-01-01

    Several aircraft airframe modeling approaches are currently being used in the DoD community for acquisition, threat evaluation, training, and other purposes. To date there has been no clear empirical study of the impact of airframe simulation fidelity on piloted real-time aircraft simulation study results, or when use of a particular level of fidelity is indicated. This paper documents a series of piloted simulation studies using three different levels of airframe model fidelity. This study was conducted using the NASA Langley Differential Maneuvering Simulator. Evaluations were conducted with three pilots for scenarios requiring extensive maneuvering of the airplanes during air combat. In many cases, a low-fidelity modified point-mass model may be sufficient to evaluate the combat effectiveness of the aircraft. However, in cases where high angle-of-attack flying qualities and aerodynamic performance are a factor or when precision tracking ability of the aircraft must be represented, use of high-fidelity models is indicated.

  11. Improving sea level simulation in Mediterranean regional climate models

    NASA Astrophysics Data System (ADS)

    Adloff, Fanny; Jordà, Gabriel; Somot, Samuel; Sevault, Florence; Arsouze, Thomas; Meyssignac, Benoit; Li, Laurent; Planton, Serge

    2017-08-01

The question of future sea level change in the Mediterranean remains a challenge. Previous climate modelling attempts to estimate future sea level change in the Mediterranean have not reached a consensus. The low resolution of CMIP-type models prevents an accurate representation of important small-scale processes acting over the Mediterranean region. For this reason among others, the use of high resolution regional ocean modelling has been recommended in the literature to address the question of ongoing and future Mediterranean sea level change in response to climate change or greenhouse gas emissions. Also, it has been shown that east Atlantic sea level variability is the dominant driver of the Mediterranean variability at interannual and interdecadal scales. However, up to now, long-term regional simulations of the Mediterranean Sea do not integrate the full sea level information from the Atlantic, which is a substantial shortcoming when analysing Mediterranean sea level response. In the present study we analyse different approaches followed by state-of-the-art regional climate models to simulate Mediterranean sea level variability. Additionally we present a new simulation which incorporates improved information of Atlantic sea level forcing at the lateral boundary. We evaluate the skills of the different simulations in the frame of long-term hindcast simulations spanning from 1980 to 2012, analysing sea level variability from seasonal to multidecadal scales. Results from the new simulation show a substantial improvement in the modelled Mediterranean sea level signal. This confirms that Mediterranean mean sea level is strongly influenced by the Atlantic conditions, and thus suggests that the quality of the information in the lateral boundary conditions (LBCs) is crucial for good modelling of Mediterranean sea level. We also found that the regional differences inside the basin, which are induced by circulation changes, are model-dependent and thus not affected by the LBCs. Finally, we argue that a correct configuration of LBCs in the Atlantic should be used for future Mediterranean simulations, both for hindcast periods and for scenarios.

  12. Wang-Landau sampling: Saving CPU time

    NASA Astrophysics Data System (ADS)

    Ferreira, L. S.; Jorge, L. N.; Leão, S. A.; Caparica, A. A.

    2018-04-01

In this work we propose an improvement to the Wang-Landau (WL) method that allows a saving in CPU time of about 60% while leading to the same results with the same accuracy. We used the 2D Ising model to show that one can initiate all WL simulations using the outputs of an advanced WL level from a previous simulation. We showed that up to the seventh WL level (f6) the simulations are not yet biased and can proceed to any value that a simulation started from the very beginning would reach. As a result, the initial WL levels can be simulated just once. It was also observed that the saving in CPU time is larger for larger lattice sizes, exactly where the computational cost is considerable. We carried out one set of high-resolution simulations beginning from the first WL level (f0) and another beginning from the eighth WL level (f7), using all the data from the end of the previous level, and showed that the results for the critical temperature Tc and the critical static exponents β and γ coincide within the error bars. Finally, we applied the same procedure to the 1/2-spin Baxter-Wu model, and the saving in CPU time was about 64%.
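
    A minimal Wang-Landau sketch of the restart idea described above (a toy 4x4 Ising implementation, not the authors' production code): the density of states accumulated through an early level can be saved and handed to a later run so that the initial levels are simulated only once. The sweep counts and flatness threshold are assumed values.

```python
# Minimal Wang-Landau sketch of the restart idea: the running estimate of the
# density of states ln g(E) from an early WL level is saved and reused so that
# a later, high-resolution run can skip the initial levels. Toy 4x4 Ising
# model with assumed sweep counts; not the authors' production code.
import math
import random

L = 4
N = L * L

def wang_landau(log_f_start, log_f_stop, log_g=None, flatness=0.8):
    spins = [[random.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
    e = sum(-spins[i][j] * (spins[(i + 1) % L][j] + spins[i][(j + 1) % L])
            for i in range(L) for j in range(L))
    log_g = dict(log_g) if log_g else {}          # continue from a checkpoint if given
    log_f = log_f_start
    while log_f > log_f_stop:
        hist = {}
        while True:
            for _ in range(1000 * N):             # batch of moves between flatness checks
                i, j = random.randrange(L), random.randrange(L)
                s = spins[i][j]
                de = 2 * s * (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
                delta = log_g.get(e, 0.0) - log_g.get(e + de, 0.0)
                if delta >= 0 or random.random() < math.exp(delta):
                    spins[i][j] = -s
                    e += de
                log_g[e] = log_g.get(e, 0.0) + log_f
                hist[e] = hist.get(e, 0) + 1
            if min(hist.values()) > flatness * sum(hist.values()) / len(hist):
                break                              # histogram flat enough: next WL level
        log_f /= 2.0                               # f -> sqrt(f), i.e. ln f is halved
    return log_g

if __name__ == "__main__":
    random.seed(1)
    log_f0 = 1.0                                   # ln f at the first level (f0)
    shared = wang_landau(log_f0, log_f0 / 2 ** 7)  # run levels f0..f6 once
    refined = wang_landau(log_f0 / 2 ** 7, log_f0 / 2 ** 10, log_g=shared)  # restart at f7
    print("energy bins visited:", len(refined))
```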

  13. Simulating Terrorism: Credible Commitment, Costly Signaling, and Strategic Behavior

    ERIC Educational Resources Information Center

    Siegel, David A.; Young, Joseph K.

    2009-01-01

    We present two simulations designed to convey the strategic nature of terrorism and counterterrorism. The first is a simulated hostage crisis, designed primarily to illustrate the concepts of credible commitment and costly signaling. The second explores high-level decision making of both a terrorist group and the state, and is designed to…

  14. Using Chemistry Simulations: Attention Capture, Selective Amnesia and Inattentional Blindness

    ERIC Educational Resources Information Center

    Rodrigues, Susan

    2011-01-01

A convenience sample of twenty-one student volunteers aged between 14 and 15 years worked in pairs (and one group of three) with two randomly allocated high quality conceptual (molecular level) and operational (mimicking wet labs) simulations. The volunteers were told they had five minutes to play, repeat, review, restart or stop the simulation, which in…

  15. Simulation-based modeling of building complexes construction management

    NASA Astrophysics Data System (ADS)

    Shepelev, Aleksandr; Severova, Galina; Potashova, Irina

    2018-03-01

    The study reported here examines the experience in the development and implementation of business simulation games based on network planning and management of high-rise construction. Appropriate network models of different types and levels of detail have been developed; a simulation model including 51 blocks (11 stages combined in 4 units) is proposed.

  16. The German VR Simulation Realism Scale--psychometric construction for virtual reality applications with virtual humans.

    PubMed

    Poeschl, Sandra; Doering, Nicola

    2013-01-01

Virtual training applications with high levels of immersion or fidelity (for example for social phobia treatment) produce high levels of presence and are therefore among the most successful Virtual Reality developments. Whereas display and interaction fidelity (as sub-dimensions of immersion) and their influence on presence are well researched, realism of the displayed simulation depends on the specific application and is therefore difficult to measure. We propose to measure simulation realism by using a self-report questionnaire. The German VR Simulation Realism Scale for VR training applications was developed based on a translation of scene realism items from the Witmer-Singer Presence Questionnaire. Items for realism of virtual humans (for example for social phobia training applications) were supplemented. A sample of N = 151 students rated simulation realism of a Fear of Public Speaking application. Four factors were derived by item- and principal component analysis (Varimax rotation), representing Scene Realism, Audience Behavior, Audience Appearance and Sound Realism. The scale developed can be used as a starting point for future research and measurement of simulation realism for applications including virtual humans.

  17. The CD4+ T cell regulatory network mediates inflammatory responses during acute hyperinsulinemia: a simulation study.

    PubMed

    Martinez-Sanchez, Mariana E; Hiriart, Marcia; Alvarez-Buylla, Elena R

    2017-06-26

    Obesity is frequently linked to insulin resistance, high insulin levels, chronic inflammation, and alterations in the behaviour of CD4+ T cells. Despite the biomedical importance of this condition, the system-level mechanisms that alter CD4+ T cell differentiation and plasticity are not well understood. We model how hyperinsulinemia alters the dynamics of the CD4+ T regulatory network, and this, in turn, modulates cell differentiation and plasticity. Different polarizing microenvironments are simulated under basal and high levels of insulin to assess impacts on cell-fate attainment and robustness in response to transient perturbations. In the presence of high levels of insulin Th1 and Th17 become more stable to transient perturbations, and their basin sizes are augmented, Tr1 cells become less stable or disappear, while TGFβ producing cells remain unaltered. Hence, the model provides a dynamic system-level framework and explanation to further understand the documented and apparently paradoxical role of TGFβ in both inflammation and regulation of immune responses, as well as the emergence of the adipose Treg phenotype. Furthermore, our simulations provide new predictions on the impact of the microenvironment in the coexistence of the different cell types, suggesting that in pro-Th1, pro-Th2 and pro-Th17 environments effector and regulatory cells can coexist, but that high levels of insulin severely diminish regulatory cells, especially in a pro-Th17 environment. This work provides a first step towards a system-level formal and dynamic framework to integrate further experimental data in the study of complex inflammatory diseases.
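
    The attractor-and-basin analysis described above can be illustrated with a toy synchronous Boolean network (three hypothetical nodes, not the published CD4+ T cell network): clamping an input node that stands in for a high insulin signal changes which attractors exist and how large their basins are.

```python
# Toy synchronous Boolean network: enumerate attractors and basin sizes, and
# observe how clamping an input node (a stand-in for "high insulin") reshapes
# them. The three-node network is hypothetical, not the published CD4+ model.
from itertools import product

def step(state, insulin):
    a, b, c = state
    # Hypothetical rules: A is on when C is off; B is driven by A or insulin;
    # C needs B and is shut off by insulin.
    return (int(not c), int(a or insulin), int(b and not insulin))

def attractor_of(state, insulin):
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state, insulin)
    return tuple(sorted(seen[seen.index(state):]))   # canonical form of the cycle

def basin_sizes(insulin):
    basins = {}
    for state in product((0, 1), repeat=3):
        att = attractor_of(state, insulin)
        basins[att] = basins.get(att, 0) + 1
    return basins

if __name__ == "__main__":
    for insulin, label in ((0, "basal insulin"), (1, "high insulin")):
        print(label, basin_sizes(insulin))
```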

  18. Overview of High-Fidelity Modeling Activities in the Numerical Propulsion System Simulations (NPSS) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2002-01-01

    A high-fidelity simulation of a commercial turbofan engine has been created as part of the Numerical Propulsion System Simulation Project. The high-fidelity computer simulation utilizes computer models that were developed at NASA Glenn Research Center in cooperation with turbofan engine manufacturers. The average-passage (APNASA) Navier-Stokes based viscous flow computer code is used to simulate the 3D flow in the compressors and turbines of the advanced commercial turbofan engine. The 3D National Combustion Code (NCC) is used to simulate the flow and chemistry in the advanced aircraft combustor. The APNASA turbomachinery code and the NCC combustor code exchange boundary conditions at the interface planes at the combustor inlet and exit. This computer simulation technique can evaluate engine performance at steady operating conditions. The 3D flow models provide detailed knowledge of the airflow within the fan and compressor, the high and low pressure turbines, and the flow and chemistry within the combustor. The models simulate the performance of the engine at operating conditions that include sea level takeoff and the altitude cruise condition.

  19. A meta-analysis of outcomes from the use of computer-simulated experiments in science education

    NASA Astrophysics Data System (ADS)

    Lejeune, John Van

    The purpose of this study was to synthesize the findings from existing research on the effects of computer simulated experiments on students in science education. Results from 40 reports were integrated by the process of meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulations on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement of students who use computer-simulated experiments and interactive videodisc simulations as compared to students who used more traditional learning activities. No significant differences in retention, student attitudes toward the subject, or toward the educational method were found. Based on the findings of this study, computer-simulated experiments and interactive videodisc simulations should be used to enhance students' learning in science, especially in cases where the use of traditional laboratory activities are expensive, dangerous, or impractical.
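
    For readers unfamiliar with the pooling step of a meta-analysis, a generic fixed-effect inverse-variance sketch is shown below; the effect sizes and variances are made-up placeholders, not values from the 40 reports synthesized in this study.

```python
# Illustrative fixed-effect meta-analysis of standardized mean differences
# using inverse-variance weighting. The effect sizes and variances below are
# made-up placeholders, not the data from the 40 reports.
import math

effects   = [0.42, 0.15, 0.60, 0.33]          # per-study standardized mean differences
variances = [0.04, 0.09, 0.05, 0.02]          # per-study sampling variances

weights = [1.0 / v for v in variances]        # inverse-variance weights
pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))
ci_low, ci_high = pooled - 1.96 * se, pooled + 1.96 * se

print(f"pooled effect = {pooled:.3f}, 95% CI [{ci_low:.3f}, {ci_high:.3f}]")
```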

  20. morphforge: a toolbox for simulating small networks of biologically detailed neurons in Python

    PubMed Central

    Hull, Michael J.; Willshaw, David J.

    2014-01-01

    The broad structure of a modeling study can often be explained over a cup of coffee, but converting this high-level conceptual idea into graphs of the final simulation results may require many weeks of sitting at a computer. Although models themselves can be complex, often many mental resources are wasted working around complexities of the software ecosystem such as fighting to manage files, interfacing between tools and data formats, finding mistakes in code or working out the units of variables. morphforge is a high-level, Python toolbox for building and managing simulations of small populations of multicompartmental biophysical model neurons. An entire in silico experiment, including the definition of neuronal morphologies, channel descriptions, stimuli, visualization and analysis of results can be written within a single short Python script using high-level objects. Multiple independent simulations can be created and run from a single script, allowing parameter spaces to be investigated. Consideration has been given to the reuse of both algorithmic and parameterizable components to allow both specific and stochastic parameter variations. Some other features of the toolbox include: the automatic generation of human-readable documentation (e.g., PDF files) about a simulation; the transparent handling of different biophysical units; a novel mechanism for plotting simulation results based on a system of tags; and an architecture that supports both the use of established formats for defining channels and synapses (e.g., MODL files), and the possibility to support other libraries and standards easily. We hope that this toolbox will allow scientists to quickly build simulations of multicompartmental model neurons for research and serve as a platform for further tool development. PMID:24478690

  1. Description of waste pretreatment and interfacing systems dynamic simulation model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garbrick, D.J.; Zimmerman, B.D.

    1995-05-01

The Waste Pretreatment and Interfacing Systems Dynamic Simulation Model was created to investigate the required pretreatment facility processing rates for both high level and low level waste so that the vitrification of tank waste can be completed according to the milestones defined in the Tri-Party Agreement (TPA). In order to achieve this objective, the processes upstream and downstream of the pretreatment facilities must also be included. The simulation model starts with retrieval of tank waste and ends with vitrification for both low level and high level wastes. This report describes the results of three simulation cases: one based on suggested average facility processing rates, one with facility rates determined so that approximately 6 new DSTs are required, and one with facility rates determined so that approximately no new DSTs are required. It appears, based on the simulation results, that reasonable facility processing rates can be selected so that no new DSTs are required by the TWRS program. However, this conclusion must be viewed with respect to the modeling assumptions, described in detail in the report. Also included in the report, in an appendix, are results of two sensitivity cases: one with glass plant water recycle streams recycled versus not recycled, and one employing the TPA SST retrieval schedule versus a more uniform SST retrieval schedule. Both recycling and retrieval schedule appear to have a significant impact on overall tank usage.

  2. High fidelity simulation based team training in urology: a preliminary interdisciplinary study of technical and nontechnical skills in laparoscopic complications management.

    PubMed

    Lee, Jason Y; Mucksavage, Phillip; Canales, Cecilia; McDougall, Elspeth M; Lin, Sharon

    2012-04-01

    Simulation based team training provides an opportunity to develop interdisciplinary communication skills and address potential medical errors in a high fidelity, low stakes environment. We evaluated the implementation of a novel simulation based team training scenario and assessed the technical and nontechnical performance of urology and anesthesiology residents. Urology residents were randomly paired with anesthesiology residents to participate in a simulation based team training scenario involving the management of 2 scripted critical events during laparoscopic radical nephrectomy, including the vasovagal response to pneumoperitoneum and renal vein injury during hilar dissection. A novel kidney surgical model and a high fidelity mannequin simulator were used for the simulation. A debriefing session followed each simulation based team training scenario. Assessments of technical and nontechnical performance were made using task specific checklists and global rating scales. A total of 16 residents participated, of whom 94% rated the simulation based team training scenario as useful for communication skill training. Also, 88% of urology residents believed that the kidney surgical model was useful for technical skill training. Urology resident training level correlated with technical performance (p=0.004) and blood loss during renal vein injury management (p=0.022) but not with nontechnical performance. Anesthesia resident training level correlated with nontechnical performance (p=0.036). Urology residents consistently rated themselves higher on nontechnical performance than did faculty (p=0.033). Anesthesia residents did not differ in the self-assessment of nontechnical performance compared to faculty assessments. Residents rated the simulation based team training scenario as useful for interdisciplinary communication skill training. Urology resident training level correlated with technical performance but not with nontechnical performance. Urology residents consistently overestimated their nontechnical performance. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  3. Predicting absorption and dispersion in acoustics by direct simulation Monte Carlo: Quantum and classical models for molecular relaxation.

    PubMed

    Hanford, Amanda D; O'Connor, Patrick D; Anderson, James B; Long, Lyle N

    2008-06-01

    In the current study, real gas effects in the propagation of sound waves are simulated using the direct simulation Monte Carlo method for a wide range of frequencies. This particle method allows for treatment of acoustic phenomena at high Knudsen numbers, corresponding to low densities and a high ratio of the molecular mean free path to wavelength. Different methods to model the internal degrees of freedom of diatomic molecules and the exchange of translational, rotational and vibrational energies in collisions are employed in the current simulations of a diatomic gas. One of these methods is the fully classical rigid-rotor/harmonic-oscillator model for rotation and vibration. A second method takes into account the discrete quantum energy levels for vibration with the closely spaced rotational levels classically treated. This method gives a more realistic representation of the internal structure of diatomic and polyatomic molecules. Applications of these methods are investigated in diatomic nitrogen gas in order to study the propagation of sound and its attenuation and dispersion along with their dependence on temperature. With the direct simulation method, significant deviations from continuum predictions are also observed for high Knudsen number flows.
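
    The two vibrational-energy treatments contrasted above can be sketched as follows (a toy comparison, not the production DSMC collision routines): discrete harmonic-oscillator levels sampled from a Boltzmann distribution versus the fully classical harmonic-oscillator average. The characteristic vibrational temperature of N2 used here is approximate.

```python
# Sketch contrasting the two vibrational-energy treatments mentioned above:
# (a) discrete quantum harmonic-oscillator levels sampled from a Boltzmann
#     distribution, and (b) the fully classical harmonic-oscillator average.
# Constants are approximate; this is not the production DSMC code.
import math
import random

K_B = 1.380649e-23          # Boltzmann constant, J/K
THETA_VIB_N2 = 3371.0       # characteristic vibrational temperature of N2, K (approx.)

def sample_quantum_level(T):
    """Sample a vibrational quantum number v with probability (1 - x) * x**v."""
    x = math.exp(-THETA_VIB_N2 / T)
    r = random.random() / (1.0 - x)          # uniform draw scaled by the partition sum
    v, cumulative = 0, 1.0                   # cumulative = sum of x**j for j <= v
    while r > cumulative:
        v += 1
        cumulative += x ** v
    return v

def mean_quantum_energy(T, samples=200_000):
    """Monte Carlo estimate of the mean vibrational energy above the ground state."""
    total = sum(sample_quantum_level(T) for _ in range(samples))
    return (total / samples) * K_B * THETA_VIB_N2   # E_v = v * k * theta above ground

def classical_energy(T):
    """Classical harmonic oscillator: kT (equipartition over two quadratic terms)."""
    return K_B * T

if __name__ == "__main__":
    random.seed(0)
    for T in (300.0, 1000.0, 3000.0):
        print(f"T = {T:6.0f} K  quantum ~ {mean_quantum_energy(T):.3e} J  "
              f"classical = {classical_energy(T):.3e} J")
```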

  4. Level of Automation and Failure Frequency Effects on Simulated Lunar Lander Performance

    NASA Technical Reports Server (NTRS)

    Marquez, Jessica J.; Ramirez, Margarita

    2014-01-01

A human-in-the-loop experiment was conducted at the NASA Ames Research Center Vertical Motion Simulator, where instrument-rated pilots completed a simulated terminal descent phase of a lunar landing. Ten pilots participated in a 2 x 2 mixed design experiment, with level of automation as the within-subjects factor and failure frequency as the between-subjects factor. The two evaluated levels of automation were high (fully automated landing) and low (manually controlled landing). During test trials, participants were exposed to either a high number of failures (75% failure frequency) or low number of failures (25% failure frequency). In order to investigate the pilots' sensitivity to changes in levels of automation and failure frequency, the dependent measure selected for this experiment was accuracy of failure diagnosis, from which D Prime and Decision Criterion were derived. For each of the dependent measures, no significant difference was found for level of automation and no significant interaction was detected between level of automation and failure frequency. A significant effect was identified for failure frequency, suggesting that failure frequency affects pilots' sensitivity to failure detection and diagnosis. Participants were more likely to correctly identify and diagnose failures if they experienced the higher levels of failures, regardless of level of automation.
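
    The signal-detection quantities named above (D Prime and Decision Criterion) follow from hit and false-alarm rates in the standard way; the sketch below uses placeholder rates, not the study's data.

```python
# Standard signal-detection computation of D Prime and Decision Criterion from
# hit and false-alarm rates; the rates below are placeholders, not study data.
from statistics import NormalDist

def d_prime_and_criterion(hit_rate, false_alarm_rate):
    z = NormalDist().inv_cdf
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, criterion

if __name__ == "__main__":
    dp, c = d_prime_and_criterion(hit_rate=0.85, false_alarm_rate=0.20)
    print(f"d' = {dp:.2f}, decision criterion c = {c:.2f}")
```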

  5. Projected changes in extreme precipitation over Scotland and Northern England using a high-resolution regional climate model

    NASA Astrophysics Data System (ADS)

    Chan, Steven C.; Kahana, Ron; Kendon, Elizabeth J.; Fowler, Hayley J.

    2018-03-01

The UK Met Office has previously conducted convection-permitting climate simulations over the southern UK (Kendon et al. in Nat Clim Change 4:570-576, 2014). The southern UK simulations have been followed up by a new set of northern UK simulations using the same model configuration. Here we present the mean and extreme precipitation projections from these new simulations. Relative to the southern UK, the northern UK projections show a greater summertime increase of return levels and extreme precipitation intensity in both 1.5 km convection-permitting and 12 km convection-parameterised simulations, but this increase is against a backdrop of large decreases in summertime mean precipitation and precipitation frequency. Similar to the southern UK, projected change is model resolution dependent and the convection-permitting simulation projects a larger intensification. For winter, return level increases are somewhat lower than for the southern UK. Analysis of model biases highlights challenges in simulating the diurnal cycle over high terrain, sensitivity to domain size and driving-GCM biases, and quality issues of radar precipitation observations, which are relevant to the wider regional climate modelling community.

  6. Characterization of extreme sea level at the European coast

    NASA Astrophysics Data System (ADS)

    Elizalde, Alberto; Jorda, Gabriel; Mathis, Moritz; Mikolajewicz, Uwe

    2015-04-01

    Extreme high sea levels arise from a combination of storm surges and particularly high tide events. Future climate simulations project not only changes in the atmospheric circulation, which induce changes in wind conditions, but also an increase in global mean sea level through thermal expansion and ice melting. Such changes increase the risk of coastal flooding, which represents a possible hazard for human activities. It is therefore important to investigate the patterns of sea level variability and long-term trends in coastal areas. In order to further analyze extreme sea level events at the European coast in future climate projections, a new setup of the global ocean model MPIOM coupled with the regional atmosphere model REMO has been prepared. The MPIOM irregular grid has enhanced resolution in the European region to resolve the North and Mediterranean Seas (up to 11 x 11 km in the North Sea). The ocean model also includes the full luni-solar ephemeridic tidal potential for tide simulation. To simulate air-sea interaction, the regional atmospheric model REMO is interactively coupled to the ocean model over Europe; this region corresponds to the EuroCORDEX domain with a 50 x 50 km resolution. Besides the standard fluxes of heat, mass (freshwater), momentum and turbulent energy input, the ocean model is also forced with sea level pressure in order to capture the full variation of sea level. The hydrological budget within the study domain is closed using a hydrological discharge model. With this model, simulations for present climate and future climate scenarios are carried out to study transient changes in sea level and extreme events. As a first step, two simulations (coupled and uncoupled ocean) driven by reanalysis data (ERA40) have been conducted. They are used as reference runs to evaluate the climate projection simulations. For selected coastal locations, sea level time series are separated into their different components: tides, short-term atmospheric influence (1-30 days), seasonal cycle and interannual variability. Each sea level component is statistically compared with data from local tide gauges.

  7. Intuitive Expertise and Empowerment: The Long-Term Impact of Simulation Training on Changing Accountabilities in a Biotech Firm

    ERIC Educational Resources Information Center

    DiBello, Lia; Missildine, Whit; Struttman, Marie

    2009-01-01

    This paper describes a two-year study in which high levels of performance were achieved and sustained among so-called low-level workers in a biotech company. The purposes of the study--funded by the National Science Foundation and Invitrogen Corporation--were to explore the effectiveness of an accelerated learning Operational Simulation (OpSim)…

  8. Efficacy of Surgical Simulation Training in a Low-Income Country.

    PubMed

    Tansley, Gavin; Bailey, Jonathan G; Gu, Yuqi; Murray, Michelle; Livingston, Patricia; Georges, Ntakiyiruta; Hoogerboord, Marius

    2016-11-01

    Simulation training has evolved as an important component of postgraduate surgical education and has been shown to be effective in teaching procedural skills. Despite potential benefits to low- and middle-income countries (LMIC), simulation training is predominantly used in high-income settings. This study evaluates the effectiveness of simulation training in one LMIC (Rwanda). Twenty-six postgraduate surgical trainees at the University of Rwanda (Kigali, Rwanda) and Dalhousie University (Halifax, Canada) participated in the study. Participants attended one 3-hour simulation session using a high-fidelity, tissue-based model simulating the creation of an end ileostomy. Each participant was anonymously recorded completing the assigned task at three time points: prior to, immediately following, and 90 days following the simulation training. A single blinded expert reviewer assessed the performance using the Objective Structured Assessment of Technical Skill (OSATS) instrument. The mean OSATS score improvement for participants who completed all the assessments was 6.1 points [95% Confidence Interval (CI) 2.2-9.9, p = 0.005]. Improvement was sustained over a 90-day period with a mean improvement of 4.1 points between the first and third attempts (95% CI 0.3-7.9, p = 0.038). Simulation training was effective in both study sites, though most gains occurred with junior-level learners, with a mean improvement of 8.3 points (95% CI 5.1-11.6, p < 0.001). Significant improvements were not identified for senior-level learners. This study supports the benefit for simulation in surgical training in LMICs. Skill improvements were limited to junior-level trainees. This work provides justification for investment in simulation-based curricula in Rwanda and potentially other LMICs.

  9. Extending the Multi-level Method for the Simulation of Stochastic Biological Systems.

    PubMed

    Lester, Christopher; Baker, Ruth E; Giles, Michael B; Yates, Christian A

    2016-08-01

    The multi-level method for discrete-state systems, first introduced by Anderson and Higham (SIAM Multiscale Model Simul 10(1):146-179, 2012), is a highly efficient simulation technique that can be used to elucidate statistical characteristics of biochemical reaction networks. A single point estimator is produced in a cost-effective manner by combining a number of estimators of differing accuracy in a telescoping sum, and, as such, the method has the potential to revolutionise the field of stochastic simulation. In this paper, we present several refinements of the multi-level method which render it easier to understand and implement, and also more efficient. Given the substantial and complex nature of the multi-level method, the first part of this work reviews existing literature, with the aim of providing a practical guide to the use of the multi-level method. The second part provides the means for a deft implementation of the technique and concludes with a discussion of a number of open problems.
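
    As a sketch of the telescoping-sum structure described above, the snippet below combines a coarse base estimator with per-level correction estimators into a single point estimate. The sample_pair interface, the per-level sample counts, and the toy problem are illustrative assumptions; a practical multi-level implementation couples the fine and coarse draws so that the correction terms have small variance, which the toy example deliberately does not attempt.

        import random

        def mlmc_estimate(sample_pair, samples_per_level):
            # Telescoping sum: E[P_L] = E[P_0] + sum over l of E[P_l - P_{l-1}].
            estimate = 0.0
            for level, n in enumerate(samples_per_level):
                acc = 0.0
                for _ in range(n):
                    fine, coarse = sample_pair(level)
                    acc += fine if level == 0 else fine - coarse
                estimate += acc / n
            return estimate

        def toy_pair(level):
            # Independent draws with mean 1.0 at every level; a real application
            # would couple the fine and coarse draws (shared randomness) so that
            # the correction terms have small variance.
            fine = 1.0 + random.gauss(0.0, 1.0 / 2 ** level)
            coarse = 1.0 + random.gauss(0.0, 1.0 / 2 ** max(level - 1, 0))
            return fine, coarse

        print(mlmc_estimate(toy_pair, samples_per_level=[4000, 1000, 250]))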

  10. A path-level exact parallelization strategy for sequential simulation

    NASA Astrophysics Data System (ADS)

    Peredo, Oscar F.; Baeza, Daniel; Ortiz, Julián M.; Herrero, José R.

    2018-01-01

    Sequential Simulation is a well known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, the Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or classes defined by K different thresholds (continuous case). Similarly, the Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelize the SIS and SGS methods is presented. A first stage of re-arrangement of the simulation path is performed, followed by a second stage of parallel simulation for non-conflicting nodes. A key advantage of the proposed parallelization method is that it generates identical realizations to the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and maximum kriging neighbours in each case, achieving high speedup in the best scenarios using 16 threads of execution on a single machine.
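
    The "non-conflicting nodes" idea can be pictured, very schematically, as grouping path nodes whose search neighbourhoods cannot overlap into batches that may be simulated concurrently, as in the hedged Python sketch below. The greedy grouping and the one-dimensional example are illustrative only; the published strategy re-arranges the path so that the parallel code reproduces exactly the realizations of the sequential algorithm, a property this toy does not guarantee.

        def euclidean(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

        def conflict_free_batches(path, coords, radius):
            # Greedily split the simulation path into batches whose members are
            # far enough apart that their search neighbourhoods cannot overlap.
            remaining = list(path)
            batches = []
            while remaining:
                batch, deferred = [], []
                for node in remaining:
                    if all(euclidean(coords[node], coords[member]) > 2.0 * radius
                           for member in batch):
                        batch.append(node)
                    else:
                        deferred.append(node)   # conflicts with this batch; try later
                batches.append(batch)
                remaining = deferred
            return batches

        # Eight nodes on a line, search radius of 1.5 grid units
        coords = {i: (float(i),) for i in range(8)}
        print(conflict_free_batches(range(8), coords, radius=1.5))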

  11. Microcomputer Simulation of Real Gases--Part 1.

    ERIC Educational Resources Information Center

    Sperandeo-Mineo, R. M.; Tripi, G.

    1987-01-01

    Describes some simple computer programs designed to simulate the molecular dynamics of two-dimensional systems with a Lennard-Jones interaction potential. Discusses the use of the software in introductory physics courses at the high school and college level. (TW)

  12. Algorithms and architecture for multiprocessor based circuit simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deutsch, J.T.

    Accurate electrical simulation is critical to the design of high performance integrated circuits. Logic simulators can verify function and give first-order timing information. Switch level simulators are more effective at dealing with charge sharing than standard logic simulators, but cannot provide accurate timing information or discover DC problems. Delay estimation techniques and cell level simulation can be used in constrained design methods, but must be tuned for each application, and circuit simulation must still be used to generate the cell models. None of these methods has the guaranteed accuracy that many circuit designers desire, and none can provide detailed waveform information. Detailed electrical-level simulation can predict circuit performance if devices and parasitics are modeled accurately. However, the computational requirements of conventional circuit simulators make it impractical to simulate current large circuits. In this dissertation, the implementation of Iterated Timing Analysis (ITA), a relaxation-based technique for accurate circuit simulation, on a special-purpose multiprocessor is presented. The ITA method is an SOR-Newton, relaxation-based method which uses event-driven analysis and selective trace to exploit the temporal sparsity of the electrical network. Because event-driven selective trace techniques are employed, this algorithm lends itself to implementation on a data-driven computer.
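
    Separately from the Newton update applied at each node in ITA, the event-driven selective-trace scheduling can be illustrated with a simple worklist relaxation: a node is re-evaluated only when one of its inputs has changed appreciably, and a change wakes only the node's fanout. The sketch below does this for a linear fixed-point system; the data structures and the two-node example are illustrative assumptions, not the dissertation's algorithm.

        from collections import deque

        def event_driven_relaxation(coeffs, rhs, fanout, tol=1e-9, max_events=100000):
            # Solve x_i = sum_j coeffs[i][j] * x_j + rhs[i] by relaxation,
            # re-evaluating a node only when one of its inputs changed by more
            # than tol (selective trace). Convergence needs the usual contraction
            # condition on the coefficients, which is not checked here.
            x = {i: 0.0 for i in rhs}
            queue, queued = deque(rhs), set(rhs)
            events = 0
            while queue and events < max_events:
                i = queue.popleft()
                queued.discard(i)
                events += 1
                new_value = rhs[i] + sum(c * x[j] for j, c in coeffs[i].items())
                if abs(new_value - x[i]) > tol:      # node changed: wake its fanout
                    x[i] = new_value
                    for k in fanout[i]:
                        if k not in queued:
                            queue.append(k)
                            queued.add(k)
            return x

        # Two weakly coupled nodes; exact solution is roughly x0 = 1.70, x1 = 2.34
        coeffs = {0: {1: 0.3}, 1: {0: 0.2}}
        fanout = {0: [1], 1: [0]}
        rhs = {0: 1.0, 1: 2.0}
        print(event_driven_relaxation(coeffs, rhs, fanout))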

  13. Rotorcraft Research at the NASA Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Aponso, Bimal Lalith; Tran, Duc T.; Schroeder, Jeffrey A.

    2009-01-01

    In the 1970s the role of the military helicopter evolved to encompass more demanding missions including low-level nap-of-the-earth flight and operation in severely degraded visual environments. The Vertical Motion Simulator (VMS) at the NASA Ames Research Center was built to provide a high-fidelity simulation capability to research new rotorcraft concepts and technologies that could satisfy these mission requirements. The VMS combines a high-fidelity large amplitude motion system with an adaptable simulation environment including interchangeable and configurable cockpits. In almost 30 years of operation, rotorcraft research on the VMS has contributed significantly to the knowledge-base on rotorcraft performance, handling qualities, flight control, and guidance and displays. These contributions have directly benefited current rotorcraft programs and flight safety. The high fidelity motion system in the VMS was also used to research simulation fidelity. This research provided a fundamental understanding of pilot cueing modalities and their effect on simulation fidelity.

  14. Tropical Cyclone Activity in the High-Resolution Community Earth System Model and the Impact of Ocean Coupling

    NASA Astrophysics Data System (ADS)

    Li, Hui; Sriver, Ryan L.

    2018-01-01

    High-resolution Atmosphere General Circulation Models (AGCMs) are capable of directly simulating realistic tropical cyclone (TC) statistics, providing a promising approach for TC-climate studies. Active air-sea coupling in a coupled model framework is essential to capturing TC-ocean interactions, which can influence TC-climate connections on interannual to decadal time scales. Here we investigate how the choices of ocean coupling can affect the directly simulated TCs using high-resolution configurations of the Community Earth System Model (CESM). We performed a suite of high-resolution, multidecadal, global-scale CESM simulations in which the atmosphere (~0.25° grid spacing) is configured with three different levels of ocean coupling: prescribed climatological sea surface temperature (SST) (ATM), mixed layer ocean (SLAB), and dynamic ocean (CPL). We find that different levels of ocean coupling can influence simulated TC frequency, geographical distributions, and storm intensity. ATM simulates more storms and higher overall storm intensity than the coupled simulations. It also simulates higher TC track density over the eastern Pacific and the North Atlantic, while TC tracks are relatively sparse within CPL and SLAB for these regions. Storm intensification and the maximum wind speed are sensitive to the representations of local surface flux feedbacks in different coupling configurations. Key differences in storm number and distribution can be attributed to variations in the modeled large-scale climate mean state and variability that arise from the combined effect of intrinsic model biases and air-sea interactions. Results help to improve our understanding about the representation of TCs in high-resolution coupled Earth system models, with important implications for TC-climate applications.

  15. First experiences of high-fidelity simulation training in junior nursing students in Korea.

    PubMed

    Lee, Suk Jeong; Kim, Sang Suk; Park, Young-Mi

    2015-07-01

    This study was conducted to explore first experiences of high-fidelity simulation training in Korean nursing students, in order to develop and establish more effective guidelines for future simulation training in Korea. Thirty-three junior nursing students participated in high-fidelity simulation training for the first time. Using both qualitative and quantitative methods, data were collected from reflective journals and questionnaires of simulation effectiveness after simulation training. Descriptive statistics were used to analyze simulation effectiveness and content analysis was performed with the reflective journal data. Five dimensions and 31 domains, both positive and negative experiences, emerged from qualitative analysis: (i) machine-human interaction in a safe environment; (ii) perceived learning capability; (iii) observational learning; (iv) reconciling practice with theory; and (v) follow-up debriefing effect. More than 70% of students scored high on increased ability to identify changes in the patient's condition, critical thinking, decision-making, effectiveness of peer observation, and debriefing in effectiveness of simulation. This study reported both positive and negative experiences of simulation. The results of this study could be used to set the level of task difficulty in simulation. Future simulation programs can be designed by reinforcing the positive experiences and modifying the negative results. © 2014 The Authors. Japan Journal of Nursing Science © 2014 Japan Academy of Nursing Science.

  16. Use of simulated pages to prepare medical students for internship and improve patient safety.

    PubMed

    Schwind, Cathy J; Boehler, Margaret L; Markwell, Stephen J; Williams, Reed G; Brenner, Michael J

    2011-01-01

    During the transition from medical school to internship, trainees experience high levels of stress related to pages on the inpatient wards. The steep learning curve during this period may also affect patient safety. The authors piloted the use of simulated pages to improve medical student preparedness, decrease stress related to pages, and familiarize medical students with common patient problems. A multidisciplinary team at Southern Illinois University School of Medicine developed simulated pages that were tested among senior medical students. Sixteen medical students were presented with 11 common patient scenarios. Data on assessment, management, and global performance were collected. Mean confidence levels were evaluated pre- and postintervention. Students were also surveyed on how the simulated pages program influenced their perceived comfort in managing patient care needs and the usefulness of the exercise in preparing them to handle inpatient pages. Mean scores on the assessment and management portions of the scenarios varied widely depending on the scenario (range -15.6 ± 41.6 to 95.7 ± 9.5). Pass rates based on global performance ranged from 12% to 93%. Interrater agreement was high (mean kappa = 0.88). Students' confidence ratings on a six-point scale increased from 1.87 preintervention to 3.53 postintervention (P < .0001). Simulated pages engage medical students and may foster medical student preparedness for internship. Students valued the opportunity to simulate "on call" responsibilities, and exposure to simulated pages significantly increased their confidence levels. Further studies are needed to determine effects on patient safety outcomes.

  17. Sound level exposure of high-risk infants in different environmental conditions.

    PubMed

    Byers, Jacqueline F; Waugh, W Randolph; Lowman, Linda B

    2006-01-01

    To provide descriptive information about the sound levels to which high-risk infants are exposed in various actual environmental conditions in the NICU, including the impact of physical renovation on sound levels, and to assess the contributions of various types of equipment, alarms, and activities to sound levels in simulated conditions in the NICU. Descriptive and comparative design. Convenience sample of 134 infants at a southeastern quaternary children's hospital. A-weighted decibel (dBA) sound levels under various actual and simulated environmental conditions. The renovated NICU was, on average, 4-6 dBA quieter across all environmental conditions than a comparable nonrenovated room, representing a significant sound level reduction. Sound levels remained above consensus recommendations despite physical redesign and staff training. Respiratory therapy equipment, alarms, staff talking, and infant fussiness contributed to higher sound levels. Evidence-based sound-reducing strategies are proposed. Findings were used to plan environment management as part of a developmental, family-centered care, performance improvement program and in new NICU planning.
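
    Because several sources (equipment, alarms, talking) contribute simultaneously, dBA levels combine on an energy basis rather than arithmetically. The small sketch below shows the standard energetic sum; the three source levels are illustrative values, not measurements from the study.

        import math

        def combine_levels(levels_dba):
            # Simultaneous sources add on a power (energy) basis, not arithmetically.
            return 10.0 * math.log10(sum(10.0 ** (level / 10.0) for level in levels_dba))

        # Hypothetical ventilator, alarm, and conversation levels measured separately
        print(round(combine_levels([55.0, 52.0, 50.0]), 1))   # about 57.6 dBA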

  18. Virtual reality simulation training of mastoidectomy - studies on novice performance.

    PubMed

    Andersen, Steven Arild Wuyts

    2016-08-01

    Virtual reality (VR) simulation-based training is increasingly used in surgical technical skills training including in temporal bone surgery. The potential of VR simulation in enabling high-quality surgical training is great and VR simulation allows high-stakes and complex procedures such as mastoidectomy to be trained repeatedly, independent of patients and surgical tutors, outside traditional learning environments such as the OR or the temporal bone lab, and with fewer of the constraints of traditional training. This thesis aims to increase the evidence-base of VR simulation training of mastoidectomy and, by studying the final-product performances of novices, investigates the transfer of skills to the current gold-standard training modality of cadaveric dissection, the effect of different practice conditions and simulator-integrated tutoring on performance and retention of skills, and the role of directed, self-regulated learning. Technical skills in mastoidectomy were transferable from the VR simulation environment to cadaveric dissection with significant improvement in performance after directed, self-regulated training in the VR temporal bone simulator. Distributed practice led to a better learning outcome and more consolidated skills than massed practice and also resulted in a more consistent performance after three months of non-practice. Simulator-integrated tutoring accelerated the initial learning curve but also caused over-reliance on tutoring, which resulted in a drop in performance when the simulator-integrated tutor-function was discontinued. The learning curves were highly individual but often plateaued early and at an inadequate level, which related to issues concerning both the procedure and the VR simulator, over-reliance on the tutor function and poor self-assessment skills. Future simulator-integrated automated assessment could potentially resolve some of these issues and provide trainees with both feedback during the procedure and immediate assessment following each procedure. Standard setting by establishing a proficiency level that can be used for mastery learning with deliberate practice could also further sophisticate directed, self-regulated learning in VR simulation-based training. VR simulation-based training should be embedded in a systematic and competency-based training curriculum for high-quality surgical skills training, ultimately leading to improved safety and patient care.

  19. Face and content validity of the virtual reality simulator 'ScanTrainer®'.

    PubMed

    Alsalamah, Amal; Campo, Rudi; Tanos, Vasilios; Grimbizis, Gregoris; Van Belle, Yves; Hood, Kerenza; Pugh, Neil; Amso, Nazar

    2017-01-01

    Ultrasonography is a first-line imaging modality in the investigation of women's irregular bleeding and other gynaecological pathologies, e.g. ovarian cysts and early pregnancy problems. However, teaching ultrasound, especially transvaginal scanning, remains a challenge for health professionals. New technology such as simulation may potentially facilitate and expedite the process of learning ultrasound. Simulation may prove to be realistic, very close to the real patient scanning experience for the sonographer, and objectively able to assist the development of basic skills such as image manipulation, hand-eye coordination and examination technique. The aim of this study was to determine the face and content validity of a virtual reality simulator (ScanTrainer®, MedaPhor plc, Cardiff, Wales, UK) as reflective of real transvaginal ultrasound (TVUS) scanning. A questionnaire with 14 simulator-related statements was distributed to a number of participants with differing levels of sonography experience in order to determine the level of agreement between the use of the simulator in training and real practice. There were 36 participants: novices (n = 25) and experts (n = 11) who rated the simulator. Median scores of face validity statements between experts and non-experts, rated on a 10-point visual analogue scale (VAS), ranged between 7.5 and 9.0 (p > 0.05), indicating a high level of agreement. Experts' median scores of content validity statements ranged from 8.4 to 9.0. The findings confirm that the simulator has the feel and look of real-time scanning with high face validity. Similarly, its tutorial structures and learning steps confirm the content validity.

  20. Integrated Fault Diagnosis Algorithm for Motor Sensors of In-Wheel Independent Drive Electric Vehicles.

    PubMed

    Jeon, Namju; Lee, Hyeongcheol

    2016-12-12

    An integrated fault-diagnosis algorithm for a motor sensor of in-wheel independent drive electric vehicles is presented. This paper proposes a method that integrates the high- and low-level fault diagnoses to improve the robustness and performance of the system. For the high-level fault diagnosis of vehicle dynamics, a planar two-track non-linear model is first selected, and the longitudinal and lateral forces are calculated. To ensure redundancy of the system, correlation between the sensor and residual in the vehicle dynamics is analyzed to detect and separate the fault of the drive motor system of each wheel. To diagnose the motor system for low-level faults, the state equation of an interior permanent magnet synchronous motor is developed, and a parity equation is used to diagnose the fault of the electric current and position sensors. The validity of the high-level fault-diagnosis algorithm is verified using Carsim and Matlab/Simulink co-simulation. The low-level fault diagnosis is verified through Matlab/Simulink simulation and experiments. Finally, according to the residuals of the high- and low-level fault diagnoses, fault-detection flags are defined. On the basis of this information, an integrated fault-diagnosis strategy is proposed.
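
    The low-level diagnosis described above amounts to comparing each sensor reading with a model-based prediction and flagging residuals that exceed a threshold. The sketch below shows that comparison in schematic form; the signal names, values, and thresholds are illustrative assumptions rather than the paper's parity relations.

        def residual_flags(measured, predicted, thresholds):
            # Raise a fault flag whenever a sensor reading deviates from the
            # model-based prediction by more than its threshold.
            return {name: abs(measured[name] - predicted[name]) > thresholds[name]
                    for name in measured}

        # Hypothetical current (A) and rotor-position (rad) signals and thresholds
        measured = {"i_d": 4.9, "i_q": 11.8, "theta": 1.02}
        predicted = {"i_d": 5.0, "i_q": 10.0, "theta": 1.00}
        thresholds = {"i_d": 0.5, "i_q": 0.5, "theta": 0.05}
        print(residual_flags(measured, predicted, thresholds))   # only i_q is flagged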

  1. Numerical Zooming Between a NPSS Engine System Simulation and a One-Dimensional High Compressor Analysis Code

    NASA Technical Reports Server (NTRS)

    Follen, Gregory; auBuchon, M.

    2000-01-01

    Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structures, and heat transfer along with the concept of numerical zooming between zero-dimensional to one-, two-, and three-dimensional component engine codes. In addition, the NPSS is refining the computing and communication technologies necessary to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Of the different technology areas that contribute to the development of the NPSS Environment, the subject of this paper is a discussion on numerical zooming between a NPSS engine simulation and higher fidelity representations of the engine components (fan, compressor, burner, turbines, etc.). What follows is a description of successfully zooming one-dimensional (row-by-row) high-pressure compressor analysis results back to a zero-dimensional NPSS engine simulation and a discussion of the results illustrated using an advanced data visualization tool. This type of high fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the capability of the engine system simulation and increase the level of virtual testing conducted prior to committing the design to hardware.

  2. Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment

    NASA Astrophysics Data System (ADS)

    Zeigler, Bernard P.; Lee, J. S.

    1998-08-01

    In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further test in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
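
    A minimal sketch of the quantization idea, emitting a state update only when the tracked variable crosses into a new quantum band, is given below. The rounding rule and the sample trajectory are illustrative assumptions; DEVS quantized integrators normally add hysteresis and an explicit time-advance function.

        class Quantizer:
            # Track a state variable and report it only when it crosses into a new
            # quantum band; intermediate values are suppressed.
            def __init__(self, quantum, initial=0.0):
                self.quantum = quantum
                self.last_level = round(initial / quantum)

            def update(self, value):
                level = round(value / self.quantum)
                if level != self.last_level:          # quantum boundary crossed
                    self.last_level = level
                    return level * self.quantum       # quantized value to transmit
                return None                           # no update needs to be sent

        q = Quantizer(quantum=0.5)
        for step, value in enumerate([0.1, 0.2, 0.4, 0.6, 0.7, 1.1, 1.2, 1.4]):
            update = q.update(value)
            if update is not None:
                print(step, update)    # three updates instead of eight state messages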

  3. On verifying a high-level design. [cost and error analysis

    NASA Technical Reports Server (NTRS)

    Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.

    1993-01-01

    An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages that are capable of adequately expressing the design specifications have been developed, but some time will be required before they can have the expressive power needed to be used in real applications. Simulation-based approaches are more useful in finding errors in designs than they are in proving the correctness of a certain design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.

  4. Monte Carlo simulation of Alaska wolf survival

    NASA Astrophysics Data System (ADS)

    Feingold, S. J.

    1996-02-01

    Alaskan wolves live in a harsh climate and are hunted intensively. Penna's biological aging code, using Monte Carlo methods, has been adapted to simulate wolf survival. It was run on the case in which hunting causes the disruption of wolves' social structure. Social disruption was shown to increase the number of deaths occurring at a given level of hunting. For high levels of social disruption, the population did not survive.
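
    For readers unfamiliar with Penna's bit-string model, the sketch below shows its core yearly loop with an added hunting mortality term. The genome length, mutation threshold, hunting probability, and the Verhulst (crowding) factor are illustrative assumptions, and the social-disruption mechanism studied in the paper is not modelled here.

        import random

        GENOME_BITS = 32        # maximum attainable age, in years
        MUTATION_LIMIT = 3      # deleterious mutations tolerated before death
        REPRODUCTION_AGE = 2    # minimum age for producing offspring
        HUNT_PROB = 0.05        # extra per-year mortality from hunting (assumed)
        N_MAX = 5000            # carrying capacity for the Verhulst crowding term

        def step(population):
            # One simulated year: age everyone, apply genetic, crowding and hunting
            # deaths, and let survivors past the reproduction age produce offspring.
            crowding = len(population) / N_MAX
            survivors = []
            for age, genome in population:
                age += 1
                active_mutations = bin(genome & ((1 << age) - 1)).count("1")
                if (age >= GENOME_BITS or active_mutations >= MUTATION_LIMIT
                        or random.random() < crowding or random.random() < HUNT_PROB):
                    continue
                survivors.append((age, genome))
                if age >= REPRODUCTION_AGE:
                    child_genome = genome | (1 << random.randrange(GENOME_BITS))
                    survivors.append((0, child_genome))   # one new deleterious mutation
            return survivors

        population = [(0, 0)] * 500                       # mutation-free founders
        for year in range(50):
            population = step(population)
        print(len(population))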

  5. FY10 Report on Multi-scale Simulation of Solvent Extraction Processes: Molecular-scale and Continuum-scale Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wardle, Kent E.; Frey, Kurt; Pereira, Candido

    2014-02-02

    This task is aimed at predictive modeling of solvent extraction processes in typical extraction equipment through multiple simulation methods at various scales of resolution. We have conducted detailed continuum fluid dynamics simulation at the process unit level as well as simulations of the molecular-level physical interactions which govern extraction chemistry. Through combination of information gained from simulations at each of these two tiers, along with advanced techniques such as the Lattice Boltzmann Method (LBM) which can bridge the two scales, we can develop the tools to work towards predictive simulation for solvent extraction on the equipment scale (Figure 1). The goal of such a tool - along with enabling optimized design and operation of extraction units - would be to allow prediction of stage extraction efficiency under specified conditions. Simulation efforts on each of the two scales will be described below. As the initial application of FELBM in the work performed during FY10 has been on annular mixing, it will be discussed in the context of the continuum scale. In the future, however, it is anticipated that the real value of FELBM will be in its use as a tool for sub-grid model development through highly refined DNS-like multiphase simulations, facilitating exploration and development of droplet models, including breakup and coalescence, which will be needed for the large-scale simulations where droplet-level physics cannot be resolved. In this area, it can have a significant advantage over traditional CFD methods, as its high computational efficiency allows exploration of significantly greater physical detail, especially as computational resources increase in the future.

  6. Testing for a CO2 fertilization effect on growth of Canadian boreal forests

    NASA Astrophysics Data System (ADS)

    Girardin, Martin P.; Bernier, Pierre Y.; Raulier, FréDéRic; Tardif, Jacques C.; Conciatori, France; Guo, Xiao Jing

    2011-03-01

    The CO2 fertilization hypothesis stipulates that rising atmospheric CO2 has a direct positive effect on net primary productivity (NPP), with experimental evidence suggesting a 23% growth enhancement with a doubling of CO2. Here, we test this hypothesis by comparing a bioclimatic model simulation of NPP over the twentieth century against tree growth increment (TGI) data of 192 Pinus banksiana trees from the Duck Mountain Provincial Forest in Manitoba, Canada. We postulate that, if a CO2 fertilization effect has occurred, climatically driven simulations of NPP and TGI will diverge with increasing CO2. We use a two-level scaling approach to simulate NPP. A leaf-level model is first used to simulate high-frequency responses to climate variability. A canopy-level model of NPP is then adjusted to the aggregated leaf-level results and used to simulate yearly plot-level NPP. Neither model accounts for CO2 fertilization. The climatically driven simulations of NPP for 1912-2000 are effective for tracking the measured year-to-year variations in TGI, with 47.2% of the variance in TGI reproduced by the simulation. In addition, the simulation reproduces without divergence the positive linear trend detected in TGI over the same period. Our results therefore do not support the attribution of a portion of the historical linear trend in TGI to CO2 fertilization at the level suggested by current experimental evidence. A sensitivity analysis done by adding an expected CO2 fertilization effect to simulations suggests that the detection limit of the study is for a 14% growth increment with a doubling of atmospheric CO2 concentration.

  7. The moving confluence route technology with WAD scheme for 3D hydrodynamic simulation in high altitude inland waters

    NASA Astrophysics Data System (ADS)

    Wang, Yonggui; Yang, Yinqun; Chen, Xiaolong; Engel, Bernard A.; Zhang, Wanshun

    2018-04-01

    For three-dimensional hydrodynamic simulations in inland waters, rapid changes in the moving boundary and varying input conditions must be considered. Some models are developed with a moving boundary, but the dynamic change of discharges is unresolved or ignored. For better hydrodynamic simulation in inland waters, the widely used 3D model ECOMSED has been improved with a moving confluence route (MCR) method and a wetting and drying (WAD) scheme. The fixed locations of water and pollutant inputs from tributaries, point sources and non-point sources have been changed to dynamic confluence routes that follow the moving boundary. The improved model was applied to an inland water area, Qingshuihai reservoir, Kunming City, China, for a one-year hydrodynamic simulation. The results were verified against water level, flow velocity and water mass conservation. Detailed analysis of water level variation and comparison of velocity fields at different times showed that the improved model performs better than the original one in simulating the moving boundary and discharges that shift with changing water level. The improved three-dimensional model is suitable for hydrodynamic simulation in water bodies where the water boundary shifts with changing water level and there are various inlets.

  8. Simulation Exploration Experience 2018 Overview

    NASA Technical Reports Server (NTRS)

    Paglialonga, Stephen; Elfrey, Priscilla; Crues, Edwin Z.

    2018-01-01

    The Simulation Exploration Experience (SEE) joins students, industry, professional associations, and faculty together for an annual modeling and simulation (M&S) challenge. SEE champions collaborative collegiate-level modeling and simulation by providing a venue for students to work in highly dispersed inter-university teams to design, develop, test, and execute simulated missions associated with space exploration. Participating teams gain valuable knowledge, skills, and increased employability by working closely with industry professionals, NASA, and faculty advisors. This presentation gives an overview of the SEE and the upcoming 2018 SEE event.

  9. A large high vacuum, high pumping speed space simulation chamber for electric propulsion

    NASA Technical Reports Server (NTRS)

    Grisnik, Stanley P.; Parkes, James E.

    1994-01-01

    Testing high power electric propulsion devices poses unique requirements on space simulation facilities. Very high pumping speeds are required to maintain high vacuum levels while handling large volumes of exhaust products. These pumping speeds are significantly higher than those available in most existing vacuum facilities. There is also a requirement for relatively large vacuum chamber dimensions to minimize facility wall/thruster plume interactions and to accommodate far field plume diagnostic measurements. A 4.57 m (15 ft) diameter by 19.2 m (63 ft) long vacuum chamber at NASA Lewis Research Center is described. The chamber utilizes oil diffusion pumps in combination with cryopanels to achieve high vacuum pumping speeds at high vacuum levels. The facility is computer controlled for all phases of operation from start-up, through testing, to shutdown. The computer control system increases the utilization of the facility and reduces the manpower requirements needed for facility operations.

  10. Efficient parallelization for AMR MHD multiphysics calculations; implementation in AstroBEAR

    NASA Astrophysics Data System (ADS)

    Carroll-Nellenback, Jonathan J.; Shroyer, Brandon; Frank, Adam; Ding, Chen

    2013-03-01

    Current adaptive mesh refinement (AMR) simulations require algorithms that are highly parallelized and manage memory efficiently. As compute engines grow larger, AMR simulations will require algorithms that achieve new levels of efficient parallelization and memory management. We have attempted to employ new techniques to achieve both of these goals. Patch or grid based AMR often employs ghost cells to decouple the hyperbolic advances of each grid on a given refinement level. This decoupling allows each grid to be advanced independently. In AstroBEAR we utilize this independence by threading the grid advances on each level with preference going to the finer level grids. This allows for global load balancing instead of level by level load balancing and allows for greater parallelization across both physical space and AMR level. Threading of level advances can also improve performance by interleaving communication with computation, especially in deep simulations with many levels of refinement. While we see improvements of up to 30% on deep simulations run on a few cores, the speedup is typically more modest (5-20%) for larger scale simulations. To improve memory management we have employed a distributed tree algorithm that requires processors to only store and communicate local sections of the AMR tree structure with neighboring processors. Using this distributed approach we are able to get reasonable scaling efficiency (>80%) out to 12288 cores and up to 8 levels of AMR - independent of the use of threading.

  11. The Effect of Nursing Faculty Presence on Students' Level of Anxiety, Self-Confidence, and Clinical Performance during a Clinical Simulation Experience

    ERIC Educational Resources Information Center

    Horsley, Trisha Leann

    2012-01-01

    Nursing schools design their clinical simulation labs based upon faculty's perception of the optimal environment to meet the students' learning needs, other programs' success with integrating high-tech clinical simulation, and the funds available. No research has been conducted on nursing faculty presence during a summative evaluation. The…

  12. A Modal Model to Simulate Typical Structural Dynamic Nonlinearity [PowerPoint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayes, Randall L.; Pacini, Benjamin Robert; Roettgen, Dan

    2016-01-01

    Some initial investigations have been published which simulate nonlinear response with almost traditional modal models: instead of connecting the modal mass to ground through the traditional spring and damper, a nonlinear Iwan element was added. This assumes that the mode shapes do not change with amplitude and there are no interactions between modal degrees of freedom. This work expands on these previous studies. An impact experiment is performed on a structure which exhibits typical structural dynamic nonlinear response, i.e. weak frequency dependence and strong damping dependence on the amplitude of vibration. Low level modal test results in combination with high level impacts are processed using various combinations of modal filtering, the Hilbert Transform and band-pass filtering to develop response data that are then fit with various nonlinear elements to create a nonlinear pseudo-modal model. Simulations of forced response are compared with high level experimental data for various nonlinear element assumptions.

  13. A Modal Model to Simulate Typical Structural Dynamic Nonlinearity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pacini, Benjamin Robert; Mayes, Randall L.; Roettgen, Daniel R

    2015-10-01

    Some initial investigations have been published which simulate nonlinear response with almost traditional modal models: instead of connecting the modal mass to ground through the traditional spring and damper, a nonlinear Iwan element was added. This assumes that the mode shapes do not change with amplitude and there are no interactions between modal degrees of freedom. This work expands on these previous studies. An impact experiment is performed on a structure which exhibits typical structural dynamic nonlinear response, i.e. weak frequency dependence and strong damping dependence on the amplitude of vibration. Low level modal test results in combination with high level impacts are processed using various combinations of modal filtering, the Hilbert Transform and band-pass filtering to develop response data that are then fit with various nonlinear elements to create a nonlinear pseudo-modal model. Simulations of forced response are compared with high level experimental data for various nonlinear element assumptions.

  14. HYDRA : High-speed simulation architecture for precision spacecraft formation simulation

    NASA Technical Reports Server (NTRS)

    Martin, Bryan J.; Sohl, Garett.

    2003-01-01

    HYDRA - the Hierarchical Distributed Reconfigurable Architecture - is a scalable simulation architecture that provides flexibility and ease of use while taking advantage of modern computation and communication hardware. It also provides the ability to implement distributed - or workstation-based - simulations and high-fidelity real-time simulation from a common core. Originally designed to serve as a research platform for examining fundamental challenges in formation flying simulation for future space missions, it is also finding use in other missions and applications, all of which can take advantage of the underlying object-oriented structure to easily produce distributed simulations. Hydra automates the process of connecting disparate simulation components (Hydra Clients) through a client-server architecture that uses high-level descriptions of the data associated with each client to find and forge desirable connections (Hydra Services) at run time. Services communicate through the use of Connectors, which abstract messaging to provide single-interface access to any desired communication protocol, ranging from shared-memory message passing to TCP/IP, ACE, and CORBA. Hydra shares many features with the HLA, while providing more flexibility in connectivity services and behavior overriding.

  15. A simple simulation model as a tool to assess alternative health care provider payment reform options in Vietnam.

    PubMed

    Cashin, Cheryl; Phuong, Nguyen Khanh; Shain, Ryan; Oanh, Tran Thi Mai; Thuy, Nguyen Thi

    2015-01-01

    Vietnam is currently considering a revision of its 2008 Health Insurance Law, including the regulation of provider payment methods. This study uses a simple spreadsheet-based, micro-simulation model to analyse the potential impacts of different provider payment reform scenarios on resource allocation across health care providers in three provinces in Vietnam, as well as on the total expenditure of the provincial branches of the public health insurance agency (Provincial Social Security [PSS]). The results show that currently more than 50% of PSS spending is concentrated at the provincial level with less than half at the district level. There is also a high degree of financial risk on district hospitals with the current fund-holding arrangement. Results of the simulation model show that several alternative scenarios for provider payment reform could improve the current payment system by reducing the high financial risk currently borne by district hospitals without dramatically shifting the current level and distribution of PSS expenditure. The results of the simulation analysis provided an empirical basis for health policy-makers in Vietnam to assess different provider payment reform options and make decisions about new models to support health system objectives.
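
    At its simplest, a spreadsheet-style model of this kind reduces to aggregating insurer payments by provider level under each scenario and comparing the resulting shares. The sketch below shows only that aggregation step; the provider levels and amounts are illustrative assumptions, not the study's data.

        def spending_shares(payments):
            # Aggregate insurer (PSS) payments by provider level and return each
            # level's share of total spending.
            totals = {}
            for payment in payments:
                totals[payment["level"]] = totals.get(payment["level"], 0.0) + payment["amount"]
            grand_total = sum(totals.values())
            return {level: amount / grand_total for level, amount in totals.items()}

        # Hypothetical annual payments (arbitrary currency units) under the current system
        current_scenario = [
            {"level": "provincial", "amount": 620.0},
            {"level": "district", "amount": 410.0},
            {"level": "commune", "amount": 90.0},
        ]
        print(spending_shares(current_scenario))   # provincial share is above 50%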

  16. Environmental Planning in Jonah's Basin: A Simulation Game and Experimental Analysis.

    ERIC Educational Resources Information Center

    Horsley, Doyne

    1982-01-01

    Described is a successfully field tested simulation which will help high school or college level students become familiar with flood hazards. Students assume the roles of members of the Jonah's Basin planning commission and plan solutions to the area's flood problems. (RM)

  17. Residual stress investigation of via-last through-silicon via by polarized Raman spectroscopy measurement and finite element simulation

    NASA Astrophysics Data System (ADS)

    Feng, Wei; Watanabe, Naoya; Shimamoto, Haruo; Aoyagi, Masahiro; Kikuchi, Katsuya

    2018-07-01

    The residual stress induced around through-silicon vias (TSVs) by the fabrication process is one of the major reliability concerns. We propose a methodology to investigate the residual stress in a via-last TSV. First, radial and axial thermal stresses were measured by polarized Raman spectroscopy. The agreement between the simulated stress level and the measured results validated the detailed simulation model. Furthermore, the validated simulation model was applied to the study of residual stress using element death/birth methods. The residual stress at room temperature concentrates in the passivation layers owing to the high fabrication process temperatures of 420 °C for the SiN film and 350 °C for the SiO2 films. For the Si substrate, a high stress level was observed near potential device locations, which requires attention to address reliability concerns in stress-sensitive devices. This methodology of residual stress analysis can be applied to investigate the residual stress in other devices.

  18. Pilot-Induced Oscillation Prediction With Three Levels of Simulation Motion Displacement

    NASA Technical Reports Server (NTRS)

    Schroeder, Jeffery A.; Chung, William W. Y.; Tran, Duc T.; Laforce, Soren; Bengford, Norman J.

    2001-01-01

    Simulator motion platform characteristics were examined to determine if the amount of motion affects pilot-induced oscillation (PIO) prediction. Five test pilots evaluated how susceptible 18 different sets of pitch dynamics were to PIOs with three different levels of simulation motion platform displacement: large, small, and none. The pitch dynamics were those of a previous in-flight experiment, some of which elicited PIOs. These in-flight results served as truth data for the simulation. As such, the in-flight experiment was replicated as much as possible. Objective and subjective data were collected and analyzed. With large motion, PIO and handling qualities ratings matched the flight data more closely than did small motion or no motion. Also, regardless of the aircraft dynamics, large motion increased pilot confidence in assigning handling qualities ratings, reduced safety pilot trips, and lowered touchdown velocities. While both large and small motion provided a pitch rate cue of high fidelity, only large motion presented the pilot with a high fidelity vertical acceleration cue.

  19. Midwifery students' experiences of simulation- and skills training.

    PubMed

    Lendahls, Lena; Oscarsson, Marie G

    2017-03-01

    In Sweden, simulation- and skills training are implemented in midwifery education in order to prepare students for clinical practice. Research regarding the use of both low and high levels of fidelity in simulation in midwifery programmes is limited. The aim of this study was to explore midwifery students' experiences of simulation- and skills training. Midwifery students (n=61), at advanced level, were interviewed in 13 group interviews from 2011 to 2015. A semi-structured interview guide was used, and data were analysed by content analysis. The results are presented in four main categories: develops hands-on skills and communication, power of collaborative learning, highly valued learning environment and facilitates clinical practice. The majority of students felt that the simulation- and skills training were necessary to become familiar with hands-on skills. Having repetitive practice in a safe and secure environment was viewed as important, and students highly valued that mistakes could be made without fear of compromising patient safety. Students' collaboration, reflections and critical thinking increased learning ability. Simulation- and skills training created links between theory and practice, and the lecturer had an important role in providing instructions and feedback. Students felt prepared and confident before their clinical practice, and simulation- and skills training increased safety for all involved, resulting in students being more confident and patients in clinical practice being less exposed. Furthermore, mentors were satisfied with students' basic skills. Simulation- and skills training support the development of midwifery skills. They create links between theory and practice, which facilitates students' learning. Training needs to include reflections and critical thinking in order to develop learning. The lecturer has an important role in encouraging time for reflection and creating a safe environment during the skills and simulation training. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Simulation of Laser Induced Thermal Damage in Nd:YVO4 Crystals

    NASA Astrophysics Data System (ADS)

    Nagi, Richie

    Neodymium-doped yttrium orthovanadate (Nd:YVO4) is a commonly used gain medium in Diode Pumped Solid State (DPSS) lasers, but high heat loading of Nd:YVO4 at high pump powers (≥ 5 W) leads to thermal distortions and crystal fracture, which limits the utility of Nd:YVO4 for high power applications. In this thesis, a Nd:YVO4 crystal suffered thermal damage during experiments investigating the optical gain characteristics of the crystal. This thesis examines the thermal damage mechanisms in detail. Principally, laser-induced melting and laser-induced thermal stress fracture were studied, all in the absence of stimulated emission in the crystal. The optical system for coupling the pump laser light into the crystal was first simulated in Zemax, optical design software, and the simulations were then compared to the experimental coupling efficiency results, which were found to be in agreement. The simulations for the laser coupling system were then used in conjunction with LASCAD, finite element analysis software, to obtain the temperatures inside the crystal as a function of optical power coupled into the crystal. The temperature simulations were then compared to the experimental results, which were in excellent agreement, and the temperature simulations were then generalized to other crystal geometries and Nd doping levels. Zemax and LASCAD were also used to simulate the thermal stress in the crystal as a function of the coupled optical power, and the simulations were compared to experiments, both of which were found to be in agreement. The thermal stress simulations were then generalized to different crystal geometries and Nd doping levels as well.

  1. A modeling study of coastal inundation induced by storm surge, sea-level rise, and subsidence in the Gulf of Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zhaoqing; Wang, Taiping; Leung, Lai-Yung R.

    The northern coasts of the Gulf of Mexico are highly vulnerable to the direct threats of climate change, such as hurricane-induced storm surge, and such risks can be potentially exacerbated by land subsidence and global sea level rise. This paper presents an application of a coastal storm surge model to study the coastal inundation process induced by tide and storm surge, and its response to the effects of land subsidence and sea level rise in the northern Gulf coast. An unstructured-grid Finite Volume Coastal Ocean Model was used to simulate tides and hurricane-induced storm surges in the Gulf of Mexico. Simulated distributions of co-amplitude and co-phase of semi-diurnal and diurnal tides are in good agreement with previous modeling studies. The storm surges induced by four historical hurricanes (Rita, Katrina, Ivan and Dolly) were simulated and compared to observed water levels at National Oceanic and Atmospheric Administration tide stations. Effects of coastal subsidence and future global sea level rise on coastal inundation in the Louisiana coast were evaluated using a parameter “change of inundation depth” through sensitivity simulations that were based on a projected future subsidence scenario and 1-m global sea level rise by the end of the century. Model results suggested that hurricane-induced storm surge height and coastal inundation could be exacerbated by future global sea level rise and subsidence, and that responses of storm surge and coastal inundation to the effects of sea level rise and subsidence are highly nonlinear and vary on temporal and spatial scales.

  2. On the Scaling Laws and Similarity Spectra for Jet Noise in Subsonic and Supersonic Flow

    NASA Technical Reports Server (NTRS)

    Kandula, Max

    2008-01-01

    The scaling laws for the simulation of noise from subsonic and ideally expanded supersonic jets are reviewed with regard to their applicability to deduce full-scale conditions from small-scale model testing. Important parameters of scale model testing for the simulation of jet noise are identified, and the methods of estimating full-scale noise levels from simulated scale model data are addressed. The limitations of cold-jet data in estimating high-temperature supersonic jet noise levels are discussed. New results are presented showing the dependence of overall sound power level on the jet temperature ratio at various jet Mach numbers. A generalized similarity spectrum is also proposed, which accounts for convective Mach number and angle to the jet axis.

  3. The Impact of Different Absolute Solar Irradiance Values on Current Climate Model Simulations

    NASA Technical Reports Server (NTRS)

    Rind, David H.; Lean, Judith L.; Jonas, Jeffrey

    2014-01-01

    Simulations of the preindustrial and doubled CO2 climates are made with the GISS Global Climate Middle Atmosphere Model 3 using two different estimates of the absolute solar irradiance value: a higher value measured by solar radiometers in the 1990s and a lower value measured recently by the Solar Radiation and Climate Experiment. Each of the model simulations is adjusted to achieve global energy balance; without this adjustment the difference in irradiance produces a global temperature change of 0.4°C, comparable to the cooling estimated for the Maunder Minimum. The results indicate that by altering cloud cover the model properly compensates for the different absolute solar irradiance values on a global level when simulating both preindustrial and doubled CO2 climates. On a regional level, the preindustrial climate simulations and the patterns of change with doubled CO2 concentrations are again remarkably similar, but there are some differences. Using a higher absolute solar irradiance value and the requisite cloud cover affects the model's depictions of high-latitude surface air temperature, sea level pressure, and stratospheric ozone, as well as tropical precipitation. In the climate change experiments it leads to an underestimation of North Atlantic warming, reduced precipitation in the tropical western Pacific, and smaller total ozone growth at high northern latitudes. Although significant, these differences are typically modest compared with the magnitude of the regional changes expected for doubled greenhouse gas concentrations. Nevertheless, the model simulations demonstrate that achieving the highest possible fidelity when simulating regional climate change requires that climate models use as input the most accurate (lower) solar irradiance value.

  4. Comparison of standardized patients with high-fidelity simulators for managing stress and improving performance in clinical deterioration: A mixed methods study.

    PubMed

    Ignacio, Jeanette; Dolmans, Diana; Scherpbier, Albert; Rethans, Jan-Joost; Chan, Sally; Liaw, Sok Ying

    2015-12-01

    The use of standardized patients in deteriorating patient simulations adds realism that can be valuable for preparing nurse trainees for stress and enhancing their performance during actual patient deterioration. Emotional engagement resulting from increased fidelity can provide additional stress for student nurses with limited exposure to real patients. To determine the presence of increased stress with the standardized patient modality, this study compared the use of standardized patients (SP) with the use of high-fidelity simulators (HFS) during deteriorating patient simulations. Performance in managing deteriorating patients was also compared. It also explored student nurses' insights on the use of standardized patients and patient simulators in deteriorating patient simulations as preparation for clinical placement. Fifty-seven student nurses participated in a randomized controlled design study with pre- and post-tests to evaluate stress and performance in deteriorating patient simulations. Performance was assessed using the Rescuing A Patient in Deteriorating Situations (RAPIDS) rating tool. Stress was measured using salivary alpha-amylase levels. Fourteen participants who joined the randomized controlled component then participated in focus group discussions that elicited their insights on SP use in patient deterioration simulations. Analysis of covariance (ANCOVA) results showed no significant difference (p=0.744) between the performance scores of the SP and HFS groups in managing deteriorating patients. Amylase levels were also not significantly different (p=0.317) between the two groups. Stress in simulation, awareness of patient interactions, and realism were the main themes that resulted from the thematic analysis. Performance and stress in deteriorating patient simulations with standardized patients did not vary from similar simulations using high-fidelity patient simulators. Data from focus group interviews, however, suggested that the use of standardized patients was perceived to be valuable in preparing students for actual patient deterioration management. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. On predicting contamination levels of HALOE optics aboard UARS using direct simulation Monte Carlo

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael S.; Rault, Didier F. G.

    1993-01-01

    A three-dimensional version of the direct simulation Monte Carlo method is adapted to assess the contamination environment surrounding a highly detailed model of the Upper Atmosphere Research Satellite. Emphasis is placed on simulating a realistic, worst-case set of flowfield and surface conditions and geometric orientations in order to estimate an upper limit for the cumulative level of volatile organic molecular deposits at the aperture of the Halogen Occultation Experiment. Problems resolving species outgassing and vent flux rates that varied over many orders of magnitude were handled using species weighting factors. Results relating to contaminant cloud structure, cloud composition, and statistics of simulated molecules impinging on the target surface are presented, along with data related to code performance. Using procedures developed in standard contamination analyses, the cumulative level of volatile organic deposits on HALOE's aperture over the instrument's 35-month nominal data collection period is estimated to be about 2700 Å.

  6. Hierarchical Simulation to Assess Hardware and Software Dependability

    NASA Technical Reports Server (NTRS)

    Ries, Gregory Lawrence

    1997-01-01

    This thesis presents a method for conducting hierarchical simulations to assess system hardware and software dependability. The method is intended to model embedded microprocessor systems. A key contribution of the thesis is the idea of using fault dictionaries to propagate fault effects upward from the level of abstraction where a fault model is assumed to the system level where the ultimate impact of the fault is observed. A second important contribution is the analysis of the software behavior under faults as well as the hardware behavior. The simulation method is demonstrated and validated in four case studies analyzing Myrinet, a commercial, high-speed networking system. One key result from the case studies shows that the simulation method predicts the same fault impact 87.5% of the time as is obtained by similar fault injections into a real Myrinet system. Reasons for the remaining discrepancy are examined in the thesis. A second key result shows the reduction in the number of simulations needed due to the fault dictionary method. In one case study, 500 faults were injected at the chip level, but only 255 propagated to the system level. Of these 255 faults, 110 shared identical fault dictionary entries at the system level and so did not need to be resimulated. The necessary number of system-level simulations was therefore reduced from 500 to 145. Finally, the case studies show how the simulation method can be used to improve the dependability of the target system. The simulation analysis was used to add recovery to the target software for the most common fault propagation mechanisms that would cause the software to hang. After the modification, the number of hangs was reduced by 60% for fault injections into the real system.
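
    The reduction described above (500 chip-level faults collapsing to 145 system-level simulations) follows from grouping faults whose system-level fault-dictionary entries are identical. A minimal Python sketch of that bookkeeping is shown below; the fault names, the propagation table, and the dictionary-entry strings are invented for illustration and are not taken from the thesis.

```python
# Hypothetical sketch: deduplicating system-level fault dictionary entries so that
# faults with identical entries are simulated only once (names and data are invented).
from collections import defaultdict

def plan_system_simulations(chip_level_faults, propagate):
    """Group chip-level faults by the system-level fault-dictionary entry they produce.

    `propagate` maps a chip-level fault to its system-level dictionary entry,
    or None if the fault never reaches the system level.
    """
    groups = defaultdict(list)
    for fault in chip_level_faults:
        entry = propagate(fault)
        if entry is not None:                    # fault propagated to the system level
            groups[entry].append(fault)
    # One representative system-level simulation per distinct dictionary entry.
    return {entry: faults[0] for entry, faults in groups.items()}

def toy_propagate(fault):
    """Invented propagation table: 6 injected faults, 4 propagate, 2 distinct entries."""
    table = {"f1": "bus_parity_error", "f2": None, "f3": "bus_parity_error",
             "f4": "dma_timeout", "f5": None, "f6": "dma_timeout"}
    return table[fault]

if __name__ == "__main__":
    plan = plan_system_simulations(["f1", "f2", "f3", "f4", "f5", "f6"], toy_propagate)
    print(f"{len(plan)} system-level simulations needed:", plan)
```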

  7. The Impact on Simulated Storm Structure and Intensity of Variations in the Lifted Condensation Level and the Level of Free Convection

    NASA Technical Reports Server (NTRS)

    McCaul, Eugene W., Jr.; Cohen, Charles; Arnold, James E. (Technical Monitor)

    2001-01-01

    The sensitivities of convective storm structure and intensity to changes in the altitudes of the prestorm environmental lifted condensation level (LCL) and level of free convection (LFC) are studied using a full-physics three-dimensional cloud model. Matrices of simulations are conducted for a range of LCL=LFC altitudes, using a single moderately sheared curved hodograph trace in conjunction with convective available potential energy (CAPE) values of either 800 or 2000 J/kg, with the matrices consisting of all four combinations of two distinct choices of buoyancy and shear profile shape. For each value of CAPE, the LCL=LFC altitudes are also allowed to vary in a series of simulations based on the most highly compressed buoyancy and shear profiles for that CAPE, with the environmental buoyancy profile shape, subcloud equivalent potential temperature, subcloud lapse rates of temperature and moisture, and wind profile held fixed. For each CAPE, one final simulation is conducted using a near-optimal LFC but a lowered LCL, with a neutrally buoyant environmental thermal profile specified in between. Results show that, for the buoyancy-starved small-CAPE environments, the simulated storms are supercells and are generally largest and most intense when LCL=LFC altitudes lie in the approximate range 1.5-2.5 km above the surface. The simulations show similar trends for the shear-starved large-CAPE environments, except that conversion from supercell to multicell morphology frequently occurs when the LCL is high. For choices of LCL=LFC height within the optimal 1.5-2.5 km range, peak storm updraft overturning efficiency may approach unity relative to parcel theory, while for lower LCL=LFC heights, overturning efficiency is reduced significantly. The enhancements of overturning efficiency and updraft diameter with increasing LFC height are shown to be the result of systematic increases in the mean equivalent potential temperature of the updraft at cloud base. For the shear-starved environments, the tendency for outflow dominance is eliminated, but a large overturning efficiency is maintained, when a low LCL is used in conjunction with a high LFC. The result regarding outflow dominance at high LCL is consistent with expectations, but the beneficial effect of a high LFC on convective overturning efficiency has not previously been widely recognized. The simulation findings here also appear to be consistent with statistics from previous severe storm environment climatologies, but provide a new framework for interpreting those statistics.

  8. Estimating risks of heat strain by age and sex: a population-level simulation model.

    PubMed

    Glass, Kathryn; Tait, Peter W; Hanna, Elizabeth G; Dear, Keith

    2015-05-18

    Individuals living in hot climates face health risks from hyperthermia due to excessive heat. Heat strain is influenced by weather exposure and by individual characteristics such as age, sex, body size, and occupation. To explore the population-level drivers of heat strain, we developed a simulation model that scales up individual risks of heat storage (estimated using Myrup and Morgan's man model "MANMO") to a large population. Using Australian weather data, we identify high-risk weather conditions together with individual characteristics that increase the risk of heat stress under these conditions. The model identifies elevated risks in children and the elderly, with females aged 75 and older being the most likely to experience heat strain. Risk of heat strain in males does not increase as rapidly with age, but is greatest on hot days with high solar radiation. Although cloudy days are less dangerous for the wider population, older women still have an elevated risk of heat strain on hot cloudy days or when indoors during high temperatures. Simulation models provide a valuable method for exploring population-level risks of heat strain, and a tool for evaluating public health and other government policy interventions.
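
    The following Python fragment is only a schematic illustration of the scaling-up idea: individual heat-strain probabilities, computed from age, sex, and weather, are sampled over a synthetic population. The risk function and all coefficients are invented placeholders, not the MANMO heat-budget model or the Australian data used in the study.

```python
# Illustrative sketch only: scaling a per-individual heat-strain risk up to a population
# by sampling age and sex from a demographic structure. The risk function below is a
# made-up placeholder, not the MANMO model used in the study.
import random

def individual_risk(age, sex, max_temp, solar_radiation):
    """Toy probability of heat strain for one person on one hot day (assumed form)."""
    base = 0.002 + 0.0004 * max(0, max_temp - 35) + 0.0002 * solar_radiation
    age_factor = 1.0 + max(0, age - 65) * 0.05 + max(0, 10 - age) * 0.03
    sex_factor = 1.3 if (sex == "F" and age >= 75) else 1.0
    return min(1.0, base * age_factor * sex_factor)

def simulate_population(n, max_temp, solar_radiation, rng):
    cases = 0
    for _ in range(n):
        age = rng.randint(0, 95)
        sex = rng.choice(["F", "M"])
        if rng.random() < individual_risk(age, sex, max_temp, solar_radiation):
            cases += 1
    return cases

if __name__ == "__main__":
    rng = random.Random(1)
    print("hot, sunny day:", simulate_population(100_000, 42, 10, rng), "cases per 100k")
    print("hot, cloudy day:", simulate_population(100_000, 42, 2, rng), "cases per 100k")
```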

  9. New dimensions in surgical training: immersive virtual reality laparoscopic simulation exhilarates surgical staff.

    PubMed

    Huber, Tobias; Paschold, Markus; Hansen, Christian; Wunderling, Tom; Lang, Hauke; Kneist, Werner

    2017-11-01

    Virtual reality (VR) and head mount displays (HMDs) have been advanced for multimedia and information technologies but have scarcely been used in surgical training. Motion sickness and individual psychological changes have been associated with VR. The goal was to observe first experiences and performance scores using a new combined highly immersive virtual reality (IVR) laparoscopy setup. During the study, 10 members of the surgical department performed three tasks (fine dissection, peg transfer, and cholecystectomy) on a VR simulator. We then combined a VR HMD with the VR laparoscopic simulator and displayed the simulation on a 360° video of a laparoscopic operation to create an IVR laparoscopic simulation. The tasks were then repeated. Validated questionnaires on immersion and motion sickness were used for the study. Participants' times for fine dissection were significantly longer during the IVR session (regular: 86.51 s [62.57 s; 119.62 s] vs. IVR: 112.35 s [82.08 s; 179.40 s]; p = 0.022). The cholecystectomy task had higher error rates during IVR. Motion sickness did not occur at any time for any participant. Participants experienced a high level of exhilaration, rarely thought about others in the room, and had a high impression of presence in the generated IVR world. This is the first clinical and technical feasibility study using the full IVR laparoscopy setup combined with the latest laparoscopic simulator in a 360° surrounding. Participants were exhilarated by the high level of immersion. The setup enables a completely new generation of surgical training.

  10. Combination of electromagnetic measurements and FEM simulations for nondestructive determination of mechanical hardness

    NASA Astrophysics Data System (ADS)

    Gabi, Yasmine; Martins, Olivier; Wolter, Bernd; Strass, Benjamin

    2018-04-01

    This paper investigates Rockwell hardness prediction by finite element simulation for the inspection of press-hardened parts with the 3MA non-destructive testing system. The FEM model is based on a robust calculation strategy that manages the issues of geometry and time multiscale, as well as the local nonlinear hysteresis behavior of ferromagnetic materials. 3MA simulations are performed at a high-level operating point in order to saturate the soft microscopic surface layer of press-hardened steel and access mainly the bulk properties. 3MA measurements are validated by comparison with numerical simulations. Based on the simulation outputs, a virtual calibration is run. This result constitutes the first validation; the simulated calibration is in agreement with the conventional experimental data. As a highlight, a correlation between magnetic quantities and hardness can be described via FEM-simulated signals and shows high accuracy relative to the measured results.

  11. Contextual information influences diagnosis accuracy and decision making in simulated emergency medicine emergencies.

    PubMed

    McRobert, Allistair Paul; Causer, Joe; Vassiliadis, John; Watterson, Leonie; Kwan, James; Williams, Mark A

    2013-06-01

    It is well documented that adaptations in cognitive processes with increasing skill levels support decision making in multiple domains. We examined skill-based differences in cognitive processes in emergency medicine physicians, and whether performance was significantly influenced by the removal of contextual information related to a patient's medical history. Skilled (n=9) and less skilled (n=9) emergency medicine physicians responded to high-fidelity simulated scenarios under high- and low-context information conditions. Skilled physicians demonstrated higher diagnostic accuracy irrespective of condition, and were less affected by the removal of context-specific information compared with less skilled physicians. The skilled physicians generated more options, and selected better quality options during diagnostic reasoning compared with less skilled counterparts. These cognitive processes were active irrespective of the level of context-specific information presented, although high-context information enhanced understanding of the patients' symptoms resulting in higher diagnostic accuracy. Our findings have implications for scenario design and the manipulation of contextual information during simulation training.

  12. How to use MPI communication in highly parallel climate simulations more easily and more efficiently.

    NASA Astrophysics Data System (ADS)

    Behrens, Jörg; Hanke, Moritz; Jahns, Thomas

    2014-05-01

    In this talk we present a way to facilitate efficient use of MPI communication for developers of climate models. Exploiting the performance potential of today's highly parallel supercomputers with real-world simulations is a complex task. This is partly caused by the low-level nature of the MPI communication library, which is the dominant communication tool at least for inter-node communication. In order to manage the complexity of the task, climate simulations with non-trivial communication patterns often use an internal abstraction layer above MPI without exploiting the benefits of communication aggregation or MPI datatypes. The solution we propose for the complexity and performance problem is the communication library YAXT. This library is built on top of MPI and takes high-level descriptions of arbitrary domain decompositions and automatically derives an efficient collective data exchange. Several exchanges can be aggregated in order to reduce latency costs. Examples are given which demonstrate the simplicity and the performance gains for selected climate applications.

  13. Elucidating the mechanism of protein water channels by molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Grubmuller, Helmut

    2004-03-01

    Aquaporins are highly selective water channels. Molecular dynamics simulations of multiple water permeation events correctly predict the measured rate and explain at the atomic level why these membrane channels are so efficient, while blocking other small molecules, ions, and even protons. High efficiency is achieved through a carefully tailored balance of hydrogen bonds that the protein substitutes for the bulk interactions; selectivity is achieved mainly by electrostatic barriers.

  14. The effect of aircraft control forces on pilot performance during instrument landings in a flight simulator.

    PubMed

    Hewson, D J; McNair, P J; Marshall, R N

    2001-07-01

    Pilots may have difficulty controlling aircraft at both high and low force levels due to larger variability in force production at these force levels. The aim of this study was to measure the force variability and landing performance of pilots during an instrument landing in a flight simulator. There were 12 pilots who were tested while performing 5 instrument landings in a flight simulator, each of which required different control force inputs. Pilots can produce the least force when pushing the control column to the right, therefore the force levels for the landings were set relative to each pilot's maximum aileron-right force. The force levels for the landings were 90%, 60%, and 30% of maximal aileron-right force, normal force, and 25% of normal force. Variables recorded included electromyographic activity (EMG), aircraft control forces, aircraft attitude, perceived exertion and deviation from glide slope and heading. Multivariate analysis of variance was used to test for differences between landings. Pilots were least accurate in landing performance during the landing at 90% of maximal force (p < 0.05). There was also a trend toward decreased landing performance during the landing at 25% of normal force. Pilots were more variable in force production during the landings at 60% and 90% of maximal force (p < 0.05). Pilots are less accurate at performing instrument landings when control forces are high due to the increased variability of force production. The increase in variability at high force levels is most likely associated with motor unit recruitment, rather than rate coding. Aircraft designers need to consider the reduction in pilot performance at high force levels, as well as pilot strength limits when specifying new standards.

  15. Prospective randomized comparison of standard didactic lecture versus high-fidelity simulation for radiology resident contrast reaction management training.

    PubMed

    Wang, Carolyn L; Schopp, Jennifer G; Petscavage, Jonelle M; Paladin, Angelisa M; Richardson, Michael L; Bush, William H

    2011-06-01

    The objective of our study was to assess whether high-fidelity simulation-based training is more effective than traditional didactic lecture to train radiology residents in the management of contrast reactions. This was a prospective study of 44 radiology residents randomized into a simulation group versus a lecture group. All residents attended a contrast reaction didactic lecture. Four months later, baseline knowledge was assessed with a written test, which we refer to as the "pretest." After the pretest, the 21 residents in the lecture group attended a repeat didactic lecture and the 23 residents in the simulation group underwent high-fidelity simulation-based training with five contrast reaction scenarios. Next, all residents took a second written test, which we refer to as the "posttest." Two months after the posttest, both groups took a third written test, which we refer to as the "delayed posttest," and underwent performance testing with a high-fidelity severe contrast reaction scenario graded on predefined critical actions. There was no statistically significant difference between the simulation and lecture group pretest, immediate posttest, or delayed posttest scores. The simulation group performed better than the lecture group on the severe contrast reaction simulation scenario (p = 0.001). The simulation group reported improved comfort in identifying and managing contrast reactions and administering medications after the simulation training (p ≤ 0.04) and was more comfortable than the control group (p = 0.03), which reported no change in comfort level after the repeat didactic lecture. When compared with didactic lecture, high-fidelity simulation-based training of contrast reaction management shows equal results on written test scores but improved performance during a high-fidelity severe contrast reaction simulation scenario.

  16. Variability of individual genetic load: consequences for the detection of inbreeding depression.

    PubMed

    Restoux, Gwendal; Huot de Longchamp, Priscille; Fady, Bruno; Klein, Etienne K

    2012-03-01

    Inbreeding depression is a key factor affecting the persistence of natural populations, particularly when they are fragmented. In species with mixed mating systems, inbreeding depression can be estimated at the population level by regressing the average progeny fitness by the selfing rate of their mothers. We applied this method using simulated populations to investigate how population genetic parameters can affect the detection power of inbreeding depression. We simulated individual selfing rates and genetic loads from which we computed fitness values. The regression method yielded high statistical power, inbreeding depression being detected as significant (5 % level) in 92 % of the simulations. High individual variation in selfing rate and high mean genetic load led to better detection of inbreeding depression while high among-individual variation in genetic load made it more difficult to detect inbreeding depression. For a constant sampling effort, increasing the number of progenies while decreasing the number of individuals per progeny enhanced the detection power of inbreeding depression. We discuss the implication of among-mother variability of genetic load and selfing rate on inbreeding depression studies.
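
    A minimal sketch of the regression method described above, under invented parameter values: maternal selfing rates and genetic loads are simulated, average progeny fitness is computed, and inbreeding depression is estimated (and tested for significance) as the slope of mean progeny fitness on maternal selfing rate.

```python
# Minimal sketch, with invented parameters: simulate maternal selfing rates and genetic
# loads, compute average progeny fitness, and regress fitness on selfing rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_mothers, n_progeny = 50, 20

selfing = rng.beta(2, 2, size=n_mothers)           # individual selfing rates
load = rng.normal(0.4, 0.1, size=n_mothers)        # individual genetic loads (toy values)

mean_fitness = np.empty(n_mothers)
for i in range(n_mothers):
    selfed = rng.random(n_progeny) < selfing[i]    # which progeny are selfed
    fitness = np.where(selfed, 1.0 - load[i], 1.0) # selfed progeny pay the mother's load
    fitness += rng.normal(0, 0.05, size=n_progeny) # environmental noise
    mean_fitness[i] = fitness.mean()

slope, intercept, r, p, se = stats.linregress(selfing, mean_fitness)
print(f"estimated inbreeding depression (negative slope): {-slope:.2f}, p = {p:.4f}")
```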

  17. Investigation of Blade Impulsive Noise on a Scaled Fully Articulated Rotor System

    NASA Technical Reports Server (NTRS)

    Scheiman, James; Hoad, Danny R.

    1977-01-01

    Helicopter impulsive noise tests were conducted in the Langley V/STOL tunnel with an articulated rotor system. The tests demonstrated that impulsive noise could be simulated for low-speed forward flight with low descent rates and also in high-speed level flight. For the low forward speed condition, the noise level was highly sensitive to small changes in descent rate. For the high-speed condition, the noise level increased with an increase in rotor thrust.

  18. Optical Simulation of Debye-Scherrer Crystal Diffraction

    ERIC Educational Resources Information Center

    Logiurato, F.; Gratton, L. M.; Oss, S.

    2008-01-01

    In this paper we describe and discuss simple, inexpensive optical experiments used to simulate x-ray and electron diffraction according to the Debye-Scherrer theory. The experiment can be used to address, at the high school level, important subjects related to fundamental quantum and solid-state physics.

  19. Feasibility and fidelity of practising surgical fixation on a virtual ulna bone

    PubMed Central

    LeBlanc, Justin; Hutchison, Carol; Hu, Yaoping; Donnon, Tyrone

    2013-01-01

    Background: Surgical simulators provide a safe environment to learn and practise psychomotor skills. A goal for these simulators is to achieve high levels of fidelity. The purpose of this study was to develop a reliable surgical simulator fidelity questionnaire and to assess whether a newly developed virtual haptic simulator for fixation of an ulna has comparable levels of fidelity to Sawbones. Methods: Simulator fidelity questionnaires were developed. We performed a stratified randomized study with surgical trainees. They performed fixation of the ulna using a virtual simulator and Sawbones. They completed the fidelity questionnaires after each procedure. Results: Twenty-two trainees participated in the study. The reliability of the fidelity questionnaire for each separate domain (environment, equipment, psychological) was Cronbach α greater than 0.70, except for virtual environment. The Sawbones had significantly higher levels of fidelity than the virtual simulator (p < 0.001) with a large effect size difference (Cohen d < 1.3). Conclusion: The newly developed fidelity questionnaire is a reliable tool that can potentially be used to determine the fidelity of other surgical simulators. Increasing the fidelity of this virtual simulator is required before its use as a training tool for surgical fixation. The virtual simulator brings with it the added benefits of repeated, independent safe use with immediate, objective feedback and the potential to alter the complexity of the skill. PMID:23883510

  20. Integrated Fault Diagnosis Algorithm for Motor Sensors of In-Wheel Independent Drive Electric Vehicles

    PubMed Central

    Jeon, Namju; Lee, Hyeongcheol

    2016-01-01

    An integrated fault-diagnosis algorithm for a motor sensor of in-wheel independent drive electric vehicles is presented. This paper proposes a method that integrates the high- and low-level fault diagnoses to improve the robustness and performance of the system. For the high-level fault diagnosis of vehicle dynamics, a planar two-track non-linear model is first selected, and the longitudinal and lateral forces are calculated. To ensure redundancy of the system, correlation between the sensor and residual in the vehicle dynamics is analyzed to detect and separate the fault of the drive motor system of each wheel. To diagnose the motor system for low-level faults, the state equation of an interior permanent magnet synchronous motor is developed, and a parity equation is used to diagnose the fault of the electric current and position sensors. The validity of the high-level fault-diagnosis algorithm is verified using Carsim and Matlab/Simulink co-simulation. The low-level fault diagnosis is verified through Matlab/Simulink simulation and experiments. Finally, according to the residuals of the high- and low-level fault diagnoses, fault-detection flags are defined. On the basis of this information, an integrated fault-diagnosis strategy is proposed. PMID:27973431
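
    As a generic illustration of residual-based diagnosis (not the interior-PMSM parity equations used in the paper), the sketch below flags a sensor fault whenever the residual between a measured signal and its model prediction exceeds a threshold; the signal, injected bias, and threshold are invented.

```python
# Hedged sketch of residual-based sensor fault detection: compare a measured signal
# against a model prediction and raise a flag when the residual exceeds a threshold.
import numpy as np

def detect_sensor_fault(measured, predicted, threshold):
    """Return a boolean fault flag per sample based on the residual magnitude."""
    residual = np.asarray(measured) - np.asarray(predicted)
    return np.abs(residual) > threshold

if __name__ == "__main__":
    t = np.linspace(0, 1, 500)
    predicted_current = 5.0 * np.sin(2 * np.pi * 10 * t)     # model-predicted phase current
    measured = predicted_current + np.random.normal(0, 0.05, t.size)
    measured[300:] += 1.5                                    # inject a sensor bias fault
    flags = detect_sensor_fault(measured, predicted_current, threshold=0.5)
    print("first flagged sample index:", int(np.argmax(flags)))
```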

  1. VHDL simulation with access to transistor models

    NASA Technical Reports Server (NTRS)

    Gibson, J.

    1991-01-01

    Hardware description languages such as VHDL have evolved to aid in the design of systems with large numbers of elements and a wide range of electronic and logical abstractions. For high performance circuits, behavioral models may not be able to efficiently include enough detail to give designers confidence in a simulation's accuracy. One option is to provide a link between the VHDL environment and a transistor level simulation environment. The coupling of the Vantage Analysis Systems VHDL simulator and the NOVA simulator provides the combination of VHDL modeling and transistor modeling.

  2. Evaluating average and atypical response in radiation effects simulations

    NASA Astrophysics Data System (ADS)

    Weller, R. A.; Sternberg, A. L.; Massengill, L. W.; Schrimpf, R. D.; Fleetwood, D. M.

    2003-12-01

    We examine the limits of performing single-event simulations using pre-averaged radiation events. Geant4 simulations show the necessity, for future devices, to supplement current methods with ensemble averaging of device-level responses to physically realistic radiation events. Initial Monte Carlo simulations have generated a significant number of extremal events in local energy deposition. These simulations strongly suggest that proton strikes of sufficient energy, even those that initiate purely electronic interactions, can initiate device response capable in principle of producing single event upset or microdose damage in highly scaled devices.
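
    The case for ensemble averaging over realistic events, rather than pre-averaging the radiation input, can be illustrated with a toy calculation: for a nonlinear (threshold-like) device response and a heavy-tailed distribution of deposited charge, the response to the mean event differs from the mean of the responses. The distribution, scaling, and critical charge below are invented and are not from the Geant4 study.

```python
# Sketch of why ensemble averaging of device responses can differ from using a single
# pre-averaged event: with a threshold-like response and a heavy-tailed distribution
# of deposited charge, response(mean event) != mean(response(events)).
import random

def upset_probability(deposited_charge, q_critical=2.0):
    return 1.0 if deposited_charge >= q_critical else 0.0   # toy threshold response

rng = random.Random(3)
events = [rng.paretovariate(2.5) * 0.4 for _ in range(100_000)]  # heavy-tailed depositions

mean_event = sum(events) / len(events)
response_of_mean = upset_probability(mean_event)
mean_of_responses = sum(upset_probability(q) for q in events) / len(events)

print(f"response of pre-averaged event: {response_of_mean:.4f}")
print(f"ensemble-averaged response:     {mean_of_responses:.4f}")
```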

  3. Airspace Technology Demonstration 2 (ATD-2) Integrated Surface and Airspace Simulation - Experiment Plan

    NASA Technical Reports Server (NTRS)

    Verma, Savita Arora

    2017-01-01

    This presentation describes the objectives and high-level setup for the human-in-the-loop simulation of the integrated surface and airspace simulation of the ATD-2 Integrated Arrival, Departure, Surface (IADS) system. The purpose of the simulation is to evaluate the functionality of the IADS system, including the tactical surface scheduler, negotiation of departure times for flights under Traffic Management Initiatives (TMIs), and data exchange between the ATC Tower and airline Ramp. The same presentation was used for the experiment review prior to the simulation.

  4. Vectorization for Molecular Dynamics on Intel Xeon Phi Coprocessors

    NASA Astrophysics Data System (ADS)

    Yi, Hongsuk

    2014-03-01

    Many modern processors are capable of exploiting data-level parallelism through the use of single instruction multiple data (SIMD) execution. The new Intel Xeon Phi coprocessor supports 512-bit vector registers for high performance computing. In this paper, we have developed a hierarchical parallelization scheme for accelerated molecular dynamics simulations with the Tersoff potentials for covalent-bond solid crystals on Intel Xeon Phi coprocessor systems. The scheme exploits multi-level parallel computing: tightly coupled thread-level and task-level parallelism is combined with the 512-bit vector registers. The simulation results show that the parallel performance of the SIMD implementations on Xeon Phi is clearly superior to that of the x86 CPU architecture.
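
    The idea of data-level parallelism can be illustrated with a NumPy analogy: the same stencil-like computation written as an explicit scalar loop and as whole-array operations that a vectorizing backend can map to SIMD units. This is only an analogy in Python; the paper's Tersoff-potential kernels target the Xeon Phi's 512-bit registers from compiled code.

```python
# NumPy analogy for data-level parallelism: an explicit scalar loop versus the same
# computation as whole-array operations. Not the paper's Tersoff kernels.
import numpy as np

def forces_loop(x, k=1.0):
    """Toy 1-D spring forces between neighbouring particles, scalar loop."""
    f = np.zeros_like(x)
    for i in range(1, len(x) - 1):
        f[i] = k * (x[i + 1] - 2.0 * x[i] + x[i - 1])
    return f

def forces_vectorized(x, k=1.0):
    """Same computation expressed as whole-array (vectorizable) operations."""
    f = np.zeros_like(x)
    f[1:-1] = k * (x[2:] - 2.0 * x[1:-1] + x[:-2])
    return f

x = np.random.default_rng(0).random(10_000)
assert np.allclose(forces_loop(x), forces_vectorized(x))
```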

  5. Soil Erosion in agro-industrially used Landscapes between High and Anti-Atlas

    NASA Astrophysics Data System (ADS)

    Peter, K. D.; Ries, J. B.; Marzolff, I.; d'Oleire-Oltmanns, S.

    2012-04-01

    The Souss basin is characterised by high population dynamics and changing land use. Extensive plantations of citrus fruits, bananas and vegetables in monocropping, mainly for the European market, are replacing the traditional mixed agriculture of small-area olive orchards and cereal fields. With precipitation of around 200 mm, cultivation requires irrigation from deep wells. The spatial vicinity of highly engineered irrigation areas, which are often created by land-levelling measures, and housing estates with highly active gully systems and rapid badland development presents a risk to both the agro-industrial land use and the settlements. It is investigated whether the levelling measures influence surface runoff and soil erosion and thereby affect further gully development. The influences of surface characteristics on runoff and soil erosion are analysed. For this purpose, 91 rainfall simulation experiments using a small portable rainfall simulator and 33 infiltration measurements with a single-ring infiltrometer are carried out on seven test sites near the city of Taroudant. The rainfall simulations (30 minutes, 40 mm h-1) show an average runoff coefficient between 54 and 59 % on test sites with land-levelling measures and average runoff coefficients between 36 and 48 % on mostly non-levelled test sites. Average soil erosion per 30 minutes lies between 52.1 and 81.8 g m-2 on levelled test sites and between 13.2 and 23.2 g m-2 on non-levelled test sites. Accordingly, all the test sites have a rather low infiltration capacity. This is also confirmed by the low average infiltration depth of only 15.5 cm on levelled test sites; there is often a clear boundary at horizons with a high bulk density caused by compaction. In contrast, on non-levelled test sites, the average infiltration depth reaches 22.2 cm. Slope and soil crusts reinforce runoff and soil erosion, whereas vegetation cover reduces surface process activity. Medium rock fragment cover shows high rates of runoff and soil erosion. The data collected so far show a clear difference between levelled and non-levelled test sites: land-levelling measures clearly influence the generation of surface runoff and soil erosion and, consequently, advance further gully development.
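
    For reference, the runoff coefficient used above is the fraction of applied rainfall depth that leaves the plot as surface runoff; a 30-minute simulation at 40 mm h-1 applies 20 mm of water. The short calculation below uses a hypothetical measured runoff depth to show the arithmetic.

```python
# Worked arithmetic for the runoff coefficient, with an invented measured runoff depth.
rain_intensity_mm_per_h = 40.0
duration_h = 0.5
applied_depth_mm = rain_intensity_mm_per_h * duration_h        # 20 mm applied

runoff_depth_mm = 11.0    # hypothetical measured runoff depth from one plot
runoff_coefficient = runoff_depth_mm / applied_depth_mm
print(f"runoff coefficient: {runoff_coefficient:.0%}")          # about 55 %
```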

  6. Risk assessment predictions of open dumping area after closure using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Pauzi, Nur Irfah Mohd; Radhi, Mohd Shahril Mat; Omar, Husaini

    2017-10-01

    Currently, there are many abandoned open dumping areas that were left without any proper mitigation measures. These open dumping areas could pose a serious hazard to humans and pollute the environment. The objective of this paper is to determine the risk assessment of an open dumping area after it has been closed, using the Monte Carlo simulation method. The risk assessment exercise is conducted at the Kuala Lumpur dumping area. The rapid urbanisation of Kuala Lumpur, coupled with population growth, leads to an increase in waste generation and hence to more dumping/landfill areas in Kuala Lumpur. The first stage of this study involves the assessment of the dumping area and sample collection, followed by measurement of the settlement of the dumping area using an oedometer. The risk of the settlement is predicted using the Monte Carlo simulation method, which calculates the risk and the long-term settlement. The model simulation results show that the risk level of the Kuala Lumpur open dumping area ranges between Level III and Level IV, i.e., between medium and high risk. The predicted settlement (ΔH) is between 3 and 7 meters. Since the risk is between medium and high, mitigation measures are required, such as replacing the top waste soil with new sandy gravel soil; this will increase the strength of the soil and reduce the settlement.
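
    A rough sketch of the Monte Carlo step is given below: a long-term settlement is sampled from assumed input distributions and mapped to a qualitative risk level. The distributions, coefficients, and level boundaries are invented for illustration and do not reproduce the oedometer-based model of the study.

```python
# Rough Monte Carlo sketch with invented input distributions and an invented mapping
# from settlement to risk level; not the model used in the study.
import random

def sample_settlement(rng):
    compressibility = rng.lognormvariate(0, 0.3)    # toy soil parameter
    waste_thickness = rng.uniform(8.0, 15.0)        # metres (assumed range)
    return 0.4 * compressibility * waste_thickness  # toy settlement model, metres

def risk_level(settlement_m):
    if settlement_m < 2.0:
        return "Level II (low)"
    if settlement_m < 5.0:
        return "Level III (medium)"
    return "Level IV (high)"

rng = random.Random(42)
samples = [sample_settlement(rng) for _ in range(10_000)]
print(f"mean settlement: {sum(samples) / len(samples):.1f} m")
print("P(high risk):", sum(risk_level(s).startswith("Level IV") for s in samples) / len(samples))
```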

  7. Heat shock protein 70 as a biomarker of heat stress in a simulated hot cockpit.

    PubMed

    Kumar, Yadunanda; Chawla, Anuj; Tatu, Utpal

    2003-07-01

    Fighter pilots are frequently exposed to high temperatures during high-speed low-level flight. Heat strain can result in temporary impairment of cognitive functions and when severe, loss of consciousness and consequent loss of life and equipment. Induction of stress proteins is a highly conserved stress response mechanism from bacteria to humans. Induced stress protein levels are known to be cytoprotective and have been correlated with stress tolerance. Although many studies on the heat shock response mechanisms have been performed in cell culture and animal model systems, there is very limited information on stress protein induction in human subjects. Heat shock proteins (Hsp), especially Hsp70, may be induced in human subjects exposed to high temperatures in a hot cockpit designed to simulate heat stress experienced in low flying sorties. Six healthy volunteers were subjected to heat stress at 55 degrees C in a high temperature cockpit simulator for a period of 1 h at 30% humidity. Physiological parameters such as oral and skin temperatures, heart rate, and sweat rate were monitored regularly during this time. The level of Hsp70 in leukocytes was examined before and after the heat exposure in each subject. Hsp70 was found to be significantly induced in all the six subjects exposed to heat stress. The level of induced Hsp70 appears to correlate with other strain indicators such as accumulative circulatory strain and Craig's modified index. The usefulness of Hsp70 as a molecular marker of heat stress in humans is discussed.

  8. Evaluation of East Asian climatology as simulated by seven coupled models

    NASA Astrophysics Data System (ADS)

    Jiang, Dabang; Wang, Huijun; Lang, Xianmei

    2005-07-01

    Using observation and reanalysis data throughout 1961-1990, the East Asian surface air temperature, precipitation and sea level pressure climatologies as simulated by seven fully coupled atmosphere-ocean models, namely CCSR/NIES, CGCM2, CSIRO-Mk2, ECHAM4/OPYC3, GFDL-R30, HadCM3, and NCAR-PCM, are systematically evaluated in this study. It is indicated that the above models can successfully reproduce the annual and seasonal surface air temperature and precipitation climatology in East Asia, with relatively good performance for boreal autumn and annual mean. The models' ability to simulate surface air temperature is more reliable than precipitation. In addition, the models can dependably capture the geographical distribution pattern of annual, boreal winter, spring and autumn sea level pressure in East Asia. In contrast, relatively large simulation errors are displayed when simulated boreal summer sea level pressure is compared with reanalysis data in East Asia. It is revealed that the simulation errors for surface air temperature, precipitation and sea level pressure are generally large over and around the Tibetan Plateau. No individual model is best in every aspect. As a whole, the ECHAM4/OPYC3 and HadCM3 performances are much better, whereas the CGCM2 is relatively poorer in East Asia. Additionally, the seven-model ensemble mean usually shows a relatively high reliability.

  9. Second Breakdown of 18V Grounded Gate NMOS induced by the Kirk Effect under Electrostatic Discharge

    NASA Astrophysics Data System (ADS)

    Jeon, Byung-Chul; Lee, Seung-Chul; Han, Min-Koo

    2003-09-01

    Electrostatic Discharge (ESD) failure mechanisms of 18V grounded gate NMOS (GGNMOS) for liquid crystal display driver IC (LDI) applications are investigated and effects of layout design parameters on the ESD immunity level are analyzed. Experimental results show that 18V GGNMOS exhibits snapback characteristics and the ESD immunity level is rather high when XO (N-drift overlap over n+ source/drain) is sufficiently large, while GGNMOS does not exhibit the sustaining region and is very vulnerable to ESD stress when XO is relatively small. Simulation results show that the ESD failure mechanism of 18V GGNMOS could be the low-temperature second breakdown induced by the Kirk effect. It is inferred that a certain amount of XO is indispensable to ensure snapback characteristics and high ESD immunity level. Simulation results also show that the ESD immunity level is increased as drain contact to gate space (DCGS) is increased.

  10. The development of a capability for aerodynamic testing of large-scale wing sections in a simulated natural rain environment

    NASA Technical Reports Server (NTRS)

    Bezos, Gaudy M.; Cambell, Bryan A.; Melson, W. Edward

    1989-01-01

    A research technique to obtain large-scale aerodynamic data in a simulated natural rain environment has been developed. A 10-ft chord NACA 64-210 wing section equipped with leading-edge and trailing-edge high-lift devices was tested as part of a program to determine the effect of highly concentrated, short-duration rainfall on airplane performance. Preliminary dry aerodynamic data are presented for the high-lift configuration at a velocity of 100 knots and an angle of attack of 18 deg. Also, data are presented on rainfield uniformity and rainfall concentration intensity levels obtained during the calibration of the rain simulation system.

  11. The effects of simulated patients and simulated gynecologic models on student anxiety in providing IUD services.

    PubMed

    Khadivzadeh, Talat; Erfanian, Fatemeh

    2012-10-01

    Midwifery students experience high levels of stress during their initial clinical practice. Addressing the learner's sources of anxiety and discomfort can ease the learning experience and lead to better outcomes. The aim of this study was to determine the effect of a simulation-based course, using simulated patients and simulated gynecologic models, on student anxiety and comfort while practicing to provide intrauterine device (IUD) services. Fifty-six eligible midwifery students were randomly allocated to simulation-based and traditional training groups. They participated in a 12-hour workshop on providing IUD services. The simulation group was trained through an educational program including simulated gynecologic models and simulated patients. The students in both groups then practiced IUD consultation and insertion with real patients in the clinic. The students' anxiety in IUD insertion was assessed using the "Spielberger anxiety test" and the "comfort in providing IUD services" questionnaire. There were significant differences between the simulation and traditional groups in state anxiety (P < 0.001), trait anxiety (P = 0.024), and the level of comfort (P = 0.000) in providing IUD services. "Fear of uterine perforation during insertion" was the most important cause of students' anxiety in providing IUD services, reported by 74.34% of students. Simulated patients and simulated gynecologic models are effective in optimizing students' anxiety levels when practicing to deliver IUD services. Therefore, it is recommended that simulated patients and simulated gynecologic models be used before engaging students in real clinical practice.

  12. Simulation of the High Performance Time to Digital Converter for the ATLAS Muon Spectrometer trigger upgrade

    NASA Astrophysics Data System (ADS)

    Meng, X. T.; Levin, D. S.; Chapman, J. W.; Zhou, B.

    2016-09-01

    The ATLAS Muon Spectrometer endcap thin-Resistive Plate Chamber trigger project complements the New Small Wheel endcap Phase-1 upgrade for higher luminosity LHC operation. These new trigger chambers, located in a high-rate region of ATLAS, will improve overall trigger acceptance and reduce the fake muon trigger incidence. These chambers must generate a low-level muon trigger to be delivered to a remote high-level processor within a stringent latency requirement of 43 bunch crossings (1075 ns). To help meet this requirement, the High Performance Time to Digital Converter (HPTDC), a multi-channel ASIC designed by the CERN Microelectronics group, has been proposed for the digitization of the fast front-end detector signals. This paper investigates the HPTDC performance in the context of the overall muon trigger latency, employing detailed behavioral Verilog simulations in which the latency in triggerless mode is measured for a range of configurations and under realistic hit rate conditions. The simulation results show that various HPTDC operational configurations, including leading-edge and pair measurement modes, can provide high efficiency (>98%) in capturing and digitizing hits within a time interval satisfying the Phase-1 latency tolerance.

  13. In situ visualization for large-scale combustion simulations.

    PubMed

    Yu, Hongfeng; Wang, Chaoli; Grout, Ray W; Chen, Jacqueline H; Ma, Kwan-Liu

    2010-01-01

    As scientific supercomputing moves toward petascale and exascale levels, in situ visualization stands out as a scalable way for scientists to view the data their simulations generate. This full picture is crucial particularly for capturing and understanding highly intermittent transient phenomena, such as ignition and extinction events in turbulent combustion.

  14. A Simulated Journey

    ERIC Educational Resources Information Center

    Yoder, Lisa

    2006-01-01

    Students learn best when they interact with new information on a personal level. It is a challenge for teachers to tightly align student experiences with the standards assessed on high-stakes tests. To achieve this goal in social studies, the author has turned increasingly to simulations where students find such activities engaging, and their…

  15. Rater Training to Support High-Stakes Simulation-Based Assessments

    ERIC Educational Resources Information Center

    Feldman, Moshe; Lazzara, Elizabeth H.; Vanderbilt, Allison A.; DiazGranados, Deborah

    2012-01-01

    Competency-based assessment and an emphasis on obtaining higher-level outcomes that reflect physicians' ability to demonstrate their skills has created a need for more advanced assessment practices. Simulation-based assessments provide medical education planners with tools to better evaluate the 6 Accreditation Council for Graduate Medical…

  16. Robust state preparation in quantum simulations of Dirac dynamics

    NASA Astrophysics Data System (ADS)

    Song, Xue-Ke; Deng, Fu-Guo; Lamata, Lucas; Muga, J. G.

    2017-02-01

    A nonrelativistic system such as an ultracold trapped ion may perform a quantum simulation of a Dirac equation dynamics under specific conditions. The resulting Hamiltonian and dynamics are highly controllable, but the coupling between momentum and internal levels poses some difficulties to manipulate the internal states accurately in wave packets. We use invariants of motion to inverse engineer robust population inversion processes with a homogeneous, time-dependent simulated electric field. This exemplifies the usefulness of inverse-engineering techniques to improve the performance of quantum simulation protocols.

  17. An LED solar simulator for student labs

    NASA Astrophysics Data System (ADS)

    González, Manuel I.

    2017-05-01

    Measuring voltage-current and voltage-power curves of a photovoltaic module is a nice experiment for high school and undergraduate students. In labs where real sunlight is not available this experiment requires a solar simulator. A prototype of a simulator using LED lamps has been manufactured and tested, and a comparison with classical halogen simulators has been performed. It is found that LED light offers lower levels of irradiance, but much better performance in terms of module output for a given irradiance.
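
    The analysis students would perform with such a simulator can be sketched in a few lines: compute the power-voltage curve from measured voltage-current pairs and locate the maximum power point. The readings below are invented sample values, not measurements from the paper.

```python
# Compute the power-voltage curve from (invented) measured I-V pairs and find the
# maximum power point of the photovoltaic module.
voltage = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 11.0, 11.8, 12.3]      # V
current = [0.52, 0.51, 0.50, 0.49, 0.47, 0.42, 0.33, 0.18, 0.0]  # A

power = [v * i for v, i in zip(voltage, current)]
v_mpp, p_mpp = max(zip(voltage, power), key=lambda vp: vp[1])
print(f"maximum power point: {p_mpp:.2f} W at {v_mpp:.1f} V")
```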

  18. Evaluation of Cloud-resolving and Limited Area Model Intercomparison Simulations using TWP-ICE Observations. Part 1: Deep Convective Updraft Properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varble, A. C.; Zipser, Edward J.; Fridlind, Ann

    2014-12-27

    Ten 3D cloud-resolving model (CRM) simulations and four 3D limited area model (LAM) simulations of an intense mesoscale convective system observed on January 23-24, 2006 during the Tropical Warm Pool – International Cloud Experiment (TWP-ICE) are compared with each other and with observed radar reflectivity fields and dual-Doppler retrievals of vertical wind speeds in an attempt to explain published results showing a high bias in simulated convective radar reflectivity aloft. This high bias results from ice water content being large, which is a product of large, strong convective updrafts, although hydrometeor size distribution assumptions modulate the size of this bias. Snow reflectivity can exceed 40 dBZ in a two-moment scheme when a constant bulk density of 100 kg m-3 is used. Making snow mass more realistically proportional to area rather than volume should somewhat alleviate this problem. Graupel, unlike snow, produces high biased reflectivity in all simulations. This is associated with large amounts of liquid water above the freezing level in updraft cores. Peak vertical velocities in deep convective updrafts are greater than dual-Doppler retrieved values, especially in the upper troposphere. Freezing of large rainwater contents lofted above the freezing level in simulated updraft cores greatly contributes to these excessive upper tropospheric vertical velocities. Strong simulated updraft cores are nearly undiluted, with some showing supercell characteristics. Decreasing horizontal grid spacing from 900 meters to 100 meters weakens strong updrafts, but not enough to match observational retrievals. Therefore, overly intense simulated updrafts may partly be a product of interactions between convective dynamics, parameterized microphysics, and large-scale environmental biases that promote different convective modes and strengths than observed.

  19. Quality assurance study of caries risk assessment performance by clinical faculty members in a school of dentistry.

    PubMed

    Rechmann, Peter; Featherstone, John D B

    2014-09-01

    The goal of this quality assurance study was to explore the decision making of clinical faculty members at the University of California, San Francisco School of Dentistry predoctoral dental clinic in terms of caries risk level assignment using the caries risk assessment (CRA) as part of the Caries Management by Risk Assessment (CAMBRA) concept. This research was done in part to determine if additional training and calibration were needed for these faculty members. The study tested the reliability and reproducibility of the caries risk levels assigned by different clinical teachers who completed CRA forms for simulated patients. In the first step, five clinical teachers assigned caries risk levels for thirteen simulated patients. Six months later, the same five plus an additional nine faculty members assigned caries risk levels to the same thirteen simulated cases and nine additional cases. While the intra-examiner reliability with weighted kappa strength of agreement was very high, the inter-examiner agreements with a gold standard were on average only moderate. In total, 20 percent of the presented high caries risk cases were underestimated and assigned risk levels that were too low, even when obvious caries disease indicators were present. This study suggests that more consistent training and calibration of clinical faculty members as well as students are needed.
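
    The weighted-kappa agreement statistic mentioned above can be computed, for example, with scikit-learn (assuming it is available); the ratings below are invented, and the four risk labels are only an assumed ordering.

```python
# Hedged sketch: linearly weighted Cohen's kappa between one rater's risk levels and a
# gold standard, using invented ratings and an assumed label ordering.
from sklearn.metrics import cohen_kappa_score

gold  = ["low", "moderate", "high", "extreme", "high", "moderate", "high", "low"]
rater = ["low", "moderate", "high", "high",    "high", "low",      "high", "low"]

levels = ["low", "moderate", "high", "extreme"]   # assumed ordered risk levels
kappa = cohen_kappa_score(gold, rater, labels=levels, weights="linear")
print(f"linearly weighted kappa vs. gold standard: {kappa:.2f}")
```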

  20. Late Cretaceous climate simulations with different CO2 levels and subarctic gateway configurations: A model-data comparison

    NASA Astrophysics Data System (ADS)

    Niezgodzki, Igor; Knorr, Gregor; Lohmann, Gerrit; Tyszka, Jarosław; Markwick, Paul J.

    2017-09-01

    We investigate the impact of different CO2 levels and different subarctic gateway configurations on the surface temperatures during the latest Cretaceous using the Earth System Model COSMOS. The simulated temperatures are compared with the surface temperature reconstructions based on a recent compilation of the latest Cretaceous proxies. In our numerical experiments, the CO2 level ranges from 1 to 6 times the preindustrial (PI) CO2 level of 280 ppm. On a global scale, the most reasonable match between modeling and proxy data is obtained for the experiments with 3 to 5 × PI CO2 concentrations. However, the simulated low- (high-) latitude temperatures are too high (low) as compared to the proxy data. The moderate CO2 level scenarios might be more realistic if we take into account proxy data and the dead zone effect criterion. Furthermore, we test if the model-data discrepancies can be caused by too simplistic proxy-data interpretations. This is distinctly seen at high latitudes, where most proxies are biased toward summer temperatures. Additional sensitivity experiments with different ocean gateway configurations and constant CO2 level indicate only minor surface temperature changes (< 1°C) on a global scale, with higher values (up to 8°C) on a regional scale. These findings imply that modeled and reconstructed temperature gradients are to a large degree only qualitatively comparable, providing challenges for the interpretation of proxy data and/or model sensitivity. With respect to the latter, our results suggest that an assessment of greenhouse worlds is best constrained by temperatures in the midlatitudes.

  1. Regional Sea Level Changes and Projections over North Pacific Driven by Air-sea interaction and Inter-basin Teleconnections

    NASA Astrophysics Data System (ADS)

    Li, X.; Zhu, J.; Xie, S. P.

    2017-12-01

    Since the launch of the TOPEX/Poseidon satellite in 1992, a series of regional sea level changes have been observed. The northwestern Pacific is among the most rapid sea-level-rise regions in the world. The rising peak occurs around 40°N, with the value reaching 15 cm in the past two decades. Moreover, when investigating the projection of global sea level changes using CMIP5 rcp simulations, we found that the northwestern Pacific remains one of the most rapid sea-level-rise regions in the 21st century. To investigate the physical dynamics of present and future sea level changes over the Pacific, we performed a series of numerical simulations with a hierarchy of climate models, including an earth system model, an ocean model, and atmospheric models of different complexity. Simulation results indicate that the regional sea level change during the past two decades is mainly caused by the shift of the Kuroshio, which is largely driven by the surface wind anomaly associated with an intensified and northward-shifted North Pacific subtropical high. Further analysis and simulations show that these changes of the subtropical high can be primarily attributed to the regional SST forcing from the Pacific Decadal Oscillation and the remote SST forcings from the tropical Atlantic and the Indian Ocean. In the rcp scenario, on the other hand, two processes are crucial. First, the meridional SST gradient drives a northward wind anomaly across the equator, raising the sea level over the North Pacific. Second, the atmospheric circulation changes around the subtropical Pacific further increase the sea level of the northwestern Pacific. The coastal region around the Northwest Pacific is the most densely populated region in the world; therefore, more attention must be paid to the sea level changes over this region, as suggested by our study.

  2. Multi-Scale Simulation of High Energy Density Ionic Liquids

    DTIC Science & Technology

    2007-06-19

    and simulation of ionic liquids (ILs). A polarizable model was developed to simulate ILs more accurately at the atomistic level. A multiscale coarse...propellant, 1-hydroxyethyl-4-amino-1,2,4-triazolium nitrate (HEATN), were studied with the all-atom polarizable model. The mechanism suggested for HEATN...with this AFOSR-supported project, a polarizable forcefield for ionic liquids such as 1-ethyl-3-methylimidazolium nitrate (EMIM+/NO3-) was

  3. Predictions of Crystal Structures from First Principles

    DTIC Science & Technology

    2007-06-01

    RDX crystal in hoped that the problem could be resolved by the molecular dynamics simulations . The fully ab initio development of density functional... Molecular Dynamics Simulations of RDX i.e., without any use of experimental results (except that Crystal the geometry of monomers was derived from X-ray...applied in molecular dynamics simulations of the RDX system, due to its size, is intractable by any high-level ab crystal. We performed isothermal

  4. RAINIER: A simulation tool for distributions of excited nuclear states and cascade fluctuations

    NASA Astrophysics Data System (ADS)

    Kirsch, L. E.; Bernstein, L. A.

    2018-06-01

    A new code has been developed named RAINIER that simulates the γ-ray decay of discrete and quasi-continuum nuclear levels for a user-specified range of energy, angular momentum, and parity including a realistic treatment of level spacing and transition width fluctuations. A similar program, DICEBOX, uses the Monte Carlo method to simulate level and width fluctuations but is restricted in its initial level population algorithm. On the other hand, modern reaction codes such as TALYS and EMPIRE populate a wide range of states in the residual nucleus prior to γ-ray decay, but do not go beyond the use of deterministic functions and therefore neglect cascade fluctuations. This combination of capabilities allows RAINIER to be used to determine quasi-continuum properties through comparison with experimental data. Several examples are given that demonstrate how cascade fluctuations influence experimental high-resolution γ-ray spectra from reactions that populate a wide range of initial states.
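
    One ingredient of such simulations is a level sequence whose nearest-neighbour spacings fluctuate realistically; a common choice is the Wigner surmise, P(s) = (π/2) s exp(-π s²/4), sampled here by inverse transform. Whether this matches RAINIER's exact prescription is an assumption of this sketch.

```python
# Illustrative sketch: build a level sequence with nearest-neighbour spacings drawn from
# the Wigner surmise (a common choice; assumed here, not verified against RAINIER).
import math
import random

def wigner_spacing(rng, mean_spacing=1.0):
    """Sample one nearest-neighbour spacing from the Wigner surmise (unit mean)."""
    u = rng.random()
    return mean_spacing * 2.0 * math.sqrt(-math.log(1.0 - u) / math.pi)

def build_levels(n_levels, mean_spacing, rng):
    levels, e = [], 0.0
    for _ in range(n_levels):
        e += wigner_spacing(rng, mean_spacing)
        levels.append(e)
    return levels

rng = random.Random(7)
levels = build_levels(1000, mean_spacing=0.05, rng=rng)   # MeV, toy level density
spacings = [b - a for a, b in zip(levels, levels[1:])]
print(f"mean spacing: {sum(spacings) / len(spacings):.4f} MeV")
```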

  5. Development of Realistic Synthetic Data Products for the Tempo Geostationary Mission

    NASA Astrophysics Data System (ADS)

    Chan Miller, C.; Gonzalez Abad, G.; Zoogman, P.; Spurr, R. J. D.; Keller, C. A.; Liu, X.; Chance, K.

    2017-12-01

    TEMPO is a future geostationary satellite instrument designed to measure atmospheric pollution from solar backscatter over greater North America. Here we describe efforts to generate realistic synthetic level 1 (radiance) and level 2 (trace gas, aerosol and cloud) TEMPO observations, appropriate for retrieval algorithm validation and data assimilation observing system simulation experiments. The synthetic data are derived using a high resolution (12 km x 12 km) GEOS-5 GCM simulation with GEOS-Chem tropospheric chemistry combined with the VLIDORT radiative transfer model. The simulations include cloud and aerosol scattering, pressure- and temperature-dependent gas absorption, anisotropic surface reflectance derived from MODIS observations, solar-induced plant fluorescence derived from GOME-2 observations, and the Ring effect. We describe methods to speed up calculation of the synthetic level 2 products, and present a first validation of the TEMPO operational algorithms against the synthetic level 1 data.

  6. Retention of Vaginal Breech Delivery Skills Taught in Simulation.

    PubMed

    Stone, Heather; Crane, Joan; Johnston, Kathy; Craig, Catherine

    2018-02-01

    The optimal frequency of conducting simulation training for high-acuity, low-frequency events in obstetrics and gynaecology residency programs is unknown. This study evaluated retention over time of vaginal breech delivery skills taught in simulation, by comparing junior and senior residents. In addition, the residents' subjective comfort level to perform this skill clinically was assessed. This prospective cohort study included 22 obstetrics and gynaecology residents in a Canadian residency training program. Digital recordings were completed for pre-training, immediate post-training, and delayed (10-26 weeks later) post-training intervals of a vaginal breech delivery simulation, with skill assessment by a blinded observer using a binary checklist. Residents also completed questionnaires to assess their subjective comfort level at each interval. Junior and senior residents had significant improvements in vaginal breech delivery skills from the pre-training assessment to both the immediate post-training assessment (junior, P <0.001; senior, P <0.001) and the delayed post-training assessment (P <0.001 and P = 0.001, respectively). There was a significant decline in skills between the immediate and delayed post-training sessions for junior and senior residents (P = 0.003 and P <0.001, respectively). Both junior and senior residents gained more comfort immediately after the training (P <0.001 and P <0.001, respectively), without a significant change between immediate post-training and delayed post-training comfort levels (P = 0.19 and P = 0.11, respectively). Residents retained vaginal breech delivery skills taught in simulation 10-26 weeks later, although a decline in skills occurred over this time period. Comfort level was positively affected and retained. These results will aid in determining the frequency of simulation teaching for high-acuity, low-frequency events in a residency simulation curriculum. Copyright © 2018 Society of Obstetricians and Gynaecologists of Canada. Published by Elsevier Inc. All rights reserved.

  7. Grid Sensitivity Study for Slat Noise Simulations

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Choudhari, Meelan M.; Buning, Pieter G.

    2014-01-01

    The slat noise from the 30P/30N high-lift system is being investigated through computational fluid dynamics simulations in conjunction with a Ffowcs Williams-Hawkings acoustics solver. Many previous simulations have been performed for the configuration, and the case was introduced as a new category for the Second AIAA workshop on Benchmark problems for Airframe Noise Configurations (BANC-II). However, the cost of the simulations has restricted the study of grid resolution effects to a baseline grid and coarser meshes. In the present study, two different approaches are being used to investigate the effect of finer resolution of near-field unsteady structures. First, a standard grid refinement by a factor of two is used, and the calculations are performed by using the same CFL3D solver employed in the majority of the previous simulations. Second, the OVERFLOW code is applied to the baseline grid, but with a 5th-order upwind spatial discretization as compared with the second-order discretization used in the CFL3D simulations. In general, the fine grid CFL3D simulation and OVERFLOW calculation are in very good agreement and exhibit the lowest levels of both surface pressure fluctuations and radiated noise. Although the smaller scales resolved by these simulations increase the velocity fluctuation levels, they appear to mitigate the influence of the larger scales on the surface pressure. These new simulations are used to investigate the influence of the grid on unsteady high-lift simulations and to gain a better understanding of the physics responsible for the noise generation and radiation.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mizell, D.; Carter, S.

    In 1987, ISI's parallel distributed computing research group implemented a prototype sequential simulation system, designed for high-level simulation of candidate Strategic Defense Initiative architectures. A main design goal was to produce a simulation system that could incorporate non-trivial, executable representations of battle-management computations on each platform that were capable of controlling the actions of that platform throughout the simulation. The term BMA (battle manager abstraction) was used to refer to these simulated battle-management computations. In the authors' first version of the simulator, the BMAs were C++ programs that they wrote and manually inserted into the system. Since then, they have designed and implemented KMAC, a high-level language for writing BMAs. The KMAC preprocessor, built using the Unix tools lex and yacc, translates KMAC source programs into C++ programs and passes them on to the C++ compiler. The KMAC preprocessor was incorporated into and operates under the control of the simulator's interactive user interface. After the KMAC preprocessor has translated a program into C++, the user interface system invokes the C++ compiler and incorporates the resulting object code into the simulator load module for execution as part of a simulation run. This report describes the KMAC language and its preprocessor. Section 2 provides background material on the design of the simulation system that is necessary for understanding parts of KMAC and some of the reasons it is structured the way it is. Section 3 describes the syntax and semantics of the language, and Section 4 discusses the design of the preprocessor.
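
    The KMAC grammar is not reproduced here, so the following Python sketch only illustrates the general preprocessor idea, translating a made-up high-level rule (hypothetical syntax, not actual KMAC) into a C++ snippet that a simulator build could compile; the rule format, template, and identifiers are all assumptions.

```python
import re

# Hypothetical toy rule syntax, e.g.:  "when threat_count > 3 do launch_interceptor"
RULE_RE = re.compile(r"when\s+(\w+)\s*([<>=]+)\s*(\w+)\s+do\s+(\w+)")

CPP_TEMPLATE = """\
// auto-generated battle-manager abstraction (illustrative only)
void bma_step(Platform& p) {{
    if (p.{var} {op} {value}) {{
        p.{action}();
    }}
}}
"""

def translate(rule: str) -> str:
    """Translate one toy high-level rule into a C++ snippet."""
    m = RULE_RE.match(rule.strip())
    if not m:
        raise ValueError(f"unrecognized rule: {rule!r}")
    var, op, value, action = m.groups()
    return CPP_TEMPLATE.format(var=var, op=op, value=value, action=action)

if __name__ == "__main__":
    print(translate("when threat_count > 3 do launch_interceptor"))
```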

  9. VHDL-AMS modelling and simulation of a planar electrostatic micromotor

    NASA Astrophysics Data System (ADS)

    Endemaño, A.; Fourniols, J. Y.; Camon, H.; Marchese, A.; Muratet, S.; Bony, F.; Dunnigan, M.; Desmulliez, M. P. Y.; Overton, G.

    2003-09-01

    System level simulation results of a planar electrostatic micromotor, based on analytical models of the static and dynamic torque behaviours, are presented. A planar variable capacitance (VC) electrostatic micromotor designed, fabricated and tested at LAAS (Toulouse) in 1995 is simulated using the high level language VHDL-AMS (VHSIC (very high speed integrated circuits) hardware description language-analog mixed signal). The analytical torque model is obtained by first calculating the overlaps and capacitances between different electrodes based on a conformal mapping transformation. Capacitance values on the order of 10^-16 F and torque values on the order of 10^-11 N m have been calculated, in agreement with previous measurements and simulations for this type of motor. A dynamic model has been developed for the motor by calculating the inertia coefficient and estimating the friction coefficient based on values calculated previously for other similar devices. Starting voltage results obtained from experimental measurement are in good agreement with our proposed simulation model. Simulation results of starting voltage values, step response, switching response and continuous operation of the micromotor, based on the dynamic model of the torque, are also presented. Four VHDL-AMS blocks were created, validated and simulated for power supply, excitation control, micromotor torque creation and micromotor dynamics. These blocks can be considered as the initial phase towards the creation of intellectual property (IP) blocks for microsystems in general and electrostatic micromotors in particular.

  10. Impacts of high resolution data on traveler compliance levels in emergency evacuation simulations

    DOE PAGES

    Lu, Wei; Han, Lee D.; Liu, Cheng; ...

    2016-05-05

    In this article, we conducted a comparison study of evacuation assignment based on Traffic Analysis Zones (TAZ) and high resolution LandScan USA Population Cells (LPC) with a detailed real-world road network. A platform for evacuation modeling built on high resolution population distribution data and activity-based microscopic traffic simulation was proposed. This platform can be extended to any city in the world. The results indicated that evacuee compliance behavior affects evacuation efficiency with traditional TAZ assignment, but it did not significantly compromise the performance with high resolution LPC assignment. The TAZ assignment also underestimated the real travel time during evacuation. This suggests that high data resolution can improve the accuracy of traffic modeling and simulation. The evacuation manager should consider more diverse assignments during emergency evacuation to avoid congestion.

  11. The Goddard Space Flight Center (GSFC) robotics technology testbed

    NASA Technical Reports Server (NTRS)

    Schnurr, Rick; Obrien, Maureen; Cofer, Sue

    1989-01-01

    Much of the technology planned for use in NASA's Flight Telerobotic Servicer (FTS) and the Demonstration Test Flight (DTF) is relatively new and untested. To provide the answers needed to design safe, reliable, and fully functional robotics for flight, NASA/GSFC is developing a robotics technology testbed for research of issues such as zero-g robot control, dual arm teleoperation, simulations, and hierarchical control using a high level programming language. The testbed will be used to investigate these high risk technologies required for the FTS and DTF projects. The robotics technology testbed is centered around the dual arm teleoperation of a pair of 7 degree-of-freedom (DOF) manipulators, each with their own 6-DOF mini-master hand controllers. Several levels of safety are implemented using the control processor, a separate watchdog computer, and other low level features. High speed input/output ports allow the control processor to interface to a simulation workstation: all or part of the testbed hardware can be used in real time dynamic simulation of the testbed operations, allowing a quick and safe means for testing new control strategies. The NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) hierarchical control scheme is being used as the reference standard for system design. All software developed for the testbed, excluding some of the simulation workstation software, is being developed in Ada. The testbed is being developed in phases. The first phase, which is nearing completion, is described, and future developments are highlighted.

  12. Thermo-hydrological and chemical (THC) modeling to support Field Test Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stauffer, Philip H.; Jordan, Amy B.; Harp, Dylan Robert

    This report summarizes ongoing efforts to simulate coupled thermal-hydrological-chemical (THC) processes occurring within a hypothetical high-level waste (HLW) repository in bedded salt. The report includes work completed since the last project deliverable, “Coupled model for heat and water transport in a high level waste repository in salt”, a Level 2 milestone submitted to DOE in September 2013 (Stauffer et al., 2013). Since the last deliverable, there have been code updates to improve the integration of the salt module with the pre-existing code and development of quality assurance (QA) tests of constitutive functions and precipitation/dissolution reactions. Simulations of bench-scale experiments, both historical and currently in the planning stages, have been performed. Additional simulations have also been performed on the drift-scale model that incorporate new processes, such as an evaporation function to estimate water vapor removal from the crushed salt backfill and isotopic fractionation of water isotopes. Finally, a draft of a journal paper on the importance of clay dehydration on water availability is included as Appendix I.

  13. A Software Framework for Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.

    2008-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has a long history in developing simulations of experimental fixed-wing aircraft from gliders to suborbital vehicles on platforms ranging from desktop simulators to pilot-in-the-loop/aircraft-in-the-loop simulators. Regardless of the aircraft or simulator hardware, much of the software framework is common to all NASA Dryden simulators. Some of this software has withstood the test of time, but in recent years the push toward high-fidelity user-friendly simulations has resulted in some significant changes. This report presents an overview of the current NASA Dryden simulation software framework and capabilities with an emphasis on the new features that have permitted NASA to develop more capable simulations while maintaining the same staffing levels.

  14. Colossal Tooling Design: 3D Simulation for Ergonomic Analysis

    NASA Technical Reports Server (NTRS)

    Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid

    2003-01-01

    The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.

  15. Analog Design for Digital Deployment of a Serious Leadership Game

    NASA Technical Reports Server (NTRS)

    Maxwell, Nicholas; Lang, Tristan; Herman, Jeffrey L.; Phares, Richard

    2012-01-01

    This paper presents the design, development, and user testing of a leadership development simulation. The authors share lessons learned from using a design process for a board game to allow for quick and inexpensive revision cycles during the development of a serious leadership development game. The goal of this leadership simulation is to accelerate the development of leadership capacity in high-potential mid-level managers (GS-15 level) in a federal government agency. Simulation design included a mixed-method needs analysis, using both quantitative and qualitative approaches to determine organizational leadership needs. Eight design iterations were conducted, including three user testing phases. Three re-design iterations followed initial development, enabling game testing as part of comprehensive instructional events. Subsequent design, development and testing processes targeted digital application to a computer- and tablet-based environment. Recommendations include pros and cons of development and learner testing of an initial analog simulation prior to full digital simulation development.

  16. Propulsion simulation for magnetically suspended wind tunnel models

    NASA Technical Reports Server (NTRS)

    Joshi, Prakash B.; Beerman, Henry P.; Chen, James; Krech, Robert H.; Lintz, Andrew L.; Rosen, David I.

    1990-01-01

    The feasibility of simulating propulsion-induced aerodynamic effects on scaled aircraft models in wind tunnels employing Magnetic Suspension and Balance Systems (MSBS) was investigated. The investigation concerned itself with techniques of generating exhaust jets of appropriate characteristics. The objectives were to: (1) define thrust and mass flow requirements of jets; (2) evaluate techniques for generating propulsive gas within volume limitations imposed by magnetically-suspended models; (3) conduct simple diagnostic experiments for techniques involving new concepts; and (4) recommend experiments for demonstration of propulsion simulation techniques. Various techniques of generating exhaust jets of appropriate characteristics were evaluated on scaled aircraft models in wind tunnels with MSBS. Four concepts of remotely-operated propulsion simulators were examined. Three conceptual designs involving innovative adaptation of convenient technologies (compressed gas cylinders, liquid, and solid propellants) were developed. The fourth innovative concept, namely, the laser-assisted thruster, which can potentially simulate both inlet and exhaust flows, was found to require very high power levels for small thrust levels.

  17. The research of distributed interactive simulation based on HLA in coal mine industry inherent safety

    NASA Astrophysics Data System (ADS)

    Dou, Zhi-Wu

    2010-08-01

    To address the inherent safety problem confronting the coal mining industry, and drawing on the characteristics and applications of distributed interactive simulation based on the high level architecture (DIS/HLA), a new method is proposed for developing a coal mining industry inherent safety distributed interactive simulation using HLA technology. After studying the function and structure of the system, a simple coal mining industry inherent safety system is modeled with HLA, the FOM and SOM are developed, and the mathematical models are presented. The results of the case study show that HLA plays an important role in developing distributed interactive simulations of complicated distributed systems and that the method is valid for solving the problem facing the coal mining industry. For the coal mining industry, the conclusions show that a simulation system based on HLA plays an important role in identifying sources of hazard, preparing measures for accidents, and improving the level of management.

  18. A top-down design methodology and its implementation for VCSEL-based optical links design

    NASA Astrophysics Data System (ADS)

    Li, Jiguang; Cao, Mingcui; Cai, Zilong

    2005-01-01

    In order to find the optimal design for a given specification of an optical communication link, an integrated simulation of the electronic, optoelectronic, and optical components of a complete system is required. It is very important to be able to simulate at both the system level and the detailed model level. This kind of model is feasible due to the high potential of the Verilog-AMS language. In this paper, we propose an effective top-down design methodology and employ it in the development of a complete VCSEL-based optical link simulation. The principle of the top-down methodology is that development proceeds from the system level to the device level. To design a hierarchical model for VCSEL-based optical links, the design framework is organized in three levels of hierarchy. The models are developed and implemented in Verilog-AMS, and the model parameters are fitted to measured data. A sample transient simulation demonstrates the functioning of our implementation. Suggestions for future directions in top-down methodology for optoelectronic systems technology are also presented.

  19. [Simulating measles and rubella elimination levels according to social stratification and interaction].

    PubMed

    Hincapié-Palacio, Doracelly; Ospina-Giraldo, Juan; Gómez-Arias, Rubén D; Uyi-Afuwape, Anthony; Chowell-Puente, Gerardo

    2010-02-01

    The study was aimed at comparing measles and rubella disease elimination levels in a homogeneous and heterogeneous population according to socioeconomic status, with interactions amongst low- and high-income individuals and diversity in the average number of contacts amongst them. Effective reproductive rate simulations were deduced from a susceptible-infected-recovered (SIR) mathematical model according to different immunisation rates using measles (1980 and 2005) and rubella (1998 and 2005) incidence data from Latin America and the Caribbean. Low- and high-income individuals' social interaction and their average number of contacts were analysed by bipartite random network analysis. MAPLE 12 (Maplesoft Inc, Ontario, Canada) software was used for making the simulations. The progress made in eliminating both diseases between both periods of time was reproduced in the socially-homogeneous population. Measles (2005) would be eliminated in high- and low-income groups; however, elimination would only be achieved for rubella (2005) if there were a high immunity rate amongst the low-income group. If the average number of contacts were varied, then rubella would not be eliminated, even with a 95% immunity rate. Monitoring the elimination level of diseases like measles and rubella requires that socio-economic status be considered as well as the population's interaction pattern. Special attention should be paid to communities having diversity in their average number of contacts occurring in confined spaces such as displaced communities, prisons, educational establishments, or hospitals.
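
    As a worked illustration of the elimination-threshold reasoning above (a homogeneous-mixing simplification, not the authors' MAPLE/bipartite-network model), the sketch below computes an effective reproduction number R_eff = R0 x (susceptible fraction) for two income groups with different immunization coverages; the R0 value, coverages, and group split are hypothetical.

```python
def effective_R(R0, coverage_by_group, group_fractions):
    """Effective reproduction number for a well-mixed population made of groups
    with different immunization coverages: R_eff = R0 * (overall susceptible fraction)."""
    susceptible = sum(f * (1.0 - c) for f, c in zip(group_fractions, coverage_by_group))
    return R0 * susceptible

# Hypothetical values: measles-like R0, 60% low-income / 40% high-income split,
# high-income group held at 95% coverage while low-income coverage varies.
R0 = 15.0
for cov_low in (0.80, 0.90, 0.95):
    r_eff = effective_R(R0, coverage_by_group=[cov_low, 0.95], group_fractions=[0.6, 0.4])
    status = "elimination possible" if r_eff < 1.0 else "transmission persists"
    print(f"low-income coverage {cov_low:.0%}: R_eff = {r_eff:.2f} ({status})")
```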

  20. A simulation-based study of HighSpeed TCP and its deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souza, Evandro de

    2003-05-01

    The current congestion control mechanism used in TCP has difficulty reaching full utilization on high speed links, particularly on wide-area connections. For example, the packet drop rate needed to fill a Gigabit pipe using the present TCP protocol is below the currently achievable fiber optic error rates. HighSpeed TCP was recently proposed as a modification of TCP's congestion control mechanism to allow it to achieve reasonable performance in high speed wide-area links. In this research, simulation results showing the performance of HighSpeed TCP and the impact of its use on the present implementation of TCP are presented. Network conditions including different degrees of congestion, different levels of loss rate, different degrees of bursty traffic, and two distinct router queue management policies were simulated. The performance and fairness of HighSpeed TCP were compared to the existing TCP and solutions for bulk-data transfer using parallel streams.
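
    The claim that standard TCP needs an unrealistically low drop rate to fill a gigabit wide-area pipe can be checked with the familiar TCP response function, throughput ≈ (MSS/RTT) x sqrt(3/(2p)); the back-of-envelope sketch below inverts it for an assumed 1 Gbit/s, 100 ms RTT path with 1500-byte packets (it is not part of the cited simulation study).

```python
import math

def required_loss_rate(target_bps, mss_bytes=1500, rtt_s=0.100):
    """Invert the standard TCP response function  B = (MSS/RTT) * sqrt(3/(2p))
    to get the packet loss rate p needed to sustain a target throughput B."""
    mss_bits = mss_bytes * 8
    return 1.5 * (mss_bits / (rtt_s * target_bps)) ** 2

p = required_loss_rate(1e9)   # 1 Gbit/s, 100 ms RTT, 1500-byte packets (assumed path)
print(f"required loss rate: {p:.2e} (about one drop every {1/p:.2e} packets)")
```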

  1. PV source based high voltage gain current fed converter

    NASA Astrophysics Data System (ADS)

    Saha, Soumya; Poddar, Sahityika; Chimonyo, Kudzai B.; Arunkumar, G.; Elangovan, D.

    2017-11-01

    This work involves the design and simulation of a PV source based high voltage gain, current fed converter. It deals with an isolated DC-DC converter which utilizes the boost converter topology. The proposed converter is capable of high voltage gain and, above all, has very high efficiency levels, as shown by the simulation results. The project intends to produce an output of 800 V dc from a 48 V dc input. The simulation results obtained from the PSIM application interface were used to analyze the performance of the proposed converter. The transformer used in the circuit steps up the voltage and provides electrical isolation between the low voltage and high voltage sides. Since the converter operates at a high switching frequency of 100 kHz, ultrafast recovery diodes are employed in the circuitry. The major application of the project is for future modeling of solar powered electric hybrid cars.
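
    As a hedged sketch of how a 48 V input can reach 800 V with an isolated boost topology (assuming ideal components, not the simulated PSIM circuit), the steady-state gain can be written as Vout ≈ n·Vin/(1-D), with n the transformer turns ratio and D the duty cycle; the snippet below solves for D at a few assumed turns ratios.

```python
def required_duty_cycle(v_in, v_out, turns_ratio):
    """Ideal isolated boost converter: V_out = n * V_in / (1 - D)  =>  D = 1 - n*V_in/V_out."""
    d = 1.0 - turns_ratio * v_in / v_out
    if not 0.0 < d < 1.0:
        raise ValueError("turns ratio incompatible with the requested gain")
    return d

# Hypothetical turns ratios for a 48 V -> 800 V design at 100 kHz
for n in (4, 6, 8):
    print(f"n = {n}: duty cycle D = {required_duty_cycle(48.0, 800.0, n):.2f}")
```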

  2. Vanadium Microalloyed High Strength Martensitic Steel Sheet for Hot-Dip Coating

    NASA Astrophysics Data System (ADS)

    Hutchinson, Bevis; Komenda, Jacek; Martin, David

    Cold rolled steels with various vanadium and nitrogen levels have been treated to simulate the application of galvanizing and galvannealing to hardened martensitic microstructures. Strength levels were raised by 100-150 MPa by alloying with vanadium, which mitigates the effect of tempering. This opens the way for new ultra-high strength steels with corrosion resistant coatings produced by hot-dip galvanizing.

  3. Scattering of sound by atmospheric turbulence predictions in a refractive shadow zone

    NASA Technical Reports Server (NTRS)

    Mcbride, Walton E.; Bass, Henry E.; Raspet, Richard; Gilbert, Kenneth E.

    1990-01-01

    According to ray theory, regions exist in an upward refracting atmosphere where no sound should be present. Experiments show, however, that appreciable sound levels penetrate these so-called shadow zones. Two mechanisms contribute to sound in the shadow zone: diffraction and turbulent scattering of sound. Diffractive effects can be pronounced at lower frequencies but are small at high frequencies. In the short wavelength limit, then, scattering due to turbulence should be the predominant mechanism involved in producing the sound levels measured in shadow zones. No existing analytical method includes turbulence effects in the prediction of sound pressure levels in upward refractive shadow zones. In order to obtain quantitative average sound pressure level predictions, a numerical simulation of the effect of atmospheric turbulence on sound propagation is performed. The simulation is based on scattering from randomly distributed scattering centers ('turbules'). Sound pressure levels are computed for many realizations of a turbulent atmosphere. Predictions from the numerical simulation are compared with existing theories and experimental data.

  4. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review.

    PubMed

    Issenberg, S Barry; McGaghie, William C; Petrusa, Emil R; Lee Gordon, David; Scalese, Ross J

    2005-01-01

    1969 to 2003, 34 years. Simulations are now in widespread use in medical education and medical personnel evaluation. Outcomes research on the use and effectiveness of simulation technology in medical education is scattered, inconsistent and varies widely in methodological rigor and substantive focus. Review and synthesize existing evidence in educational science that addresses the question, 'What are the features and uses of high-fidelity medical simulations that lead to most effective learning?'. The search covered five literature databases (ERIC, MEDLINE, PsycINFO, Web of Science and Timelit) and employed 91 single search terms and concepts and their Boolean combinations. Hand searching, Internet searches and attention to the 'grey literature' were also used. The aim was to perform the most thorough literature search possible of peer-reviewed publications and reports in the unpublished literature that have been judged for academic quality. Four screening criteria were used to reduce the initial pool of 670 journal articles to a focused set of 109 studies: (a) elimination of review articles in favor of empirical studies; (b) use of a simulator as an educational assessment or intervention with learner outcomes measured quantitatively; (c) comparative research, either experimental or quasi-experimental; and (d) research that involves simulation as an educational intervention. Data were extracted systematically from the 109 eligible journal articles by independent coders. Each coder used a standardized data extraction protocol. Qualitative data synthesis and tabular presentation of research methods and outcomes were used. Heterogeneity of research designs, educational interventions, outcome measures and timeframe precluded data synthesis using meta-analysis. Coding accuracy for features of the journal articles is high. The extant quality of the published research is generally weak. The weight of the best available evidence suggests that high-fidelity medical simulations facilitate learning under the right conditions. 
These include the following: providing feedback--51 (47%) journal articles reported that educational feedback is the most important feature of simulation-based medical education; repetitive practice--43 (39%) journal articles identified repetitive practice as a key feature involving the use of high-fidelity simulations in medical education; curriculum integration--27 (25%) journal articles cited integration of simulation-based exercises into the standard medical school or postgraduate educational curriculum as an essential feature of their effective use; range of difficulty level--15 (14%) journal articles address the importance of the range of task difficulty level as an important variable in simulation-based medical education; multiple learning strategies--11 (10%) journal articles identified the adaptability of high-fidelity simulations to multiple learning strategies as an important factor in their educational effectiveness; capture clinical variation--11 (10%) journal articles cited simulators that capture a wide variety of clinical conditions as more useful than those with a narrow range; controlled environment--10 (9%) journal articles emphasized the importance of using high-fidelity simulations in a controlled environment where learners can make, detect and correct errors without adverse consequences; individualized learning--10 (9%) journal articles highlighted the importance of having reproducible, standardized educational experiences where learners are active participants, not passive bystanders; defined outcomes--seven (6%) journal articles cited the importance of having clearly stated goals with tangible outcome measures that will more likely lead to learners mastering skills; simulator validity--four (3%) journal articles provided evidence for the direct correlation of simulation validity with effective learning. While research in this field needs improvement in terms of rigor and quality, high-fidelity medical simulations are educationally effective and simulation-based education complements medical education in patient care settings.

  5. SU-F-R-36: Validating Quantitative Radiomic Texture Features for Oncologic PET: A Digital Phantom Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, F; Yang, Y; Young, L

    Purpose: Radiomic texture features derived from oncologic PET have recently come under intense investigation within the context of patient stratification and treatment outcome prediction in a variety of cancer types; however, their validity has not yet been examined. This work aims to validate radiomic PET texture metrics through the use of realistic simulations in a ground truth setting. Methods: Simulation of FDG-PET was conducted by applying the Zubal phantom as an attenuation map to the SimSET software package, which employs Monte Carlo techniques to model the physical process of emission imaging. A total of 15 irregularly-shaped lesions featuring heterogeneous activity distributions were simulated. For each simulated lesion, 28 texture features related to intensity histograms (GLIH), grey-level co-occurrence matrices (GLCOM), neighborhood difference matrices (GLNDM), and zone size matrices (GLZSM) were evaluated and compared with their respective values extracted from the ground truth activity map. Results: In reference to the values from the ground truth images, texture parameters from the simulated data varied within ranges of 0.73–3026.2% for GLIH-based, 0.02–100.1% for GLCOM-based, 1.11–173.8% for GLNDM-based, and 0.35–66.3% for GLZSM-based features. For the majority of the examined texture metrics (16/28), their values on the simulated data differed significantly from those from the ground truth images (P-value ranging from <0.0001 to 0.04). Features not exhibiting significant differences comprised GLIH-based standard deviation; GLCOM-based energy and entropy; GLNDM-based coarseness and contrast; and GLZSM-based low gray-level zone emphasis, high gray-level zone emphasis, short zone low gray-level emphasis, long zone low gray-level emphasis, long zone high gray-level emphasis, and zone size nonuniformity. Conclusion: The extent to which PET imaging disturbs texture appearance is feature-dependent and can be substantial. It is thus advised that the use of PET texture parameters for predictive and prognostic measurements in the oncologic setting awaits further systematic and critical evaluation.
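
    To make the co-occurrence-based features concrete, here is a minimal sketch (not the study's pipeline) that builds a grey-level co-occurrence matrix for a single offset and computes the GLCOM energy and entropy named above; the 4x4 image and 4-level quantization are hypothetical.

```python
import numpy as np

def glcm(image, levels, offset=(0, 1)):
    """Grey-level co-occurrence matrix for one pixel offset, normalized to probabilities."""
    dr, dc = offset
    m = np.zeros((levels, levels), dtype=float)
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r, c], image[r2, c2]] += 1.0
    return m / m.sum()

# Hypothetical 4-level quantized 4x4 image
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
p = glcm(img, levels=4)
energy = np.sum(p ** 2)
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
print(f"GLCM energy = {energy:.3f}, entropy = {entropy:.3f} bits")
```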

  6. Turbulent transition behavior in a separated and attached-flow low pressure turbine passage

    NASA Astrophysics Data System (ADS)

    Memory, Curtis L.

    Various time accurate numerical simulations were conducted on the aft-loaded L1A low pressure turbine airfoil operating at Reynolds numbers presenting with fully-stalled, non-reattaching laminar separation. The numerical solver TURBO was modified from its annular gas turbine simulation configuration to conduct simulations based on a linear cascade wind tunnel facility. Simulation results for the fully separated flow fields revealed various turbulent decay mechanisms. Separated shear layer decay, in the form of vortices forming between the shear layer and the blade wall, was shown to agree with experimental particle image velocimetry (PIV) data in terms of decay vortex size and core vorticity levels. These vortical structures eventually mix into a large recirculation zone which dominates the blade wake. Turbulent wake extent and time-averaged velocity distributions agreed with PIV data. Steady-blowing vortex generating jet (VGJ) flow control was then applied to the flow fields. VGJ-induced streamwise vorticity was only present at blowing ratios above 1.5. VGJs actuated at the point of flow separation on the blade wall were more effective than those actuated downstream, within the separation zone. Pulsed-blowing VGJs at the upstream blade wall position were then actuated at various pulsing frequencies, duty cycles, and blowing ratios. These condition variations yielded differing levels of separation zone mitigation. Pulsed VGJs were shown to be more effective than steady blowing VGJs at conditions of high blowing ratio, high frequency, or high duty cycle, where blowing ratio had the highest level of influence on pulsed jet efficacy. The characteristic "calm zone" following the end of a given VGJ pulse was observed in simulations exhibiting high levels of separation zone mitigation. Numerical velocity fields near the blade wall during this calm zone were shown to be similar to velocity fields observed in PIV data. Instantaneous numerical vorticity fields indicated that the elimination of the separation zone directly downstream of the VGJ hole is a primary indicator of pulsed VGJ efficacy. This indicator was confirmed by numerical time-averaged velocity magnitude rms data in the same region.

  7. Piloted Simulation Study of a Dual Thrust-Cutback Procedure for Reducing High-Speed Civil Transport Takeoff Noise Levels

    NASA Technical Reports Server (NTRS)

    Riley, Donald R.; Glaab, Louis J.; Brandon, Jay M.; Person, Lee H., Jr.; Glaab, Patricia C.

    1999-01-01

    A piloted simulation study was performed for the purpose of indicating the noise reduction benefits and piloting performance that could occur for a typical 4-engine High-Speed Civil Transport (HSCT) configuration during takeoff when a dual thrust-cutback procedure was employed with throttle operation under direct computer control. Two thrust cutbacks were employed, with the first cutback performed while the vehicle was accelerating on the runway and the second cutback performed at a distance farther downrange. Added vehicle performance improvements included the incorporation of high-lift increments into the aerodynamic database of the vehicle and the use of limited engine oversizing. Four single-stream turbine bypass engines that had no noise suppression of any kind were used with this configuration. This approach permitted establishing the additional noise suppression level that was needed to meet Federal Air Regulation Part 36 Stage 3 noise levels for subsonic commercial jet aircraft. Noise level results were calculated with the jet mixing and shock noise modules of the Aircraft Noise Prediction Program (ANOPP).

  8. HIGH-RESOLUTION DATASET OF URBAN CANOPY PARAMETERS FOR HOUSTON, TEXAS

    EPA Science Inventory

    Urban dispersion and air quality simulation models applied at various horizontal scales require different levels of fidelity for specifying the characteristics of the underlying surfaces. As the modeling scales approach the neighborhood level (~1 km horizontal grid spacing), the...

  9. Modification of Obstetric Emergency Simulation Scenarios for Realism in a Home-Birth Setting.

    PubMed

    Komorowski, Janelle; Andrighetti, Tia; Benton, Melissa

    2017-01-01

    Clinical competency and clear communication are essential for intrapartum care providers who encounter high-stakes, low-frequency emergencies. The challenge for these providers is to maintain infrequently used skills. The challenge is even more significant for midwives who manage births at home and who, due to low practice volume and low-risk clientele, may rarely encounter an emergency. In addition, access to team simulation may be limited for home-birth midwives. This project modified existing validated obstetric simulation scenarios for a home-birth setting. Twelve certified professional midwives (CPMs) in active home-birth practice participated in shoulder dystocia and postpartum hemorrhage simulations. The simulations were staged to resemble home-birth settings, supplies, and personnel. Fidelity (realism) of the simulations was assessed with the Simulation Design Scale, and satisfaction and self-confidence were assessed with the Student Satisfaction and Self-Confidence in Learning Scale. Both utilized a 5-point Likert scale, with higher scores suggesting greater levels of fidelity, participant satisfaction, and self-confidence. Simulation Design Scale scores indicated participants agreed fidelity was achieved for the home-birth setting, while scores on the Student Satisfaction and Self-Confidence in Learning indicated high levels of participant satisfaction and self-confidence. If offered without modification, simulation scenarios designed for use in hospitals may lose fidelity for home-birth midwives, particularly in the environmental and psychological components. Simulation is standard of care in most settings, an excellent vehicle for maintaining skills, and some evidence suggests it results in improved perinatal outcomes. Additional study is needed in this area to support home-birth providers in maintaining skills. This pilot study suggests that simulation scenarios intended for hospital use can be successfully adapted to the home-birth setting. © 2016 by the American College of Nurse-Midwives.

  10. Fatigue Tests with Random Flight Simulation Loading

    NASA Technical Reports Server (NTRS)

    Schijve, J.

    1972-01-01

    Crack propagation was studied in a full-scale wing structure under different simulated flight conditions. Omission of low-amplitude gust cycles had a small effect on the crack rate. Truncation of the infrequently occurring high-amplitude gust cycles to a lower level had a noticeably accelerating effect on crack growth. The application of fail-safe load (100 percent limit load) effectively stopped subsequent crack growth under resumed flight-simulation loading. In another flight-simulation test series on sheet specimens, the variables studied are the design stress level and the cyclic frequency of the random gust loading. Inflight mean stresses vary from 5.5 to 10.0 kg/sq mm. The effect of the stress level is larger for the 2024 alloy than for the 7075 alloy. Three frequencies were employed: namely, 10 cps, 1 cps, and 0.1 cps. The frequency effect was small. The advantages and limitations of flight-simulation tests are compared with those of alternative test procedures such as constant-amplitude tests, program tests, and random-load tests. Various testing purposes are considered. The variables of flight-simulation tests are listed and their effects are discussed. A proposal is made for performing systematic flight-simulation tests in such a way that the compiled data may be used as a source of reference.

  11. Modeling, simulation, and high-autonomy control of a Martian oxygen production plant

    NASA Technical Reports Server (NTRS)

    Schooley, L. C.; Cellier, F. E.; Wang, F.-Y.; Zeigler, B. P.

    1992-01-01

    Progress on a project for the development of a high-autonomy intelligent command and control architecture for process plants used to produce oxygen from local planetary resources is reported. A distributed command and control architecture is being developed and implemented so that an oxygen production plant, or other equipment, can be reliably commanded and controlled over an extended time period in a high-autonomy mode with high-level task-oriented teleoperation from one or several remote locations. During the reporting period, progress was made at all levels of the architecture. At the remote site, several remote observers can now participate in monitoring the plant. At the local site, a command and control center was introduced for increased flexibility, reliability, and robustness. The local control architecture was enhanced to control multiple tubes in parallel, and was refined for increased robustness. The simulation model was enhanced to full dynamics descriptions.

  12. Simulation of groundwater flow in the glacial aquifer system of northeastern Wisconsin with variable model complexity

    USGS Publications Warehouse

    Juckem, Paul F.; Clark, Brian R.; Feinstein, Daniel T.

    2017-05-04

    The U.S. Geological Survey, National Water-Quality Assessment seeks to map estimated intrinsic susceptibility of the glacial aquifer system of the conterminous United States. Improved understanding of the hydrogeologic characteristics that explain spatial patterns of intrinsic susceptibility, commonly inferred from estimates of groundwater age distributions, is sought so that methods used for the estimation process are properly equipped. An important step beyond identifying relevant hydrogeologic datasets, such as glacial geology maps, is to evaluate how incorporation of these resources into process-based models using differing levels of detail could affect resulting simulations of groundwater age distributions and, thus, estimates of intrinsic susceptibility.This report describes the construction and calibration of three groundwater-flow models of northeastern Wisconsin that were developed with differing levels of complexity to provide a framework for subsequent evaluations of the effects of process-based model complexity on estimations of groundwater age distributions for withdrawal wells and streams. Preliminary assessments, which focused on the effects of model complexity on simulated water levels and base flows in the glacial aquifer system, illustrate that simulation of vertical gradients using multiple model layers improves simulated heads more in low-permeability units than in high-permeability units. Moreover, simulation of heterogeneous hydraulic conductivity fields in coarse-grained and some fine-grained glacial materials produced a larger improvement in simulated water levels in the glacial aquifer system compared with simulation of uniform hydraulic conductivity within zones. The relation between base flows and model complexity was less clear; however, the relation generally seemed to follow a similar pattern as water levels. Although increased model complexity resulted in improved calibrations, future application of the models using simulated particle tracking is anticipated to evaluate if these model design considerations are similarly important for understanding the primary modeling objective - to simulate reasonable groundwater age distributions.

  13. NREL Partners with California's Santa Clara Valley Transportation Authority

    Science.gov Websites

    Automotive Systems Technology Simulator (FASTSim) - a high-level simulation tool for estimating the impact of ... According to the manager of the Transportation Systems Group in NREL's Transportation and Hydrogen Systems Center, "NREL will provide modeling and analytics on the potential scope of energy services associated with VGI."

  14. Radar System Characterization Extended to Hardware-in-the-Loop Simulation for the Lab-Volt (Trademark) Training System

    DTIC Science & Technology

    2007-09-01

    ... devices such as klystrons, magnetrons, and traveling wave tubes. These microwave devices produce high power levels but may have limited bandwidths [20]. ... diagram. The specific arrangement of components within a RADAR transmitter varies with operational specifications. Two options exist to produce high power ... cascading to generate sufficient power [20]. The second option to generate high power levels is to replace RF oscillators and amplifiers with microwave ...

  15. Alternative Chemical Cleaning Methods for High Level Waste Tanks: Simulant Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudisill, T.; King, W.; Hay, M.

    Solubility testing with simulated High Level Waste tank heel solids has been conducted in order to evaluate two alternative chemical cleaning technologies for the dissolution of sludge residuals remaining in the tanks after the exhaustion of mechanical cleaning and sludge washing efforts. Tests were conducted with non-radioactive pure phase metal reagents, binary mixtures of reagents, and a Savannah River Site PUREX heel simulant to determine the effectiveness of an optimized, dilute oxalic/nitric acid cleaning reagent and pure, dilute nitric acid toward dissolving the bulk non-radioactive waste components. A focus of this testing was on minimization of oxalic acid additions during tank cleaning. For comparison purposes, separate samples were also contacted with pure, concentrated oxalic acid, which is the current baseline chemical cleaning reagent. In a separate study, solubility tests were conducted with radioactive tank heel simulants using acidic and caustic permanganate-based methods focused on the “targeted” dissolution of actinide species known to be drivers for Savannah River Site tank closure Performance Assessments. Permanganate-based cleaning methods were evaluated prior to and after oxalic acid contact.

  16. A simulator for airborne laser swath mapping via photon counting

    NASA Astrophysics Data System (ADS)

    Slatton, K. C.; Carter, W. E.; Shrestha, R.

    2005-06-01

    Commercially marketed airborne laser swath mapping (ALSM) instruments currently use laser rangers with sufficient energy per pulse to work with return signals of thousands of photons per shot. The resulting high signal to noise level virtually eliminates spurious range values caused by noise, such as background solar radiation and sensor thermal noise. However, the high signal level approach requires laser repetition rates of hundreds of thousands of pulses per second to obtain contiguous coverage of the terrain at sub-meter spatial resolution, and with currently available technology, affords little scalability for significantly downsizing the hardware, or reducing the costs. A photon-counting ALSM sensor has been designed by the University of Florida and Sigma Space, Inc. for improved topographic mapping with lower power requirements and weight than traditional ALSM sensors. Major elements of the sensor design are presented along with preliminary simulation results. The simulator is being developed so that data phenomenology and target detection potential can be investigated before the system is completed. Early simulations suggest that precise estimates of terrain elevation and target detection will be possible with the sensor design.
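
    A first-order way to explore the photon-counting phenomenology described here is to draw Poisson counts from an expected signal-plus-background rate per range gate and look for the gate that stands out; the sketch below does this with invented rates and gate settings (it is not the University of Florida/Sigma Space simulator).

```python
import numpy as np

def simulate_range_gates(signal_photons, background_rate, gate_width_ns, n_gates, n_shots, rng=None):
    """Poisson photon-counting model: one 'signal' gate plus uniform solar/thermal background.
    Returns per-gate counts accumulated over n_shots laser shots."""
    rng = np.random.default_rng(0) if rng is None else rng
    mean_bkg = background_rate * gate_width_ns * 1e-9          # expected background per gate per shot
    means = np.full(n_gates, mean_bkg)
    means[n_gates // 2] += signal_photons                      # target return placed in the middle gate
    return rng.poisson(means * n_shots)

counts = simulate_range_gates(signal_photons=0.5,              # ~0.5 detected photons per shot (assumed)
                              background_rate=2e6,             # 2 MHz background count rate (assumed)
                              gate_width_ns=10, n_gates=9, n_shots=200)
print("accumulated counts per range gate:", counts)
print("detected gate:", int(np.argmax(counts)))
```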

  17. Calibrating a hydraulic model using water levels derived from time series high-resolution Radarsat-2 synthetic aperture radar images and elevation data

    NASA Astrophysics Data System (ADS)

    Trudel, M.; Desrochers, N.; Leconte, R.

    2017-12-01

    Knowledge of the water extent (WE) and level (WL) of rivers is necessary to calibrate and validate hydraulic models and thus to better simulate and forecast floods. Synthetic aperture radar (SAR) has demonstrated its potential for delineating water bodies, as backscattering from water is much lower than that from other natural surfaces. The ability of SAR to obtain information despite cloud cover makes it an interesting tool for temporal monitoring of water bodies. The delineation of WE combined with a high-resolution digital terrain model (DTM) allows WL to be extracted. However, most research using SAR data to calibrate hydraulic models has been carried out using only one or two images. The objective of this study is to use WL derived from time series of high resolution Radarsat-2 SAR images for the calibration of a 1-D hydraulic model (HEC-RAS). Twenty high-resolution (5 m) Radarsat-2 images were acquired over a 40 km reach of the Athabasca River, in northern Alberta, Canada, between 2012 and 2016, covering both low and high flow regimes. A high-resolution (2 m) DTM was generated combining information from LIDAR data and bathymetry acquired between 2008 and 2016 by boat surveying. The HEC-RAS model was implemented on the Athabasca River to simulate WL using cross-sections spaced by 100 m. An image histogram thresholding method was applied to each Radarsat-2 image to derive WE. WE was then compared against each cross-section to identify those where the slope of the banks is not too abrupt and which are therefore amenable to extracting WL. 139 observations of WL at different locations along the river reach, together with streamflow measurements, were used to calibrate the HEC-RAS model. The RMSE between SAR-derived and simulated WL is under 0.35 m. Validation was performed using in situ observations of WL measured in 2008, 2012 and 2016. The RMSE between the simulated water levels calibrated with SAR images and in situ observations is less than 0.20 m. In addition, a critical success index (CSI) was computed to compare the WE simulated by HEC-RAS and that derived from the SAR images. The CSI is higher than 0.85 for each date, which means that the simulated WE is highly similar to the WE derived from the SAR images. The results of our analysis therefore indicate that calibration of a hydraulic model can be performed with WL derived from time series of high-resolution SAR images.
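
    For reference, the critical success index used to compare simulated and SAR-derived water extents can be computed from a pixelwise contingency of the two binary masks; the sketch below shows the calculation on two small hypothetical masks (it is not the authors' code).

```python
import numpy as np

def critical_success_index(simulated, observed):
    """CSI = hits / (hits + misses + false alarms) for two binary water masks."""
    sim = np.asarray(simulated, dtype=bool)
    obs = np.asarray(observed, dtype=bool)
    hits = np.sum(sim & obs)
    misses = np.sum(~sim & obs)
    false_alarms = np.sum(sim & ~obs)
    return hits / (hits + misses + false_alarms)

# Hypothetical 4x4 masks (1 = water)
sim = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [1, 1, 1, 0], [0, 1, 1, 0]])
obs = np.array([[1, 1, 0, 0], [1, 1, 1, 0], [1, 1, 1, 0], [0, 1, 0, 0]])
print(f"CSI = {critical_success_index(sim, obs):.2f}")
```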

  18. Development of a Turbofan Engine Simulation in a Graphical Simulation Environment

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Guo, Ten-Heui

    2003-01-01

    This paper presents the development of a generic component level model of a turbofan engine simulation with a digital controller, in an advanced graphical simulation environment. The goal of this effort is to develop and demonstrate a flexible simulation platform for future research in propulsion system control and diagnostic technology. A previously validated FORTRAN-based model of a modern, high-performance, military-type turbofan engine is being used to validate the platform development. The implementation process required the development of various innovative procedures, which are discussed in the paper. Open-loop and closed-loop comparisons are made between the two simulations. Future enhancements that are to be made to the modular engine simulation are summarized.

  19. Helicopter pilot scan techniques during low-altitude high-speed flight.

    PubMed

    Kirby, Christopher E; Kennedy, Quinn; Yang, Ji Hyun

    2014-07-01

    This study examined pilots' visual scan patterns during a simulated high-speed, low-level flight and how their scan rates related to flight performance. As helicopters become faster and more agile, pilots are expected to navigate at low altitudes while traveling at high speeds. A pilot's ability to interpret information from a combination of visual sources determines not only mission success, but also aircraft and crew survival. In a fixed-base helicopter simulator modeled after the U.S. Navy's MH-60S, 17 active-duty Navy helicopter pilots with varying total flight times flew and navigated through a simulated southern Californian desert course. Pilots' scan rate and fixation locations were monitored using an eye-tracking system while they flew through the course. Flight parameters, including altitude, were recorded using the simulator's recording system. Experienced pilots with more than 1000 total flight hours better maintained a constant altitude (mean altitude deviation = 48.52 ft, SD = 31.78) than less experienced pilots (mean altitude deviation = 73.03 ft, SD = 10.61) and differed in some aspects of their visual scans. They spent more time looking at the instrument display and less time looking out the window (OTW) than less experienced pilots. Looking OTW was associated with less consistency in maintaining altitude. Results may aid training effectiveness specific to helicopter aviation, particularly in high-speed low-level flight conditions.

  20. Comparative Implementation of High Performance Computing for Power System Dynamic Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Shuangshuang; Huang, Zhenyu; Diao, Ruisheng

    Dynamic simulation for transient stability assessment is one of the most important, but intensive, computations for power system planning and operation. Present commercial software is mainly designed for sequential computation to run a single simulation, which is very time consuming with a single processor. The application of High Performance Computing (HPC) to dynamic simulations is very promising in accelerating the computing process by parallelizing its kernel algorithms while maintaining the same level of computation accuracy. This paper describes the comparative implementation of four parallel dynamic simulation schemes in two state-of-the-art HPC environments: Message Passing Interface (MPI) and Open Multi-Processing (OpenMP). These implementations serve to match the application with dedicated multi-processor computing hardware and maximize the utilization and benefits of HPC during the development process.
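
    The parallel pattern described here, with each worker integrating its share of the dynamic equations and exchanging boundary values every time step, can be sketched with mpi4py standing in for the MPI side (Python rather than production C/Fortran code, and the toy machine dynamics and filename below are invented for illustration).

```python
# Run with, e.g.:  mpiexec -n 4 python parallel_dynamics.py   (hypothetical filename)
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Hypothetical system: each rank owns a block of "machine" states (angle, speed)
n_local = 2
angles = np.zeros(n_local) + 0.1 * rank
speeds = np.zeros(n_local)
dt, steps = 0.01, 100

for _ in range(steps):
    # Exchange all angles so each rank can evaluate its coupling terms
    all_angles = np.concatenate(comm.allgather(angles))
    coupling = np.sum(np.sin(all_angles[None, :] - angles[:, None]), axis=1)
    # Toy swing-equation-like update (illustrative dynamics, not a real power system model)
    accel = 1.0 - 0.5 * speeds - 0.2 * coupling
    speeds += dt * accel
    angles += dt * speeds

if rank == 0:
    print("rank 0 final angles:", angles)
```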

  1. Piloted simulator evaluation of a relaxed static stability fighter at high angle-of-attack

    NASA Technical Reports Server (NTRS)

    Lapins, M.; Klein, R. W.; Martorella, R. P.; Cangelosi, J.; Neely, W. R., Jr.

    1982-01-01

    A piloted simulator evaluation of the stability and control characteristics of a relaxed static stability fighter aircraft was conducted using a differential maneuvering simulator. The primary purpose of the simulation was to evaluate the effectiveness of the limiters in preventing departure from controlled flight. The simulation was conducted in two phases, the first consisting of open-loop point stability evaluations over a range of subsonic flight conditions, the second concentrating on closed-loop tracking of a preprogrammed target in low speed, high angle-of-attack air combat maneuvering. The command limiters were effective in preventing departure from controlled flight while permitting competent levels of sustained maneuvering. Parametric variations during the study included the effects of pitch control power and wing-body static margin. Stability and control issues were clearly shown to impact the configuration design.

  2. Composite load spectra for select space propulsion structural components

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Kurth, R. E.; Ho, H.

    1991-01-01

    The objective of this program is to develop generic load models with multiple levels of progressive sophistication to simulate the composite (combined) load spectra that are induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, liquid oxygen posts, and system ducting. The first approach will consist of using state-of-the-art probabilistic methods to describe the individual loading conditions and combinations of these loading conditions to synthesize the composite load spectra simulation. The second approach will consist of developing coupled models for composite load spectra simulation which combine the deterministic models for dynamic, acoustic, high-pressure, high-rotational-speed, and other load components using statistically varying coefficients. These coefficients will then be determined using advanced probabilistic simulation methods with and without strategically selected experimental data.

  3. The Application of High Energy Resolution Green's Functions to Threat Scenario Simulation

    NASA Astrophysics Data System (ADS)

    Thoreson, Gregory G.; Schneider, Erich A.

    2012-04-01

    Radiation detectors installed at key interdiction points provide defense against nuclear smuggling attempts by scanning vehicles and traffic for illicit nuclear material. These hypothetical threat scenarios may be modeled using radiation transport simulations. However, high-fidelity models are computationally intensive. Furthermore, the range of smuggler attributes and detector technologies create a large problem space not easily overcome by brute-force methods. Previous research has demonstrated that decomposing the scenario into independently simulated components using Green's functions can simulate photon detector signals with coarse energy resolution. This paper extends this methodology by presenting physics enhancements and numerical treatments which allow for an arbitrary level of energy resolution for photon transport. As a result, spectroscopic detector signals produced from full forward transport simulations can be replicated while requiring multiple orders of magnitude less computation time.
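
    Schematically, the decomposition replaces a full transport run with precomputed energy-redistribution matrices (Green's functions) for each scenario component, chained as matrix-vector products; the sketch below illustrates that chaining with made-up 4-group data and component names (it is not the paper's implementation).

```python
import numpy as np

def detector_spectrum(source, greens_functions):
    """Chain precomputed component Green's functions (energy-redistribution matrices):
    each matrix maps an incident group-wise flux to an emergent one."""
    flux = np.asarray(source, dtype=float)
    for g in greens_functions:
        flux = np.asarray(g) @ flux
    return flux

# Hypothetical 4-energy-group source and two components (cargo shielding, detector response)
source = [0.0, 0.2, 0.5, 0.3]
shield = np.array([[0.9, 0.1, 0.05, 0.0],
                   [0.0, 0.7, 0.10, 0.05],
                   [0.0, 0.0, 0.60, 0.10],
                   [0.0, 0.0, 0.00, 0.50]])
detector = np.diag([0.2, 0.3, 0.4, 0.5])   # simple group-wise detection efficiencies
print("detector-group counts per source particle:", detector_spectrum(source, [shield, detector]))
```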

  4. An Air Operations Division Live, Virtual, and Constructive (LVC) Corporate Interoperability Standards Development Strategy

    DTIC Science & Technology

    2011-07-01

    ... Orlando, Florida, September 2009, 09F-SIW-090. [HLA (2000) - 1] - Modeling and Simulation Standard - High Level Architecture (HLA) – Framework and ... Simulation Interoperability Workshop, Orlando, FL, USA, September 2009, 09F-SIW-023. [MaK] - www.mak.com. [MIL-STD-3011] - MIL-STD-3011 ... Spring Simulation Interoperability Workshop, Norfolk, VA, USA, March 2007, 07S-SIW-072. [Ross] - Ross, P. and Clark, P. (2005), "Recommended ...

  5. Analytical simulation of nonlinear response to seismic test excitations of HDR-VKL (Heissdampfreaktor-Versuchskreislauf) piping system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinivasan, M.G.; Kot, C.A.; Mojtahed, M.

    The paper describes the analytical modeling, calculations, and results of the posttest nonlinear simulation of high-level seismic testing of the VKL piping system at the HDR Test Facility in Germany. One of the objectives of the tests was to evaluate analytical methods for calculating the nonlinear response of realistic piping systems subjected to high-level seismic excitation that would induce significant plastic deformation. Two of the six different pipe-support configurations (ranging from a stiff system with struts and snubbers to a very flexible system with practically no seismic supports), subjected to simulated earthquakes, were tested at very high levels. The posttest nonlinear calculations cover the KWU configuration, a reasonably compliant system with only rigid struts. Responses for 800% safe-shutdown-earthquake loading were calculated using the NONPIPE code. The responses calculated with NONPIPE were found generally to have the same time trends as the measurements but contained under-, over-, and correct estimates of peak values, almost in equal proportions. The only exceptions were the peak strut forces, which were underestimated as a group. The scatter in the peak value estimates of displacements and strut forces was smaller than that for the strains. The possible reasons for the differences and the effort on further analysis are discussed.

  6. PTSD, Acute Stress, Performance and Decision-Making in Emergency Service Workers.

    PubMed

    Regehr, Cheryl; LeBlanc, Vicki R

    2017-06-01

    Despite research identifying high levels of stress and traumatic stress symptoms among those in the emergency services, the impact of these symptoms on performance and hence public safety remains uncertain. This review paper discusses a program of research that has examined the effects of prior critical incident exposure, acute stress, and current post-traumatic symptoms on the performance and decision-making during an acutely stressful event among police officers, police communicators, paramedics and child protection workers. Four studies, using simulation methods involving video simulators, human-patient simulators, and/or standardized patients, examined the performance of emergency workers in typical workplace situations related to their individual profession. Results varied according to level of acuity of stress and the nature of performance and decision-making. There was no evidence that PTSD had a direct impact on global performance on tasks for which emergency responders are highly trained. However, PTSD was associated with assessment of risk in situations that required professional judgement. Further, individuals experiencing PTSD symptoms reported higher levels of acute stress when faced with high acuity situations. Acute stress in these studies was associated with performance deficits on complex cognitive tasks, verbal memory impairment and heightened assessment of risk. © 2017 American Academy of Psychiatry and the Law.

  7. (-)-Epicatechin-induced recovery of mitochondria from simulated diabetes: Potential role of endothelial nitric oxide synthase.

    PubMed

    Ramírez-Sánchez, Israel; Rodríguez, Alonso; Moreno-Ulloa, Aldo; Ceballos, Guillermo; Villarreal, Francisco

    2016-05-01

    (-)-Epicatechin increases indicators associated with mitochondrial biogenesis in endothelial cells and myocardium. We investigated the involvement of endothelial nitric oxide synthase in (-)-epicatechin-induced increases in indicators associated with mitochondrial biogenesis in human coronary artery endothelial cells cultured in normal-glucose and high-glucose media, as well as in the recovery of cardiac mitochondrial indicators from the effects of simulated diabetes. Here, we demonstrate the role of endothelial nitric oxide synthase in (-)-epicatechin-induced increases in mitochondrial proteins, transcription factors and sirtuin 1 under normal-glucose conditions. Under simulated diabetes, endothelial nitric oxide synthase function and mitochondrial function- and biogenesis-associated indicators were adversely impacted by high glucose, effects that were reverted by (-)-epicatechin. As an animal model of type 2 diabetes, 2-month-old C57BL/6 mice were fed a high-fat diet for 16 weeks. Fasting and fed blood glucose levels were increased and NO plasma levels decreased. The myocardium of high-fat-diet-fed mice revealed endothelial nitric oxide synthase dysfunction, reduced mitochondrial activity and reduced markers of mitochondrial biogenesis. The administration of 1 mg/kg (-)-epicatechin for 15 days by oral gavage shifted these endpoints towards control mice values. Results suggest that endothelial nitric oxide synthase mediates (-)-epicatechin-induced increases of indicators associated with mitochondrial biogenesis in endothelial cells. (-)-Epicatechin also counteracts the negative effects that high glucose or simulated type 2 diabetes has on endothelial nitric oxide synthase function. © The Author(s) 2016.

  8. Noninvasive CPAP with face mask: comparison among new air-entrainment masks and the Boussignac valve.

    PubMed

    Mistraletti, Giovanni; Giacomini, Matteo; Sabbatini, Giovanni; Pinciroli, Riccardo; Mantovani, Elena S; Umbrello, Michele; Palmisano, Debora; Formenti, Paolo; Destrebecq, Anne L L; Iapichino, Gaetano

    2013-02-01

    The performances of 2 noninvasive CPAP systems (high flow and low flow air-entrainment masks) were compared to the Boussignac valve in 3 different scenarios. Scenario 1: pneumatic lung simulator with a tachypnea pattern (tidal volume 800 mL at 40 breaths/min). Scenario 2: Ten healthy subjects studied during tidal breaths and tachypnea. Scenario 3: Twenty ICU subjects enrolled for a noninvasive CPAP session. Differences between set and effective CPAP level and FIO2, as well as the lowest airway pressure and the pressure swing around the imposed CPAP level, were analyzed. The lowest airway pressure and swing were correlated to the pressure-time product (area of the airway pressure curve below the CPAP level) measured with the simulator. PaO2 was a further performance index in the subjects. Lung simulator: Boussignac FIO2 was 0.54, even if supplied with pure oxygen. The air-entrainment masks had higher swing than the Boussignac (P = .007). Pressure-time product correlated better with pressure swing (Spearman correlation coefficient [ρ] = 0.97) than with lowest airway pressure (ρ = 0.92). In healthy subjects, the high-flow air-entrainment mask showed a lower difference between set and effective FIO2 (P < .001), and lowest airway pressure (P < .001), compared to the Boussignac valve. In all measurements the Boussignac valve showed a higher than imposed CPAP level (P < .001). In ICU subjects the high-flow mask had lower swing than the Boussignac valve (P = .03) with similar PaO2 increase. The high-flow air-entrainment mask showed the best performance in human subjects. During high flow demand, the Boussignac valve delivered lower than expected FIO2 and showed higher dynamic hyper-pressurization than the air-entrainment masks. © 2013 Daedalus Enterprises.
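
    The pressure-time product mentioned above is just the area of the airway pressure curve below the imposed CPAP level; a minimal sketch of that integration, with a hypothetical one-breath pressure trace, follows.

```python
import numpy as np

def pressure_time_product(time_s, paw_cmh2o, cpap_level_cmh2o):
    """Area of the airway pressure curve below the set CPAP level (cmH2O*s).
    Only the portions where airway pressure drops below the imposed level
    contribute, which is why the PTP tracks depressurization during high
    flow demand."""
    deficit = np.clip(cpap_level_cmh2o - paw_cmh2o, 0.0, None)
    return np.trapz(deficit, time_s)

# Hypothetical breath: CPAP set at 10 cmH2O, pressure dips during inspiration.
t = np.linspace(0.0, 1.5, 300)
paw = 10.0 - 3.0 * np.exp(-((t - 0.4) / 0.15) ** 2)   # assumed pressure swing
print("PTP:", pressure_time_product(t, paw, 10.0), "cmH2O*s")
```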

  9. Atomization of coal water mixtures: evaluation of fuel nozzles and a cellulose gum simulant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosfjord, T.J.

    1985-03-01

    An experimental evaluation of four air-assist fuel nozzles has been conducted to determine atomization levels of coal-water mixture (CWM) fuels at operating conditions simulating a high pressure combustor. Two of the nozzles were commercial units marketed for use in atmospheric burners, while two nozzles were specially designed for CWM operation in a high pressure combustor. Sprays from all four injectors were characterized in tests performed over a range of liquid and air flowrates. Most of the tests were performed using a cellulose-gum water solution prepared to match the viscosity and drip characteristics of an available CWM. Atomization data acquired from a limited test series using the CWM were found to be properly represented by the gum solution data. High levels of atomization (SMD about 10 micron) were achieved by two of the nozzles - one commercial unit and one special unit - at an assist airflow level corresponding to a nozzle air-fuel ratio between 0.6 and 0.8.

  10. High Fidelity BWR Fuel Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Su Jong

    This report describes the Consortium for Advanced Simulation of Light Water Reactors (CASL) work conducted for completion of the Thermal Hydraulics Methods (THM) Level 3 milestone THM.CFD.P13.03: High Fidelity BWR Fuel Simulation. High fidelity computational fluid dynamics (CFD) simulation for a Boiling Water Reactor (BWR) was conducted to investigate the applicability and robustness of BWR closures. As a preliminary study, a CFD model with simplified Ferrule spacer grid geometry of the NUPEC BWR Full-size Fine-mesh Bundle Test (BFBT) benchmark has been implemented. Performance of the multiphase segregated solver with baseline boiling closures has been evaluated. Although the mean values of void fraction and exit quality of the CFD result for BFBT case 4101-61 agreed with experimental data, the local void distribution was not predicted accurately. The mesh quality was one of the critical factors in obtaining a converged result. The stability and robustness of the simulation were mainly affected by the mesh quality and the combination of BWR closure models. In addition, CFD modeling of the fully detailed spacer grid geometry with mixing vanes is necessary for improving the accuracy of the CFD simulation.

  11. Protecting High Energy Barriers: A New Equation to Regulate Boost Energy in Accelerated Molecular Dynamics Simulations.

    PubMed

    Sinko, William; de Oliveira, César Augusto F; Pierce, Levi C T; McCammon, J Andrew

    2012-01-10

    Molecular dynamics (MD) is one of the most common tools in computational chemistry. Recently, our group has employed accelerated molecular dynamics (aMD) to improve the conformational sampling over conventional molecular dynamics techniques. In the original aMD implementation, sampling is greatly improved by raising energy wells below a predefined energy level. Recently, our group presented an alternative aMD implementation where simulations are accelerated by lowering energy barriers of the potential energy surface. When coupled with thermodynamic integration simulations, this implementation showed very promising results. However, when applied to large systems, such as proteins, the simulation tends to be biased to high energy regions of the potential landscape. The reason for this behavior lies in the boost equation used since the highest energy barriers are dramatically more affected than the lower ones. To address this issue, in this work, we present a new boost equation that prevents oversampling of unfavorable high energy conformational states. The new boost potential provides not only better recovery of statistics throughout the simulation but also enhanced sampling of statistically relevant regions in explicit solvent MD simulations.
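
    For context, the widely used original aMD boost potential (which raises wells lying below a threshold energy E) can be written in a few lines. This is the earlier form referred to in the abstract, not the new boost equation proposed by the authors, and the toy potential and parameters are invented for illustration.

```python
import numpy as np

def amd_boost(V, E, alpha):
    """Original accelerated-MD boost (Hamelberg, Mongan, McCammon, 2004).
    Where the instantaneous potential energy V falls below the threshold E,
    the well is raised by dV = (E - V)^2 / (alpha + E - V); above E the
    potential is left unmodified."""
    V = np.asarray(V, dtype=float)
    dV = np.zeros_like(V)
    mask = V < E
    dV[mask] = (E - V[mask]) ** 2 / (alpha + E - V[mask])
    return V + dV

# Toy 1-D double-well potential to show how wells are raised toward E,
# lowering the effective barrier between minima.
x = np.linspace(-2.0, 2.0, 401)
V = (x ** 2 - 1.0) ** 2                 # hypothetical potential, minima at +/-1
V_boosted = amd_boost(V, E=0.6, alpha=0.2)
barrier = lambda v: v[np.argmin(np.abs(x))] - v.min()
print("barrier before:", barrier(V), "after:", barrier(V_boosted))
```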

  12. Direct simulation Monte Carlo prediction of on-orbit contaminant deposit levels for HALOE

    NASA Technical Reports Server (NTRS)

    Woronowicz, Michael S.; Rault, Didier F. G.

    1994-01-01

    A three-dimensional version of the direct simulation Monte Carlo method is adapted to assess the contamination environment surrounding a highly detailed model of the Upper Atmosphere Research Satellite. Emphasis is placed on simulating a realistic, worst-case set of flow field and surface conditions and geometric orientations for the satellite in order to estimate an upper limit for the cumulative level of volatile organic molecular deposits at the aperture of the Halogen Occultation Experiment. A detailed description of the adaptation of this solution method to the study of the satellite's environment is also presented. Results pertaining to the satellite's environment are presented regarding contaminant cloud structure, cloud composition, and statistics of simulated molecules impinging on the target surface, along with data related to code performance. Using procedures developed in standard contamination analyses, along with many worst-case assumptions, the cumulative upper-limit level of volatile organic deposits on HALOE's aperture over the instrument's 35-month nominal data collection period is estimated at about 13,350 A.

  13. SRB thermal protection systems materials test results in an arc-heated nitrogen environment

    NASA Technical Reports Server (NTRS)

    Wojciechowski, C. J.

    1979-01-01

    The external surface of the Solid Rocket Booster (SRB) will experience imposed thermal and shear environments due to aerodynamic heating and radiation heating during launch, staging and reentry. This report is concerned with the performance of the various TPS materials during the staging maneuver. During staging, the wash from the Space Shuttle Main Engine (SSME) exhaust plumes imposes severe, short-duration thermal environments on the SRB. Five different SRB TPS materials were tested in the 1 MW Arc Plasma Generator (APG) facility. The maximum simulated heating rate obtained in the APG facility was 248 Btu/sq ft/sec; however, the test duration was such that the total heat exceeded the value to be simulated. Similarly, some local high shear stress levels of 0.04 psia were not simulated. Most of the SSME plume impingement area on the SRB experiences shear stress levels of 0.02 psia and lower. The shear stress levels on the test specimens were between 0.021 and 0.008 psia. The SSME plume stagnation conditions were also simulated.

  14. Simulation of three-phase induction motor drives using indirect field oriented control in PSIM environment

    NASA Astrophysics Data System (ADS)

    Aziri, Hasif; Patakor, Fizatul Aini; Sulaiman, Marizan; Salleh, Zulhisyam

    2017-09-01

    This paper presents the simulation of three-phase induction motor drives using Indirect Field Oriented Control (IFOC) in the PSIM environment. The asynchronous machine is well known for its inherent limitations of high nonlinearity and a complex motor model. In order to address these problems, IFOC is applied to control the instantaneous electrical quantities, namely the torque and flux components. As FOC controls the stator current, represented as a vector, the torque component is aligned with the d coordinate while the flux component is aligned with the q coordinate. Five levels of the incremental system are gradually built up to verify and test the software modules in the system. All of the system build levels are verified and successfully tested in the PSIM environment. Moreover, the corresponding system of five build levels is simulated in the PSIM environment, which is user-friendly for simulation studies, in order to explore the performance of speed responses based on the IFOC algorithm for three-phase induction motor drives.
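
    As background to field-oriented control, the sketch below shows the Clarke and Park transforms that map measured three-phase stator currents into a rotating reference frame where the flux- and torque-producing components can be regulated independently (the common convention places flux on the d-axis and torque on the q-axis). The numbers are hypothetical and the snippet is generic, not the PSIM model of the paper.

```python
import numpy as np

def clarke(i_a, i_b, i_c):
    """Clarke transform: three-phase stator currents -> stationary alpha/beta frame."""
    i_alpha = (2.0 / 3.0) * (i_a - 0.5 * i_b - 0.5 * i_c)
    i_beta = (2.0 / 3.0) * (np.sqrt(3.0) / 2.0) * (i_b - i_c)
    return i_alpha, i_beta

def park(i_alpha, i_beta, theta):
    """Park transform: rotate alpha/beta into the rotor-flux (d/q) frame."""
    i_d = i_alpha * np.cos(theta) + i_beta * np.sin(theta)
    i_q = -i_alpha * np.sin(theta) + i_beta * np.cos(theta)
    return i_d, i_q

# Hypothetical balanced three-phase currents sampled at one instant.
t, f, I = 0.002, 50.0, 10.0
w = 2 * np.pi * f
i_a = I * np.cos(w * t)
i_b = I * np.cos(w * t - 2 * np.pi / 3)
i_c = I * np.cos(w * t + 2 * np.pi / 3)
i_d, i_q = park(*clarke(i_a, i_b, i_c), theta=w * t)
print(f"flux-producing (d) current: {i_d:.2f} A, torque-producing (q) current: {i_q:.2f} A")
```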

  15. Digital-model simulation of the glacial-outwash aquifer, Otter Creek-Dry Creek basin, Cortland County, New York

    USGS Publications Warehouse

    Cosner, O.J.; Harsh, J.F.

    1978-01-01

    The city of Cortland, New York, and surrounding areas obtain water from the highly productive glacial-outwash aquifer underlying the Otter Creek-Dry Creek basin. Pumpage from the aquifer in 1976 was approximately 6.3 million gallons per day and is expected to increase as a result of population growth and urbanization. A digital ground-water model that uses a finite-difference approximation technique to solve partial differential equations of flow through a porous medium was used to simulate the movement of water within the aquifer. The model was calibrated to equilibrium conditions by comparing water levels measured in the aquifer in March 1976 with those computed by the model. Then, from the simulated water-level surface for March, a transient-condition run was made to simulate the surface as measured in September 1976. Computed water levels presented as contours are generally in close agreement with potentiometric-surface maps prepared from field measurements of March and September 1976. (Woodard-USGS)
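
    For readers unfamiliar with the technique, the following toy sketch solves steady-state two-dimensional confined groundwater flow by finite differences with fixed-head boundaries and uniform recharge. The grid spacing, transmissivity, and recharge values are assumed, and the simple Jacobi iteration stands in for, but is not, the calibrated USGS model described above.

```python
import numpy as np

def solve_steady_head(transmissivity, recharge, head_bc, n_iter=5000):
    """Jacobi finite-difference solution of steady 2-D confined flow,
    T * laplacian(h) = -R, on a uniform grid with fixed-head boundaries."""
    h = head_bc.copy()
    dx = 100.0  # assumed uniform grid spacing, m
    for _ in range(n_iter):
        h[1:-1, 1:-1] = (h[:-2, 1:-1] + h[2:, 1:-1] + h[1:-1, :-2] + h[1:-1, 2:]
                         + recharge[1:-1, 1:-1] * dx ** 2 / transmissivity) / 4.0
    return h

n = 21
head_bc = np.full((n, n), 100.0)       # fixed 100 m head on all boundaries
recharge = np.full((n, n), 1.0e-8)     # assumed areal recharge, m/s
h = solve_steady_head(transmissivity=1.0e-3, recharge=recharge, head_bc=head_bc)
print("peak head rise above boundary (m):", round(h.max() - 100.0, 3))
```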

  16. Exploring the use of high-fidelity simulation training to enhance clinical skills.

    PubMed

    Ann Kirkham, Lucy

    2018-02-07

    The use of interprofessional simulation training to enhance nursing students' performance of technical and non-technical clinical skills is becoming increasingly common. Simulation training can involve the use of role play, virtual reality or patient simulator manikins to replicate clinical scenarios and assess the nursing student's ability to, for example, undertake clinical observations or work as part of a team. Simulation training enables nursing students to practise clinical skills in a safe environment. Effective simulation training requires extensive preparation, and debriefing is necessary following a simulated training session to review any positive or negative aspects of the learning experience. This article discusses a high-fidelity simulated training session that was used to assess a group of third-year nursing students and foundation level 1 medical students. This involved the use of a patient simulator manikin in a scenario that required the collaborative management of a deteriorating patient. ©2018 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.

  17. Future Wave Height Situation estimated by the Latest Climate Scenario around Funafuti Atoll, Tuvalu

    NASA Astrophysics Data System (ADS)

    Sato, D.; Yokoki, H.; Kuwahara, Y.; Yamano, H.; Kayanne, H.; Okajima, H.; Kawamiya, M.

    2012-12-01

    Sea-level rise due to global warming is a significant phenomenon for coastal regions worldwide. Atoll islands in particular, being low-lying and narrow, are highly vulnerable to sea-level rise. Recently, an improved future climate projection (MIROC-ESM) was provided by JAMSTEC, adopting the latest climate scenarios based on the Representative Concentration Pathways (RCPs) of greenhouse gases. A wave field simulation including the latest sea-level rise pathway from MIROC-ESM was conducted to understand the change of significant wave heights in Funafuti Atoll, Tuvalu, an important factor in managing coastal protection. MIROC-ESM provides monthly sea surface height on a fine global grid (1.5 degrees near the equator). The wave field simulation was conducted using the RCP4.5 climate scenario, in which the radiative forcing at the end of the 21st century is stabilized at 4.5 W/m2. The sea-level rise rate for every 10 years was calculated based on the historical data set from 1850 to 2005 and the estimated data set from 2006 to 2100. In that case, the sea level increases by 10 cm after 100 years. In this study, the numerical simulation of the wave field under this rate of sea-level rise was carried out using the SWAN model. The wave and wind conditions around Funafuti Atoll are characterized by two seasons, the trade wind season (Apr.-Nov.) and the non-trade wind season (Jan.-Mar., Dec.). We therefore set up two seasonal boundary conditions for a one-year simulation, calculated from ECMWF reanalysis data. The simulated significant wave heights are analyzed through the increase rate (%) calculated from the base results (average for 2000-2005) and the results for 2100. The calculated increase rate of significant wave height for both seasons was extremely high on the reef flat. The maximum increase rates for the trade and non-trade wind seasons were 1817% and 686%, respectively. The southern part of the atoll has a high increase rate through both seasons. In the non-trade wind season, the northern tip and the southern part of the island showed higher increase rates on the lagoon-side coasts, about 7%, with an average rate of 3.4%. On the other hand, the average rate in the trade wind season was 5.0%. The ocean-side coast has a high increase rate through both seasons. In particular, a very large rate was calculated locally in the northern part of Fongafale Island. The DEM data for the middle of Fongafale Island, the most populated area of the island, showed that the northern oceanic coast has a wide and high storm ridge and that the increase rate was extremely large there. On such coasts, sea-level rise due to global warming has the same effect as storm surge due to tropical cyclones in terms of raising the sea level, although their time scales differ. Thus we consider that the calculated areas with large increase rates have already experienced high waves due to tropical cyclones, which enabled the construction of the wide and high storm ridge. This result indicates that effective coastal management under sea-level rise requires understanding not only quantitative estimates of the future situation but also the protective potential built up by present wave and wind conditions.
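
    The increase rate used above is a simple relative change of significant wave height between the 2100 results and the 2000-2005 base period; a minimal sketch with made-up wave heights shows why small base heights on the reef flat yield very large percentages.

```python
def increase_rate(h_future, h_base):
    """Relative increase (%) of significant wave height, comparing a future
    result against the base-period value."""
    return 100.0 * (h_future - h_base) / h_base

# Hypothetical values (m): a reef-flat point where a small base height makes
# the relative increase very large, and an ocean-side point with a modest change.
print(increase_rate(0.55, 0.03))   # reef flat: ~1733 %
print(increase_rate(2.10, 2.00))   # ocean side: 5 %
```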

  18. Atomistic Simulation and Electronic Structure of Lithium Doped Ionic Liquids: Structure, Transport, and Electrochemical Stability

    NASA Technical Reports Server (NTRS)

    Haskins, Justin B.; Bauschlicher, Charles W.; Lawson, John W.

    2015-01-01

    Zero-temperature density functional theory (DFT), density functional theory molecular dynamics (DFT-MD), and classical molecular dynamics using polarizable force fields (PFF-MD) are employed to evaluate the influence of the lithium ion on the structure, transport, and electrochemical stability of three potential ionic liquid electrolytes: N-methyl-N-butylpyrrolidinium bis(trifluoromethanesulfonyl)imide ([pyr14][TFSI]), N-methyl-N-propylpyrrolidinium bis(fluorosulfonyl)imide ([pyr13][FSI]), and 1-ethyl-3-methylimidazolium boron tetrafluoride ([EMIM][BF4]). We characterize the lithium ion solvation shell through zero-temperature DFT simulations of [Li(Anion)_n]^(n-1)- clusters, DFT-MD simulations of isolated lithium ions in small ionic liquid systems, and PFF-MD simulations with high Li-doping levels in large ionic liquid systems. At low levels of Li-salt doping, highly stable solvation shells having 2-3 anions are seen in both [pyr14][TFSI] and [pyr13][FSI], while solvation shells with 4 anions dominate in [EMIM][BF4]. At higher levels of doping, we find the formation of complex Li-network structures that increase the frequency of 4-anion-coordinated solvation shells. A comparison of computational and experimental Raman spectra for a wide range of [Li(Anion)_n]^(n-1)- clusters shows that our proposed structures are consistent with experiment. We estimate the ion diffusion coefficients and quantify both size and simulation time effects. We find estimates of lithium ion diffusion are of a reasonable order of magnitude and can be corrected for simulation time effects. Simulation size, on the other hand, is also important, with diffusion coefficients from long PFF-MD simulations of small cells having 20-40% error compared to large-cell values. Finally, we compute the electrochemical window using differences in electronic energy levels of both isolated cation/anion pairs and small ionic liquid systems with Li-salt doping. The single pair and liquid-phase systems provide similar estimates of the electrochemical window, while Li-doping in the liquid-phase systems results in electrochemical windows little changed from the neat systems. Pure and hybrid functionals systematically provide an upper and lower bound, respectively, to the experimental electrochemical window for the systems studied here.
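
    As a generic illustration of how ion diffusion coefficients are typically extracted from MD trajectories, the sketch below applies the Einstein relation to mean-squared displacements. The random-walk trajectory, time step, and fit window are assumptions for demonstration, not the analysis pipeline or data of the paper.

```python
import numpy as np

def diffusion_coefficient(positions, dt, fit_range=(0.2, 0.5)):
    """Estimate a self-diffusion coefficient via the Einstein relation,
    D = lim_{t->inf} MSD(t) / (6 t) in three dimensions.
    positions: unwrapped coordinates, shape (n_frames, n_atoms, 3)."""
    n_frames = positions.shape[0]
    lags = np.arange(1, n_frames // 2)
    msd = np.array([
        np.mean(np.sum((positions[lag:] - positions[:-lag]) ** 2, axis=-1))
        for lag in lags
    ])
    t = lags * dt
    # Fit only the (assumed) linear regime, skipping the short-time part.
    lo, hi = (int(f * len(t)) for f in fit_range)
    slope = np.polyfit(t[lo:hi], msd[lo:hi], 1)[0]
    return slope / 6.0

# Toy random-walk trajectory standing in for lithium ions in the liquid.
rng = np.random.default_rng(2)
steps = rng.normal(scale=0.05, size=(2000, 10, 3))   # nm per 1-ps frame (assumed)
traj = np.cumsum(steps, axis=0)
print("D ~", diffusion_coefficient(traj, dt=1.0), "nm^2/ps")
```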

  19. Meteorological Simulations of Ozone Episode Case Days during the 1996 Paso del Norte Ozone Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, M.J.; Costigan, K.; Muller, C.

    1999-02-01

    Meteorological simulations centered around the border cities of El Paso and Ciudad Juarez have been performed for an ozone episode that occurred on Aug. 13, 1996 during the 1996 Paso del Norte Ozone Study field campaign. Simulations were performed using the HOTMAC mesoscale meteorological model with a nested mesh system of 1, 2, 4, and 8 km horizontal grid sizes. Investigation of the vertical structure and evolution of the atmospheric boundary layer for the Aug. 11-13 time period is emphasized in this paper. Comparison of model-produced wind speed profiles to rawinsonde and radar profiler measurements shows reasonable agreement. A persistent upper-level jet was captured in the model simulations through data assimilation. In the evening hours, the model was not able to produce the strong wind direction shear seen in the radar wind profiles. Based on virtual potential temperature profile comparisons, the model appears to correctly simulate the daytime growth of the convective mixed layer. However, the model underestimates the cooling of the surface layer at night. We found that the upper-level jet significantly impacted the turbulence structure of the boundary layer, leading to relatively high turbulent kinetic energy (TKE) values aloft at night. The model indicates that these high TKE values aloft enhance the mid-morning growth of the boundary layer. No upper-level turbulence measurements were available to verify this finding, however. Radar profiler-derived mixing heights do indicate relatively rapid morning growth of the mixed layer.

  20. Constrained Local UniversE Simulations: a Local Group factory

    NASA Astrophysics Data System (ADS)

    Carlesi, Edoardo; Sorce, Jenny G.; Hoffman, Yehuda; Gottlöber, Stefan; Yepes, Gustavo; Libeskind, Noam I.; Pilipenko, Sergey V.; Knebe, Alexander; Courtois, Hélène; Tully, R. Brent; Steinmetz, Matthias

    2016-05-01

    Near-field cosmology is practised by studying the Local Group (LG) and its neighbourhood. This paper describes a framework for simulating the `near field' on the computer. Assuming the Λ cold dark matter (ΛCDM) model as a prior and applying the Bayesian tools of the Wiener filter and constrained realizations of Gaussian fields to the Cosmicflows-2 (CF2) survey of peculiar velocities, constrained simulations of our cosmic environment are performed. The aim of these simulations is to reproduce the LG and its local environment. Our main result is that the LG is likely a robust outcome of the ΛCDM scenario when subjected to the constraint derived from CF2 data, emerging in an environment akin to the observed one. Three levels of criteria are used to define the simulated LGs. At the base level, pairs of haloes must obey specific isolation, mass and separation criteria. At the second level, the orbital angular momentum and energy are constrained, and on the third one the phase of the orbit is constrained. Out of the 300 constrained simulations, 146 LGs obey the first set of criteria, 51 the second and 6 the third. The robustness of our LG `factory' enables the construction of a large ensemble of simulated LGs. Suitable candidates for high-resolution hydrodynamical simulations of the LG can be drawn from this ensemble, which can be used to perform comprehensive studies of the formation of the LG.

  1. Simulation of long-range transport aerosols from the Asian Continent to Taiwan by a southward Asian high-pressure system.

    PubMed

    Chuang, Ming-Tung; Fu, Joshua S; Jang, Carey J; Chan, Chang-Chuan; Ni, Pei-Cheng; Lee, Chung-Te

    2008-11-15

    Aerosol is frequently transported by southward high-pressure systems from the Asian Continent to Taiwan, and a 100% increase in mass level on event days compared to non-event days was recorded from 2002 to 2005. During this time period, PM2.5 sulfate was found to increase by as much as 155% on event days as compared to non-event days. In this study, Asian emission estimations, the Taiwan Emission Database System (TEDS), and meteorological simulation results from the fifth-generation Mesoscale Model (MM5) were used as inputs for the Community Multiscale Air Quality (CMAQ) model to simulate a long-range transport PM2.5 event in a southward high-pressure system from the Asian Continent to Taiwan. The simulated aerosol mass levels and the associated aerosol components were found to be within reasonable accuracy. During the transport process, the percentage of semi-volatile PM2.5 organic carbon in the PM2.5 plume only slightly decreased, from 22-24% in Shanghai to 21% near Taiwan. However, the percentage of PM2.5 nitrate in PM2.5 decreased from 16-25% to 1%. In contrast, the percentage of PM2.5 sulfate in PM2.5 increased from 16-19% to 35%. It is interesting to note that the percentages of PM2.5 ammonium and PM2.5 elemental carbon in PM2.5 remained nearly constant. Simulation results revealed that transported pollutants dominate the air quality in Taipei when the southward high-pressure system moves to Taiwan. This condition demonstrates the dynamic chemical transformation of pollutants during the transport process from their continental origin, over the sea, and to the downwind land.

  2. Molecular Modeling of Water Interfaces: From Molecular Spectroscopy to Thermodynamics.

    PubMed

    Nagata, Yuki; Ohto, Tatsuhiko; Backus, Ellen H G; Bonn, Mischa

    2016-04-28

    Understanding aqueous interfaces at the molecular level is not only fundamentally important, but also highly relevant for a variety of disciplines. For instance, electrode-water interfaces are relevant for electrochemistry, as are mineral-water interfaces for geochemistry and air-water interfaces for environmental chemistry; water-lipid interfaces constitute the boundaries of the cell membrane, and are thus relevant for biochemistry. One of the major challenges in these fields is to link macroscopic properties such as interfacial reactivity, solubility, and permeability as well as macroscopic thermodynamic and spectroscopic observables to the structure, structural changes, and dynamics of molecules at these interfaces. Simulations, by themselves, or in conjunction with appropriate experiments, can provide such molecular-level insights into aqueous interfaces. In this contribution, we review the current state-of-the-art of three levels of molecular dynamics (MD) simulation: ab initio, force field, and coarse-grained. We discuss the advantages, the potential, and the limitations of each approach for studying aqueous interfaces, by assessing computations of the sum-frequency generation spectra and surface tension. The comparison of experimental and simulation data provides information on the challenges of future MD simulations, such as improving the force field models and the van der Waals corrections in ab initio MD simulations. Once good agreement between experimental observables and simulation can be established, the simulation can be used to provide insights into the processes at a level of detail that is generally inaccessible to experiments. As an example we discuss the mechanism of the evaporation of water. We finish by presenting an outlook outlining four future challenges for molecular dynamics simulations of aqueous interfacial systems.

  3. Great Britain Storm Surge Modeling for a 10,000-Year Stochastic Catalog with the Effect of Sea Level Rise

    NASA Astrophysics Data System (ADS)

    Keshtpoor, M.; Carnacina, I.; Blair, A.; Yablonsky, R. M.

    2017-12-01

    Storm surge caused by Extratropical Cyclones (ETCs) has significantly impacted not only the lives of private citizens but also the insurance and reinsurance industry in Great Britain. Storm surge risk assessment requires a larger dataset of storms than the limited recorded historical ETCs. Thus, historical ETCs were perturbed to generate a 10,000-year stochastic catalog that accounts for surge-generating ETCs in the study area with return periods from one year to 10,000 years. The Delft3D Flexible Mesh hydrodynamic model was used to numerically simulate storm surge along the Great Britain coastline. A nested grid technique was used to increase the simulation grid resolution up to 200 m near the highly populated coastal areas. Coarse and fine mesh models were calibrated and validated using historically recorded water elevations. Numerical simulations were then performed on the 10,000-year stochastic catalog, and 50-, 100-, and 500-year return period maps were generated for Great Britain coastal areas. The corresponding events with return periods of 50, 100, and 500 years in the Humber Bay and Thames River coastal areas were identified and simulated with projected sea level rise to reveal the effect of rising sea levels on the inundation return period maps in two highly populated coastal areas. Finally, the return period of Storm Xaver (2013) was determined with and without the effect of rising sea levels.
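
    The core of a return period map is an exceedance calculation over the stochastic catalog; a minimal sketch with a made-up catalog of annual maximum surge heights follows. The Gumbel parameters and the simple empirical ranking are assumptions, not the methodology used in the study.

```python
import numpy as np

def return_level(annual_maxima, return_period_years):
    """Empirical return level from a catalog of annual maximum surge heights:
    the level exceeded on average once every return_period_years."""
    sorted_max = np.sort(annual_maxima)[::-1]            # descending
    n_years = len(annual_maxima)
    rank = int(np.floor(n_years / return_period_years))  # ~number of exceedances
    return sorted_max[max(rank - 1, 0)]

# Hypothetical 10,000-year catalog of annual maximum surge heights (m).
rng = np.random.default_rng(3)
catalog = rng.gumbel(loc=1.0, scale=0.4, size=10_000)
for T in (50, 100, 500):
    print(f"{T}-year surge level: {return_level(catalog, T):.2f} m")
```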

  4. Accuracy of Handheld Blood Glucose Meters at High Altitude

    PubMed Central

    de Vries, Suzanna T.; Fokkert, Marion J.; Dikkeschei, Bert D.; Rienks, Rienk; Bilo, Karin M.; Bilo, Henk J. G.

    2010-01-01

    Background Due to increasing numbers of people with diabetes taking part in extreme sports (e.g., high-altitude trekking), reliable handheld blood glucose meters (BGMs) are necessary. Accurate blood glucose measurement under extreme conditions is paramount for safe recreation at altitude. Prior studies reported bias in blood glucose measurements using different BGMs at high altitude. We hypothesized that glucose-oxidase based BGMs are more influenced by the lower atmospheric oxygen pressure at altitude than glucose dehydrogenase based BGMs. Methodology/Principal Findings Glucose measurements at simulated altitude of nine BGMs (six glucose dehydrogenase and three glucose oxidase BGMs) were compared to glucose measurement on a similar BGM at sea level and to a laboratory glucose reference method. Venous blood samples of four different glucose levels were used. Moreover, two glucose oxidase and two glucose dehydrogenase based BGMs were evaluated at different altitudes on Mount Kilimanjaro. Accuracy criteria were set at a bias <15% from reference glucose (when >6.5 mmol/L) and <1 mmol/L from reference glucose (when <6.5 mmol/L). No significant difference was observed between measurements at simulated altitude and sea level for either glucose oxidase based BGMs or glucose dehydrogenase based BGMs as a group phenomenon. Two GDH based BGMs did not meet set performance criteria. Most BGMs are generally overestimating true glucose concentration at high altitude. Conclusion At simulated high altitude all tested BGMs, including glucose oxidase based BGMs, did not show influence of low atmospheric oxygen pressure. All BGMs, except for two GDH based BGMs, performed within predefined criteria. At true high altitude one GDH based BGM had best precision and accuracy. PMID:21103399
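
    The accuracy criteria stated above translate directly into a small check; the paired readings in the example are invented for illustration.

```python
def meets_accuracy_criterion(meter_mmol_l, reference_mmol_l):
    """Accuracy criterion from the study above: bias within 15% of the
    reference when the reference exceeds 6.5 mmol/L, otherwise within 1 mmol/L."""
    bias = meter_mmol_l - reference_mmol_l
    if reference_mmol_l > 6.5:
        return abs(bias) < 0.15 * reference_mmol_l
    return abs(bias) < 1.0

# Hypothetical paired readings (meter, laboratory reference) in mmol/L.
pairs = [(5.2, 4.8), (9.8, 8.2), (12.1, 11.4), (3.9, 3.0)]
for meter, ref in pairs:
    print(meter, ref, "pass" if meets_accuracy_criterion(meter, ref) else "fail")
```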

  5. Comparing self-guided learning and educator-guided learning formats for simulation-based clinical training.

    PubMed

    Brydges, Ryan; Carnahan, Heather; Rose, Don; Dubrowski, Adam

    2010-08-01

    In this paper, we tested the over-arching hypothesis that progressive self-guided learning offers equivalent learning benefit vs. proficiency-based training while limiting the need to set proficiency standards. We have shown that self-guided learning is enhanced when students learn on simulators that progressively increase in fidelity during practice. Proficiency-based training, a current gold-standard training approach, requires achievement of a criterion score before students advance to the next learning level. Baccalaureate nursing students (n = 15/group) practised intravenous catheterization using simulators that differed in fidelity (i.e. students' perceived realism). Data were collected in 2008. Proficiency-based students advanced from low- to mid- to high-fidelity after achieving a proficiency criterion at each level. Progressive students self-guided their progression from low- to mid- to high-fidelity. Yoked control students followed an experimenter-defined progressive practice schedule. Open-ended students moved freely between the simulators. One week after practice, blinded experts evaluated students' skill transfer on a standardized patient simulation. Group differences were examined using analyses of variance. Proficiency-based students scored highest on the high-fidelity post-test (effect size = 1.22). An interaction effect showed that the Progressive and Open-ended groups maintained their performance from post-test to transfer test, whereas the Proficiency-based and Yoked control groups experienced a significant decrease (P < 0.05). Surprisingly, most Open-ended students (73%) chose the progressive practice schedule. Progressive training and proficiency-based training resulted in equivalent transfer test performance, suggesting that progressive students effectively self-guided when to transition between simulators. Students' preference for the progressive practice schedule indicates that educators should consider this sequence for simulation-based training.

  6. Protective effect of total flavonoids of seabuckthorn (Hippophae rhamnoides) in simulated high-altitude polycythemia in rats.

    PubMed

    Zhou, Ji-Yin; Zhou, Shi-Wen; Du, Xiao-Huang; Zeng, Sheng-Ya

    2012-09-28

    Seabuckthorn (Hippophae rhamnoides L.) has been used to treat high altitude diseases. The effects of five-week treatment with total flavonoids of seabuckthorn (35, 70, 140 mg/kg, ig) on cobalt chloride (5.5 mg/kg, ip)- and hypobaric chamber (simulating 5,000 m)-induced high-altitude polycythemia in rats were measured. Total flavonoids decreased red blood cell number, hemoglobin, hematocrit, mean corpuscular hemoglobin levels, span of red blood cell electrophoretic mobility, aggregation index of red blood cell, plasma viscosity, whole blood viscosity, and increased deformation index of red blood cell, erythropoietin level in serum. Total flavonoids increased pH, pO₂, Sp(O₂), pCO₂ levels in arterial blood, and increased Na⁺, HCO₃⁻, Cl⁻, but decreased K⁺ concentrations. Total flavonoids increased mean arterial pressure, left ventricular systolic pressure, end-diastolic pressure, maximal rate of rise and decrease, decreased heart rate and protected right ventricle morphology. Changes in hemodynamic, hematologic parameters, and erythropoietin content suggest that administration of total flavonoids from seabuckthorn may be useful in the prevention of high altitude polycythaemia in rats.

  7. Evaluation of Environmentally Assisted Cracking of Armour Wires in Flexible Pipes, Power Cables and Umbilicals

    NASA Astrophysics Data System (ADS)

    Zhang, Zhiying

    Environmentally assisted cracking (EAC) of armour wires in flexible pipes, power cables and umbilicals is a major concern with the development of oil and gas fields and wind farms in harsh environments. Hydrogen induced cracking (HIC) or hydrogen embrittlement (HE) of steel armour wires used in deep-water and ultra-deep-water has been evaluated. Simulated tests have been carried out in simulated sea water, under conditions where the susceptibility is the highest, i.e. at room temperature, at the maximum negative cathodic potential and at the maximum stress level expected in service for 150 hours. Examinations of the tested specimens have not revealed cracking or blistering, and measurement of hydrogen content has confirmed hydrogen charging. In addition, sulphide stress cracking (SSC) and chloride stress cracking (CSC) of nickel-based alloy armour wires used in harsh down-hole environments has been evaluated. Simulated tests have been carried out in simulated solution containing high concentration of chloride, with high hydrogen sulphide partial pressure, at high stress level and at 120 °C for 720 hours. Examinations of the tested specimens have not revealed cracking or blistering. Subsequent tensile tests of the tested specimens at ambient pressure and temperature have revealed properties similar to the as-received specimens.

  8. Man-systems evaluation of moving base vehicle simulation motion cues. [human acceleration perception involving visual feedback

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, M.; Brye, R. G.

    1974-01-01

    A motion cue investigation program is reported that deals with human factor aspects of high-fidelity vehicle simulation. General data on non-visual motion thresholds and specific threshold values are established for use as washout parameters in vehicle simulation. A general-purpose simulator is used to test the contradictory cue hypothesis that acceleration sensitivity is reduced during a vehicle control task involving visual feedback. The simulator provides varying acceleration levels. The method of forced choice is based on the theory of signal detectability.

  9. Multispectral simulation environment for modeling low-light-level sensor systems

    NASA Astrophysics Data System (ADS)

    Ientilucci, Emmett J.; Brown, Scott D.; Schott, John R.; Raqueno, Rolando V.

    1998-11-01

    Image intensifying cameras have been found to be extremely useful in low-light-level (LLL) scenarios including military night vision and civilian rescue operations. These sensors utilize the available visible region photons and an amplification process to produce high contrast imagery. It has been demonstrated that processing techniques can further enhance the quality of this imagery. For example, fusion with matching thermal IR imagery can improve image content when very little visible region contrast is available. To aid in the improvement of current algorithms and the development of new ones, a high fidelity simulation environment capable of producing radiometrically correct multi-band imagery for low-light-level conditions is desired. This paper describes a modeling environment attempting to meet these criteria by addressing the task as two individual components: (1) prediction of a low-light-level radiance field from an arbitrary scene, and (2) simulation of the output from a low-light-level sensor for a given radiance field. The radiance prediction engine utilized in this environment is the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model, which is a first-principles-based multi-spectral synthetic image generation model capable of producing an arbitrary number of bands in the 0.28 to 20 micrometer region. The DIRSIG model is utilized to produce high spatial and spectral resolution radiance field images. These images are then processed by a user-configurable multi-stage low-light-level sensor model that applies the appropriate noise and modulation transfer function (MTF) at each stage in the image processing chain. This includes the ability to reproduce common intensifying sensor artifacts such as saturation and 'blooming.' Additionally, co-registered imagery in other spectral bands may be simultaneously generated for testing fusion and exploitation algorithms. This paper discusses specific aspects of the DIRSIG radiance prediction for low-light-level conditions, including the incorporation of natural and man-made sources, which emphasizes the importance of accurate BRDF. A description of the implementation of each stage in the image processing and capture chain for the LLL model is also presented. Finally, simulated images are presented and qualitatively compared to lab-acquired imagery from a commercial system.
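
    As a simplified illustration of one stage of such an image chain, the sketch below applies a Gaussian MTF in the frequency domain and then adds Poisson shot noise and Gaussian read noise. The MTF shape, gain, and noise values are assumptions, and the snippet is generic rather than the DIRSIG or LLL sensor model described above.

```python
import numpy as np

def apply_mtf_and_noise(radiance, mtf_sigma_cycles, gain, read_noise_e, rng):
    """One stage of a simple image-chain model: multiply the image spectrum by
    a Gaussian MTF, convert to photoelectrons, then add Poisson shot noise and
    Gaussian read noise."""
    ny, nx = radiance.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    mtf = np.exp(-(fx ** 2 + fy ** 2) / (2.0 * mtf_sigma_cycles ** 2))
    blurred = np.fft.ifft2(np.fft.fft2(radiance) * mtf).real
    electrons = rng.poisson(np.clip(blurred * gain, 0.0, None)).astype(float)
    return electrons + rng.normal(scale=read_noise_e, size=electrons.shape)

rng = np.random.default_rng(4)
scene = np.zeros((128, 128))
scene[40:60, 40:60] = 5.0      # hypothetical bright target in a dark scene
image = apply_mtf_and_noise(scene, mtf_sigma_cycles=0.15, gain=20.0,
                            read_noise_e=3.0, rng=rng)
print("target mean:", image[45:55, 45:55].mean(),
      "background std:", image[:20, :20].std())
```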

  10. The impact of texting on driver behaviour at rail level crossings.

    PubMed

    Young, Kristie L; Lenné, Michael G; Salmon, Paul M; Stanton, Neville A

    2018-05-21

    A driver text messaging in the vicinity of a rail level crossing represents the merging of a high-risk, high-workload driving environment with a highly distracting secondary task. In this simulator study, we examined how texting impacts driver behaviour on approach to actively controlled urban rail level crossings. Twenty-eight participants drove a series of simulated urban routes containing rail level crossings, while sending text messages and while driving without performing a secondary task. At half of the crossings, drivers were required to respond to the crossing warnings as a train approached. Results revealed that texting on approach to rail level crossings had a detrimental impact on a range of driver behaviour measures. Specifically, texting more than doubled the amount of time spent with eyes off the forward roadway, resulting in drivers spending more than half of their approach time to rail level crossings looking away from the road. This lack of visual attention to the roadway was associated with a range of decrements in driving that may be indicative of a loss of situation awareness, including increased brake reaction time to the crossing warnings and a reduction in lateral position control. The findings have safety implications, not only for urban level crossings, but also for passive level crossings where no warnings are present to re-orient the distracted driver's attention toward an approaching train. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Visualization of the Eastern Renewable Generation Integration Study: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny; Novacheck, Joshua; Bloom, Aaron

    The Eastern Renewable Generation Integration Study (ERGIS) explores the operational impacts of the widespread adoption of wind and solar photovoltaic (PV) resources in the U.S. Eastern Interconnection and Quebec Interconnection (collectively, EI). In order to understand some of the economic and reliability challenges of managing hundreds of gigawatts of wind and PV generation, we developed state-of-the-art tools, data, and models for simulating power system operations using hourly unit commitment and 5-minute economic dispatch over an entire year. Using NREL's high-performance computing capabilities and new methodologies to model operations, we found that the EI, as simulated with evolutionary change in 2026, could balance the variability and uncertainty of wind and PV at a 5-minute level under a variety of conditions. A large-scale display and a combination of multiple coordinated views and small multiples were used to visually analyze the four large, highly multivariate scenarios with high spatial and temporal resolutions.

  12. Teamwork skills in actual, in situ, and in-center pediatric emergencies: performance levels across settings and perceptions of comparative educational impact.

    PubMed

    Couto, Thomaz Bittencourt; Kerrey, Benjamin T; Taylor, Regina G; FitzGerald, Michael; Geis, Gary L

    2015-04-01

    Pediatric emergencies require effective teamwork. These skills are developed and demonstrated in actual emergencies and in simulated environments, including simulation centers (in center) and the real care environment (in situ). Our aims were to compare teamwork performance across these settings and to identify perceived educational strengths and weaknesses between simulated settings. We hypothesized that teamwork performance in actual emergencies and in situ simulations would be higher than for in-center simulations. A retrospective, video-based assessment of teamwork was performed in an academic, pediatric level 1 trauma center, using the Team Emergency Assessment Measure (TEAM) tool (range, 0-44) among emergency department providers (physicians, nurses, respiratory therapists, paramedics, patient care assistants, and pharmacists). A survey-based, cross-sectional assessment was conducted to determine provider perceptions regarding simulation training. One hundred thirty-two videos, 44 from each setting, were reviewed. Mean total TEAM scores were similar and high in all settings (31.2 actual, 31.1 in situ, and 32.3 in-center, P = 0.39). Of 236 providers, 154 (65%) responded to the survey. For teamwork training, in situ simulation was considered more realistic (59% vs. 10%) and more effective (45% vs. 15%) than in-center simulation. In a video-based study in an academic pediatric institution, ratings of teamwork were relatively high among actual resuscitations and 2 simulation settings, substantiating the influence of simulation-based training on instilling a culture of communication and teamwork. On the basis of survey results, providers favored the in situ setting for teamwork training and suggested an expansion of our existing in situ program.

  13. Kalman approach to accuracy management for interoperable heterogeneous model abstraction within an HLA-compliant simulation

    NASA Astrophysics Data System (ADS)

    Leskiw, Donald M.; Zhau, Junmei

    2000-06-01

    This paper reports on results from an ongoing project to develop methodologies for representing and managing multiple, concurrent levels of detail and enabling high performance computing using parallel arrays within distributed object-based simulation frameworks. At this time we present the methodology for representing and managing multiple, concurrent levels of detail and modeling accuracy by using a representation based on the Kalman approach for estimation. The Kalman System Model equations are used to represent model accuracy, Kalman Measurement Model equations provide transformations between heterogeneous levels of detail, and interoperability among disparate abstractions is provided using a form of the Kalman Update equations.
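
    For reference, the standard Kalman measurement update, which the abstract adapts for fusing model abstractions of different fidelity, looks as follows; the one-dimensional example values are hypothetical and this textbook form is not the authors' HLA implementation.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update: fuse a coarse state estimate x
    (covariance P, representing model accuracy) with a measurement z observed
    through H (covariance R)."""
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ (z - H @ x)            # updated state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P   # updated covariance (accuracy)
    return x_new, P_new

# Hypothetical example: a low-detail model tracks position only; a higher-detail
# model reports position with known accuracy, and the update reconciles the two.
x = np.array([10.0])           # coarse model's position estimate
P = np.array([[4.0]])          # its accuracy (variance)
z = np.array([12.5])           # detailed model's "measurement"
H = np.array([[1.0]])
R = np.array([[1.0]])
x_new, P_new = kalman_update(x, P, z, H, R)
print("fused position:", x_new, "fused variance:", P_new)
```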

  14. Technology and equipment based on induction melters with ``cold'' crucible for reprocessing active metal waste

    NASA Astrophysics Data System (ADS)

    Pastushkov, V. G.; Molchanov, A. V.; Serebryakov, V. P.; Smelova, T. V.; Shestoperov, I. N.

    2000-07-01

    The paper discusses specific features of technology, equipment and control of a single stage RAMW decontamination and melting process in an induction furnace equipped with a "cold" crucible. The calculated and experimental data are given on melting high activity level stainless steel and Zr simulating high activity level metal waste. The work is under way in SSC RF VNIINM.

  15. System-Level Reuse of Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Hazen, Michael R.; Williams, Joseph C.

    2004-01-01

    One of the best ways to enhance space systems simulation fidelity is to leverage off of (reuse) existing high-fidelity simulations. But what happens when the model you would like to reuse is in a different coding language or other barriers arise that make one want to just start over with a clean sheet of paper? Three diverse system-level simulation reuse case studies are described based on experience to date in the development of NASA's Space Station Training Facility (SSTF) at the Johnson Space Center in Houston, Texas. Case studies include (a) the Boeing/Rocketdyne-provided Electrical Power Simulation (EPSIM), (b) the NASA Automation and Robotics Division-provided TRICK robotics systems model, and (c) the Russian Space Agency-provided Russian Segment Trainer. In each case, there was an initial tendency to dismiss simulation reuse candidates based on an apparent lack of suitability. A more careful examination based on a more structured assessment of architectural and requirements-oriented representations of the reuse candidates revealed significant reuse potential. Specific steps used to conduct the detailed assessments are discussed. The steps include the following: 1) Identifying reuse candidates; 2) Requirements compatibility assessment; 3) Maturity assessment; 4) Life-cycle cost determination; and 5) Risk assessment. Observations and conclusions are presented related to the real cost of system-level simulation component reuse. Finally, lessons learned that relate to maximizing the benefits of space systems simulation reuse are shared. These concepts should be directly applicable for use in the development of space systems simulations in the future.

  16. Development of the GOSAT-2 FTS-2 Simulator and Preliminary Sensitivity Analysis for CO2 Retrieval

    NASA Astrophysics Data System (ADS)

    Kamei, A.; Yoshida, Y.; Dupuy, E.; Hiraki, K.; Yokota, Y.; Oishi, Y.; Murakami, K.; Morino, I.; Matsunaga, T.

    2013-12-01

    The Greenhouse Gases Observing Satellite-2 (GOSAT-2), which is a successor mission to the GOSAT, is planned to be launched in FY 2017. The Fourier Transform Spectrometer-2 (FTS-2) onboard the GOSAT-2 is a primary sensor to observe infrared light reflected and emitted from the Earth's surface and atmosphere. The FTS-2 obtains high-spectral-resolution spectra in four bands from the near- to short-wavelength infrared (SWIR) region and one band in the thermal infrared (TIR) region. The column amounts of carbon dioxide (CO2) and methane (CH4) are retrieved from the obtained radiance spectra in the SWIR bands. Compared to the FTS onboard the GOSAT, the FTS-2 includes an additional SWIR band to allow for carbon monoxide (CO) measurement. We have been developing a tool, named the GOSAT-2 FTS-2 simulator, which is capable of simulating the spectral radiance data observed by the FTS-2 using the Pstar2 radiative transfer code. The purpose of the GOSAT-2 FTS-2 simulator is to obtain data to be exploited in the sensor specification, the optimization of parameters for Level 1 processing, and the improvement of Level 2 algorithms. The GOSAT-2 FTS-2 simulator, composed of six components: 1) Overall control, 2) Onboarding platform, 3) Spectral radiance calculation, 4) Fourier transform, 5) L1B processing, and 6) L1B data output, has been installed on the GOSAT Research Computation Facility (GOSAT RCF), which is a large-scale, high-performance, and energy-efficient computer. We present the progress in the development of the GOSAT-2 FTS-2 simulator and a preliminary sensitivity analysis of the Level 1 processing for CO2 retrieval, relating to the engineering parameters, the aerosols and clouds, and so on, using data obtained by simulating the FTS-2 SWIR observation with the GOSAT-2 FTS-2 simulator.

  17. A survey of electric and hybrid vehicle simulation programs

    NASA Technical Reports Server (NTRS)

    Bevan, J.; Heimburger, D. A.; Metcalfe, M. A.

    1978-01-01

    Results of a survey conducted within the United States to determine the extent of development and capabilities of automotive performance simulation programs suitable for electric and hybrid vehicle studies are summarized. Altogether, 111 programs were identified as being in a usable state. The complexity of the existing programs spans a range from a page of simple desktop calculator instructions to 300,000 lines of a high-level programming language. The capability to simulate electric vehicles was most common, heat-engines second, and hybrid vehicles least common. Batch-operated programs are slightly more common than interactive ones, and one-third can be operated in either mode. The most commonly used language was FORTRAN, the language typically used by engineers. The higher-level simulation languages (e.g. SIMSCRIPT, GPSS, SIMULA) used by "model builders" were conspicuously lacking.

  18. Situation Awareness and Levels of Automation

    NASA Technical Reports Server (NTRS)

    Kaber, David B.

    1999-01-01

    During the first year of this project, a taxonomy of theoretical levels of automation (LOAs) was applied to an advanced commercial aircraft by categorizing actual modes of McDonnell Douglas MD-11 autoflight system operation in terms of the taxonomy. As well, high LOAs included in the taxonomy (e.g., supervisory control) were modeled in the context of MD-11 autoflight systems through development of a virtual flight simulator. The flight simulator was an integration of a re-configurable simulator developed by the Georgia Institute of Technology and new software prototypes of autoflight system modules found in the MD-11 cockpit. In addition to this work, a version of the Situation Awareness Global Assessment Technique (SAGAT) was developed for application to commercial piloting tasks. A software package was developed to deliver the SAGAT and was integrated with the virtual flight simulator.

  19. Extracting and shaping the light of OLED devices

    NASA Astrophysics Data System (ADS)

    Riedel, Daniel; Dlugosch, Julian; Wehlus, Thomas; Brabec, Christoph

    2015-09-01

    Before the market entry of organic light emitting diodes (OLEDs) into the field of general illumination can occur, limitations in lifetime, luminous efficacy and cost must be overcome. Additional requirements for OLEDs used for general illumination may be imposed by workplace glare reduction requirements, which demand limited luminance for high viewing angles. These requirements contrast with the typical lambertian emission characteristics of OLEDs, which result in the same luminance levels for all emission angles. As a consequence, without additional measures glare reduction could limit the maximum possible luminance of lambertian OLEDs to relatively low levels. However, high luminance levels are still desirable in order to obtain high light output. We are presenting solutions to overcome this dilemma. Therefore this work is focused on light-shaping structures for OLEDs with an internal light extraction layer. Simulations of beam-shaping structures and shapes are presented, followed by experimental measurements to verify the simulations of the most promising structures. An investigation of the loss channels has been carried out and the overall optical system efficiency was evaluated for all structures. The most promising light shaping structures achieve system efficiencies up to 80%. Finally, a general illumination application scenario has been simulated. The number of OLEDs needed to illuminate an office room has been deduced from this scenario. By using light-shaping structures for OLEDs, the number of OLEDs needed to reach the mandatory illuminance level for a workplace environment can be reduced to one third compared to lambertian OLEDs.

  20. A review of numerical simulation of hydrothermal systems.

    USGS Publications Warehouse

    Mercer, J.W.; Faust, C.R.

    1979-01-01

    Many advances in simulating single and two-phase fluid flow and heat transport in porous media have recently been made in conjunction with geothermal energy research. These numerical models reproduce system thermal and pressure behaviour and can be used for other heat-transport problems, such as high-level radioactive waste disposal and heat-storage projects. -Authors

  1. Simulations of material mixing in laser-driven reshock experiments

    NASA Astrophysics Data System (ADS)

    Haines, Brian M.; Grinstein, Fernando F.; Welser-Sherrill, Leslie; Fincke, James R.

    2013-02-01

    We perform simulations of a laser-driven reshock experiment [Welser-Sherrill et al., High Energy Density Phys. (unpublished)] in the strong-shock high energy-density regime to better understand material mixing driven by the Richtmyer-Meshkov instability. Validation of the simulations is based on direct comparison of simulation and radiographic data. Simulations are also compared with published direct numerical simulation and the theory of homogeneous isotropic turbulence. Despite the fact that the flow is neither homogeneous, isotropic nor fully turbulent, there are local regions in which the flow demonstrates characteristics of homogeneous isotropic turbulence. We identify and isolate these regions by the presence of high levels of turbulent kinetic energy (TKE) and vorticity. After reshock, our analysis shows characteristics consistent with those of incompressible isotropic turbulence. Self-similarity and effective Reynolds number assessments suggest that the results are reasonably converged at the finest resolution. Our results show that in shock-driven transitional flows, turbulent features such as self-similarity and isotropy only fully develop once de-correlation, characteristic vorticity distributions, and integrated TKE, have decayed significantly. Finally, we use three-dimensional simulation results to test the performance of two-dimensional Reynolds-averaged Navier-Stokes simulations. In this context, we also test a presumed probability density function turbulent mixing model extensively used in combustion applications.

  2. Evaluation of the capabilities of satellite imagery for monitoring regional air pollution episodes

    NASA Technical Reports Server (NTRS)

    Barnes, J. C.; Bowley, C. J.; Burke, H. H. K.

    1979-01-01

    A comparative analysis of satellite visible-channel imagery and ground-based aerosol measurements is carried out for three cases representing significant pollution episodes, identified by low surface visibility and high sulfate levels. The feasibility of detecting pollution episodes from space is also investigated using a simulation model, and the model results are compared to quantitative information derived from digitized satellite data. The results show that when sulfate levels are greater than or equal to 30 micrograms/cu m, a haze pattern that correlates closely with the area of reported low surface visibilities and high sulfate levels can be detected in satellite visible-channel imagery. The model simulation demonstrates the potential of the satellite to monitor the magnitude and areal extent of pollution episodes. Quantitative information on total aerosol amount derived from the digitized satellite data using the atmospheric radiative transfer model agrees well with the results obtained from the ground-based measurements.

  3. Computer simulation of a single pilot flying a modern high-performance helicopter

    NASA Technical Reports Server (NTRS)

    Zipf, Mark E.; Vogt, William G.; Mickle, Marlin H.; Hoelzeman, Ronald G.; Kai, Fei; Mihaloew, James R.

    1988-01-01

    Presented is a computer simulation of a human response pilot model able to execute operational flight maneuvers and vehicle stabilization of a modern high-performance helicopter. Low-order, single-variable, human response mechanisms, integrated to form a multivariable pilot structure, provide a comprehensive operational control over the vehicle. Evaluations of the integrated pilot were performed by direct insertion into a nonlinear, total-force simulation environment provided by NASA Lewis. Comparisons between the integrated pilot structure and single-variable pilot mechanisms are presented. Static and dynamically alterable configurations of the pilot structure are introduced to simulate pilot activities during vehicle maneuvers. These configurations, in conjunction with higher level, decision-making processes, are considered for use where guidance and navigational procedures, operational mode transfers, and resource sharing are required.

  4. The Time Dependent Propensity Function for Acceleration of Spatial Stochastic Simulation of Reaction-Diffusion Systems

    PubMed Central

    Wu, Sheng; Li, Hong; Petzold, Linda R.

    2015-01-01

    The inhomogeneous stochastic simulation algorithm (ISSA) is a fundamental method for spatial stochastic simulation. However, when diffusion events occur more frequently than reaction events, simulating the diffusion events by ISSA is quite costly. To reduce this cost, we propose to use the time dependent propensity function in each step. In this way we can avoid simulating individual diffusion events, and use the time interval between two adjacent reaction events as the simulation stepsize. We demonstrate that the new algorithm can achieve orders of magnitude efficiency gains over widely-used exact algorithms, scales well with increasing grid resolution, and maintains a high level of accuracy. PMID:26609185
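
    The cost of exact spatial simulation when diffusion dominates can be seen in a bare-bones ISSA-style step, in which every diffusive hop between voxels is drawn as its own event. The sketch below is a hypothetical 1-D illustration of that baseline, not the authors' code; `n_voxels`, `k_deg` and `d_jump` are assumed example parameters.

    ```python
    import math
    import random

    # Minimal 1-D ISSA-style sketch: one species with first-order decay in each
    # voxel plus diffusive jumps to neighbouring voxels (illustrative values only).
    n_voxels, k_deg, d_jump = 20, 0.1, 50.0   # decay rate, per-molecule jump rate
    x = [100] * n_voxels                      # molecule counts per voxel
    t, t_end = 0.0, 1.0

    while t < t_end:
        # Propensities: decay in each voxel, followed by a jump out of each voxel.
        a = [k_deg * xi for xi in x] + [d_jump * xi for xi in x]
        a0 = sum(a)
        if a0 == 0.0:
            break
        t += -math.log(random.random()) / a0  # exponential waiting time to next event
        r, acc, idx = random.random() * a0, 0.0, 0
        for idx, ai in enumerate(a):          # select the event proportional to its propensity
            acc += ai
            if acc >= r:
                break
        if idx < n_voxels:                    # reaction event: one molecule decays
            x[idx] -= 1
        else:                                 # diffusion event: hop to a random neighbour
            i = idx - n_voxels
            j = max(0, min(n_voxels - 1, i + random.choice((-1, 1))))
            x[i] -= 1
            x[j] += 1
    ```

    With d_jump much larger than k_deg, nearly every iteration of this loop is a diffusion event; replacing those events with a time-dependent propensity, as the abstract describes, lets the stepsize become the interval between successive reaction events.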

  5. Computer-Aided Design and 3-Dimensional Printing for Costal Cartilage Simulation of Airway Graft Carving.

    PubMed

    Ha, Jennifer F; Morrison, Robert J; Green, Glenn E; Zopf, David A

    2017-06-01

    Autologous cartilage grafting during open airway reconstruction is a complex skill instrumental to the success of the operation. Most trainees lack adequate opportunities to develop proficiency in this skill. We hypothesized that 3-dimensional (3D) printing and computer-aided design can be used to create a high-fidelity simulator for developing skills carving costal cartilage grafts for airway reconstruction. The rapid manufacturing and low cost of the simulator allow deployment in locations lacking expert instructors or cadaveric dissection, such as medical missions and Third World countries. In this blinded, prospective observational study, resident trainees completed a physical simulator exercise using a 3D-printed costal cartilage grafting tool. Participant assessment was performed using a Likert scale questionnaire, and airway grafts were assessed by a blinded expert surgeon. Most participants found this to be a very relevant training tool and highly rated the level of realism of the simulation tool.

  6. Computational Models of Protein Kinematics and Dynamics: Beyond Simulation

    PubMed Central

    Gipson, Bryant; Hsu, David; Kavraki, Lydia E.; Latombe, Jean-Claude

    2016-01-01

    Physics-based simulation represents a powerful method for investigating the time-varying behavior of dynamic protein systems at high spatial and temporal resolution. Such simulations, however, can be prohibitively difficult or lengthy for large proteins or when probing the lower-resolution, long-timescale behaviors of proteins generally. Importantly, not all questions about a protein system require full space and time resolution to produce an informative answer. For instance, by avoiding the simulation of uncorrelated, high-frequency atomic movements, a larger, domain-level picture of protein dynamics can be revealed. The purpose of this review is to highlight the growing body of complementary work that goes beyond simulation. In particular, this review focuses on methods that address kinematics and dynamics, as well as those that address larger organizational questions and can quickly yield useful information about the long-timescale behavior of a protein. PMID:22524225

  7. Study of SF6 gas decomposition products based on spectroscopy technology

    NASA Astrophysics Data System (ADS)

    Cai, Ji-xing; Na, Yan-xiang; Ni, Wei-yuan; Li, Guo-wei; Feng, Ke-cheng; Song, Gui-cai

    2011-08-01

    With the rapid development of the power industry, the amount of SF6 electrical equipment is increasing, and SF6 has gradually replaced traditional insulating oil as the insulating and arc-quenching medium in high-voltage electrical equipment. Pure SF6 gas has excellent insulating properties and arc-quenching characteristics; however, under the action of a strong arc, SF6 gas decomposes and generates toxic substances that corrode electrical equipment, thereby degrading its insulating and arc-quenching capability. Excessive levels of impurities in the gas seriously affect the mechanical properties, breaking performance and electrical performance of the equipment, which can cause serious consequences and even threaten the safe operation of the grid. This paper analyzes the basic properties of SF6 gas and its decomposition behavior under discharge conditions. To simulate actual high-voltage electrical equipment, a device was designed and built that reproduces the decomposition of SF6 gas under high-voltage discharge, and the resulting samples were analyzed by Fourier transform infrared spectroscopy. The results show that the main discharge decomposition product is SO2F2 (sulfuryl fluoride), which can react with water to generate corrosive H2SO4 (sulfuric acid) and HF (hydrogen fluoride); it was also found that SO2F2 concentration levels rise as the number of discharges increases. This substance can therefore be used as one of the main characteristic gases for diagnosing SF6 electrical equipment failure and for monitoring through its concentration level.

  8. Aerial release of bacteria from cot mattress materials and the sudden infant death syndrome.

    PubMed

    Sherburn, R E; Jenkins, R O

    2005-01-01

    To investigate aerial release of bacteria from used cot mattresses and to assess factors that may influence this process. Movement on used mattresses, simulating that of an infant's head, significantly enhanced aerial release of naturally acquired bacteria from the polyurethane foams (total count data, P = 0.008; Staphylococcus aureus, P = 0.004) or from polyvinyl chloride covers (total count data, P = 0.001). Aerial release of naturally acquired bacteria from used cot mattresses showed high variability and was poorly correlated (R2 ≤ 0.294) with bacterial cell density within the materials. In experiments involving inoculation of S. aureus and Escherichia coli onto the polyurethane of unused cot mattresses, aerial release of the species correlated well (R2 ≥ 0.950) with inoculation density when simulated infant head movement was applied. Aerial release of these bacterial species from the material decreased with increase in width or aqueous content of the material, and was lower from polyurethane foam of a used cot mattress. Simulated infant movement and mattress related factors influence aerial release of bacteria from cot mattress materials. With simulated infant movement on cot mattress polyurethane foam, levels of airborne bacteria above the material are proportional to bacterial population levels inoculated onto the material. Cot mattresses harbouring relatively high levels of naturally acquired toxigenic bacteria, such as S. aureus, could pose a relatively high risk of infection to the infant's respiratory tract through increased aerial contamination. This has impact in the context of recent findings on cot mattress related risk factors for sudden infant death syndrome.

  9. Probabilistic Downscaling of Remote Sensing Data with Applications for Multi-Scale Biogeochemical Flux Modeling.

    PubMed

    Stoy, Paul C; Quaife, Tristan

    2015-01-01

    Upscaling ecological information to larger scales in space and downscaling remote sensing observations or model simulations to finer scales remain grand challenges in Earth system science. Downscaling often involves inferring subgrid information from coarse-scale data, and such ill-posed problems are classically addressed using regularization. Here, we apply two-dimensional Tikhonov Regularization (2DTR) to simulate subgrid surface patterns for ecological applications. Specifically, we test the ability of 2DTR to simulate the spatial statistics of high-resolution (4 m) remote sensing observations of the normalized difference vegetation index (NDVI) in a tundra landscape. We find that the 2DTR approach as applied here can capture the major mode of spatial variability of the high-resolution information, but not multiple modes of spatial variability, and that the Lagrange multiplier (γ) used to impose the condition of smoothness across space is related to the range of the experimental semivariogram. We used observed and 2DTR-simulated maps of NDVI to estimate landscape-level leaf area index (LAI) and gross primary productivity (GPP). NDVI maps simulated using a γ value that approximates the range of observed NDVI result in a landscape-level GPP estimate that differs by ca 2% from those created using observed NDVI. Following findings that GPP per unit LAI is lower near vegetation patch edges, we simulated vegetation patch edges using multiple approaches and found that simulated GPP declined by up to 12% as a result. 2DTR can generate random landscapes rapidly and can be applied to disaggregate ecological information and to compare spatial observations against simulated landscapes.
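
    The downscaling step can be pictured as a linear inverse problem: a block-averaging operator maps the unknown fine grid to the coarse observations, and a Laplacian penalty weighted by the multiplier supplies the smoothness condition. The sketch below is a small, self-contained illustration under assumed grid sizes and weight, not the authors' implementation.

    ```python
    import numpy as np

    # Minimal 2-D Tikhonov downscaling sketch (illustrative only).
    # Coarse observations y on a 4x4 grid are disaggregated to a 16x16 field x by
    # solving min_x ||A x - y||^2 + gamma * ||L x||^2, where A block-averages the
    # fine grid and L is a discrete Laplacian enforcing spatial smoothness.
    nf, nc = 16, 4                 # fine and coarse grid sizes (assumed)
    block = nf // nc
    gamma = 0.1                    # smoothness weight, playing the role of the multiplier γ

    # Block-averaging operator A: (nc*nc) x (nf*nf)
    A = np.zeros((nc * nc, nf * nf))
    for ci in range(nc):
        for cj in range(nc):
            for fi in range(ci * block, (ci + 1) * block):
                for fj in range(cj * block, (cj + 1) * block):
                    A[ci * nc + cj, fi * nf + fj] = 1.0 / block**2

    # 2-D Laplacian L on the fine grid (4-neighbour stencil; boundary handling kept simple)
    L = np.zeros((nf * nf, nf * nf))
    for i in range(nf):
        for j in range(nf):
            k = i * nf + j
            L[k, k] = 4.0
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < nf and 0 <= jj < nf:
                    L[k, ii * nf + jj] = -1.0

    y = np.random.rand(nc * nc)    # stand-in for coarse NDVI observations
    x = np.linalg.solve(A.T @ A + gamma * L.T @ L, A.T @ y)
    fine_field = x.reshape(nf, nf)
    ```

    Larger gamma values smooth the simulated field more strongly, which is consistent with the abstract's observation that the multiplier controls how much spatial variability the simulated landscape retains.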

  10. Probabilistic Downscaling of Remote Sensing Data with Applications for Multi-Scale Biogeochemical Flux Modeling

    PubMed Central

    Stoy, Paul C.; Quaife, Tristan

    2015-01-01

    Upscaling ecological information to larger scales in space and downscaling remote sensing observations or model simulations to finer scales remain grand challenges in Earth system science. Downscaling often involves inferring subgrid information from coarse-scale data, and such ill-posed problems are classically addressed using regularization. Here, we apply two-dimensional Tikhonov Regularization (2DTR) to simulate subgrid surface patterns for ecological applications. Specifically, we test the ability of 2DTR to simulate the spatial statistics of high-resolution (4 m) remote sensing observations of the normalized difference vegetation index (NDVI) in a tundra landscape. We find that the 2DTR approach as applied here can capture the major mode of spatial variability of the high-resolution information, but not multiple modes of spatial variability, and that the Lagrange multiplier (γ) used to impose the condition of smoothness across space is related to the range of the experimental semivariogram. We used observed and 2DTR-simulated maps of NDVI to estimate landscape-level leaf area index (LAI) and gross primary productivity (GPP). NDVI maps simulated using a γ value that approximates the range of observed NDVI result in a landscape-level GPP estimate that differs by ca 2% from those created using observed NDVI. Following findings that GPP per unit LAI is lower near vegetation patch edges, we simulated vegetation patch edges using multiple approaches and found that simulated GPP declined by up to 12% as a result. 2DTR can generate random landscapes rapidly and can be applied to disaggregate ecological information and to compare spatial observations against simulated landscapes. PMID:26067835

  11. Comparisons of pilot performance in simulated and actual flight. [effects of ingested barbiturates

    NASA Technical Reports Server (NTRS)

    Billings, C. E.; Gerke, R. J.; Wick, R. L., Jr.

    1975-01-01

    Five highly experienced professional pilots performed instrument landing system approaches under simulated instrument flight conditions in a Cessna 172 airplane and in a Link-Singer GAT-1 simulator while under the influence of orally administered secobarbital (0, 100, and 200 mg). Tracking performance in two axes and airspeed control were evaluated continuously during each approach. Error and RMS variability were about half as large in the simulator as in the airplane. The observed data were more strongly associated with the drug level in the simulator than in the airplane. Further, the drug-related effects were more consistent in the simulator. Improvements in performance suggestive of learning effects were seen in the simulator, but not in actual flight.

  12. Modelling the Evolution of Social Structure

    PubMed Central

    Sutcliffe, A. G.; Dunbar, R. I. M.; Wang, D.

    2016-01-01

    Although simple social structures are more common in animal societies, some taxa (mainly mammals) have complex, multi-level social systems, in which the levels reflect differential association. We develop a simulation model to explore the conditions under which multi-level social systems of this kind evolve. Our model focuses on the evolutionary trade-offs between foraging and social interaction, and explores the impact of alternative strategies for distributing social interaction, with fitness criteria for wellbeing, alliance formation, risk, stress and access to food resources that reward social strategies differentially. The results suggest that multi-level social structures characterised by a few strong relationships, more medium ties and large numbers of weak ties emerge only in a small part of the overall fitness landscape, namely where there are significant fitness benefits from wellbeing and alliance formation and there are high levels of social interaction. In contrast, ‘favour-the-few’ strategies are more competitive under a wide range of fitness conditions, including those producing homogeneous, single-level societies of the kind found in many birds and mammals. The simulations suggest that the development of complex, multi-level social structures of the kind found in many primates (including humans) depends on a capacity for high investment in social time, preferential social interaction strategies, high mortality risk and/or differential reproduction. These conditions are characteristic of only a few mammalian taxa. PMID:27427758

  13. Robust High-Resolution Cloth Using Parallelism, History-Based Collisions and Accurate Friction

    PubMed Central

    Selle, Andrew; Su, Jonathan; Irving, Geoffrey; Fedkiw, Ronald

    2015-01-01

    In this paper we simulate high resolution cloth consisting of up to 2 million triangles which allows us to achieve highly detailed folds and wrinkles. Since the level of detail is also influenced by object collision and self collision, we propose a more accurate model for cloth-object friction. We also propose a robust history-based repulsion/collision framework where repulsions are treated accurately and efficiently on a per time step basis. Distributed memory parallelism is used for both time evolution and collisions and we specifically address Gauss-Seidel ordering of repulsion/collision response. This algorithm is demonstrated by several high-resolution and high-fidelity simulations. PMID:19147895

  14. QM/MM free energy simulations: recent progress and challenges

    PubMed Central

    Lu, Xiya; Fang, Dong; Ito, Shingo; Okamoto, Yuko; Ovchinnikov, Victor

    2016-01-01

    Due to the higher computational cost relative to pure molecular mechanical (MM) simulations, hybrid quantum mechanical/molecular mechanical (QM/MM) free energy simulations particularly require a careful balancing of computational cost and accuracy. Here we review several recent developments in free energy methods most relevant to QM/MM simulations and discuss several topics motivated by these developments using simple but informative examples that involve processes in water. For chemical reactions, we highlight the value of invoking enhanced sampling techniques (e.g., replica-exchange) in umbrella sampling calculations and the value of including collective environmental variables (e.g., hydration level) in metadynamics simulations; we also illustrate the sensitivity of string calculations, especially free energy along the path, to various parameters in the computation. Alchemical free energy simulations with a specific thermodynamic cycle are used to probe the effect of including the first solvation shell into the QM region when computing solvation free energies. For cases where high-level QM/MM potential functions are needed, we analyze two different approaches: the QM/MM-MFEP method of Yang and co-workers and perturbative correction to low-level QM/MM free energy results. For the examples analyzed here, both approaches seem productive although care needs to be exercised when analyzing the perturbative corrections. PMID:27563170
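
    One standard way to realize the "perturbative correction to low-level QM/MM free energy results" mentioned above is a Zwanzig-type free energy perturbation from the low-level to the high-level potential, evaluated over configurations sampled at the low level. The relation below is the textbook form, shown for orientation rather than as the exact estimator used in the reviewed work.

    ```latex
    \Delta A_{\text{low}\rightarrow\text{high}}
      = -k_{\mathrm{B}}T \,\ln \left\langle
          \exp\!\left[-\beta\left(U_{\text{high}} - U_{\text{low}}\right)\right]
        \right\rangle_{\text{low}},
    \qquad \beta = \frac{1}{k_{\mathrm{B}}T}
    ```

    The caution noted in the abstract arises because this average converges poorly when the low-level and high-level potential energy surfaces overlap weakly.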

  15. Quantifying Turbulent Kinetic Energy in an Aortic Coarctation with Large Eddy Simulation and Magnetic Resonance Imaging

    NASA Astrophysics Data System (ADS)

    Lantz, Jonas; Ebbers, Tino; Karlsson, Matts

    2012-11-01

    In this study, turbulent kinetic energy (TKE) in an aortic coarctation was studied using both a numerical technique (large eddy simulation, LES) and in vivo measurements using magnetic resonance imaging (MRI). High levels of TKE are undesirable, as kinetic energy is extracted from the mean flow to feed the turbulent fluctuations. The patient underwent surgery to widen the coarctation, and the flow before and after surgery was computed and compared to MRI measurements. The resolution of the MRI was about 7 × 7 voxels in the axial cross-section, while 50 × 50 mesh cells with increased resolution near the walls were used in the LES simulation. In general, the numerical simulations and MRI measurements showed that the aortic arch had no or very low levels of TKE, while elevated values were found downstream of the coarctation. It was also found that TKE levels after surgery were lowered, indicating that the diameter of the constriction was increased enough to decrease turbulence effects. In conclusion, both the numerical simulation and MRI measurements gave very similar results, thereby validating the simulations and suggesting that MRI-measured TKE can be used as an initial estimation in clinical practice, while LES results can be used for detailed quantification and further research of aortic flows.
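
    For reference, the turbulent kinetic energy compared between the LES and the MRI measurements is the kinetic energy of the velocity fluctuations about the mean (phase-averaged) flow; the standard definition is shown below, noting that the exact estimator used in the study may differ in how the mean is formed.

    ```latex
    k \;=\; \tfrac{1}{2}\,\overline{u_i' u_i'}
      \;=\; \tfrac{1}{2}\left(\overline{u'^{2}} + \overline{v'^{2}} + \overline{w'^{2}}\right),
    \qquad u_i' = u_i - \overline{u_i}
    ```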

  16. Multiresolution modeling with a JMASS-JWARS HLA Federation

    NASA Astrophysics Data System (ADS)

    Prince, John D.; Painter, Ron D.; Pendell, Brian; Richert, Walt; Wolcott, Christopher

    2002-07-01

    CACI, Inc.-Federal has built, tested, and demonstrated the use of a JMASS-JWARS HLA Federation that supports multi-resolution modeling of a weapon system and its subsystems in a JMASS engineering and engagement model environment, while providing a realistic JWARS theater campaign-level synthetic battle space and operational context to assess the weapon system's value added and deployment/employment supportability in a multi-day, combined force-on-force scenario. Traditionally, acquisition analyses require a hierarchical suite of simulation models to address engineering, engagement, mission and theater/campaign measures of performance, measures of effectiveness and measures of merit. Configuring and running this suite of simulations and transferring the appropriate data between each model is both time consuming and error prone. The ideal solution would be a single simulation with the requisite resolution and fidelity to perform all four levels of acquisition analysis. However, current computer hardware technologies cannot deliver the runtime performance necessary to support the resulting extremely large simulation. One viable alternative is to integrate the current hierarchical suite of simulation models using the DoD's High Level Architecture in order to support multi-resolution modeling. An HLA integration eliminates the extremely large model problem, provides a well-defined and manageable mixed resolution simulation and minimizes VV&A issues.

  17. RAINIER: A simulation tool for distributions of excited nuclear states and cascade fluctuations

    DOE PAGES

    Kirsch, L. E.; Bernstein, L. A.

    2018-03-04

    In this paper, a new code has been developed named RAINIER that simulates the γ-ray decay of discrete and quasi-continuum nuclear levels for a user-specified range of energy, angular momentum, and parity including a realistic treatment of level spacing and transition width fluctuations. A similar program, DICEBOX, uses the Monte Carlo method to simulate level and width fluctuations but is restricted in its initial level population algorithm. On the other hand, modern reaction codes such as TALYS and EMPIRE populate a wide range of states in the residual nucleus prior to γ-ray decay, but do not go beyond the use of deterministic functions and therefore neglect cascade fluctuations. This combination of capabilities allows RAINIER to be used to determine quasi-continuum properties through comparison with experimental data. Finally, several examples are given that demonstrate how cascade fluctuations influence experimental high-resolution γ-ray spectra from reactions that populate a wide range of initial states.
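
    The "realistic treatment of level spacing and transition width fluctuations" conventionally means nearest-neighbour spacings drawn from the Wigner surmise and partial widths drawn from a Porter-Thomas (chi-squared with one degree of freedom) distribution. The sketch below samples those two standard distributions for illustration only; it is not RAINIER code, and the numerical values are assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_levels, mean_spacing = 1000, 1.0   # assumed illustrative values (e.g. keV)

    # Wigner surmise P(s) = (pi*s/2) exp(-pi*s^2/4) for spacings in units of the
    # mean spacing; sampled by inverting the CDF 1 - exp(-pi*s^2/4).
    u = rng.random(n_levels)
    spacings = mean_spacing * np.sqrt(-4.0 * np.log(1.0 - u) / np.pi)
    level_energies = np.cumsum(spacings)

    # Porter-Thomas width fluctuations: partial widths follow a chi-squared
    # distribution with one degree of freedom about the mean width.
    mean_width = 1.0
    widths = mean_width * rng.chisquare(df=1, size=n_levels)

    print(level_energies[:5], widths[:5])
    ```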

  18. RAINIER: A simulation tool for distributions of excited nuclear states and cascade fluctuations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirsch, L. E.; Bernstein, L. A.

    In this paper, a new code has been developed named RAINIER that simulates the γ-ray decay of discrete and quasi-continuum nuclear levels for a user-specified range of energy, angular momentum, and parity including a realistic treatment of level spacing and transition width fluctuations. A similar program, DICEBOX, uses the Monte Carlo method to simulate level and width fluctuations but is restricted in its initial level population algorithm. On the other hand, modern reaction codes such as TALYS and EMPIRE populate a wide range of states in the residual nucleus prior to γ-ray decay, but do not go beyond the use of deterministic functions and therefore neglect cascade fluctuations. This combination of capabilities allows RAINIER to be used to determine quasi-continuum properties through comparison with experimental data. Finally, several examples are given that demonstrate how cascade fluctuations influence experimental high-resolution γ-ray spectra from reactions that populate a wide range of initial states.

  19. Taking the measure of a landscape: Comparing a simulated and natural landscape in the Virginia Coastal Plain

    NASA Astrophysics Data System (ADS)

    Howard, Alan D.; Tierney, Heather E.

    2012-01-01

    A landform evolution model is used to investigate the historical evolution of a fluvial landscape along the Potomac River in Virginia, USA. The landscape has developed on three terraces whose ages span 3.5 Ma. The simulation model specifies the temporal evolution of base level control by the river as having a high-frequency component of the response of the Potomac River to sea level fluctuations superimposed on a long-term epeirogenic uplift. The wave-cut benches are assumed to form instantaneously during sea level highstands. The region is underlain by relatively soft coastal plain sediments with high intrinsic erodibility. The survival of portions of these terrace surfaces, up to 3.5 Ma, is attributable to a protective cover of vegetation. The vegetation influence is parameterized as a critical shear stress to fluvial erosion whose magnitude decreases with increasing contributing area. The simulation model replicates the general pattern of dissection of the natural landscape, with decreasing degrees of dissection of the younger terrace surfaces. Channel incision and relief increase in headwater areas are most pronounced during the relatively brief periods of river lowstands. Imposition of the wave-cut terraces onto the simulated landscape triggers a strong incisional response. By qualitative and quantitative measures the model replicates, in a general way, the landform evolution and present morphology of the target region.

  20. Final report on cermet high-level waste forms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kobisk, E.H.; Quinby, T.C.; Aaron, W.S.

    1981-08-01

    Cermets are being developed as an alternate method for the fixation of defense and commercial high level radioactive waste in a terminal disposal form. Following initial feasibility assessments of this waste form, consisting of ceramic particles dispersed in an iron-nickel base alloy, significantly improved processing methods were developed. The characterization of cermets has continued through property determinations on samples prepared by various methods from a variety of simulated and actual high-level wastes. This report describes the status of development of the cermet waste form as it has evolved since 1977. 6 tables, 18 figures.

  1. Anatomy assisted PET image reconstruction incorporating multi-resolution joint entropy

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Rahmim, Arman

    2015-01-01

    A promising approach in PET image reconstruction is to incorporate high resolution anatomical information (measured from MR or CT) taking the anato-functional similarity measures such as mutual information or joint entropy (JE) as the prior. These similarity measures only classify voxels based on intensity values, while neglecting structural spatial information. In this work, we developed an anatomy-assisted maximum a posteriori (MAP) reconstruction algorithm wherein the JE measure is supplied by spatial information generated using wavelet multi-resolution analysis. The proposed wavelet-based JE (WJE) MAP algorithm involves calculation of derivatives of the subband JE measures with respect to individual PET image voxel intensities, which we have shown can be computed very similarly to how the inverse wavelet transform is implemented. We performed a simulation study with the BrainWeb phantom creating PET data corresponding to different noise levels. Realistically simulated T1-weighted MR images provided by BrainWeb modeling were applied in the anatomy-assisted reconstruction with the WJE-MAP algorithm and the intensity-only JE-MAP algorithm. Quantitative analysis showed that the WJE-MAP algorithm performed similarly to the JE-MAP algorithm at low noise level in the gray matter (GM) and white matter (WM) regions in terms of noise versus bias tradeoff. When noise increased to medium level in the simulated data, the WJE-MAP algorithm started to surpass the JE-MAP algorithm in the GM region, which is less uniform with smaller isolated structures compared to the WM region. In the high noise level simulation, the WJE-MAP algorithm presented clear improvement over the JE-MAP algorithm in both the GM and WM regions. In addition to the simulation study, we applied the reconstruction algorithms to real patient studies involving DPA-173 PET data and Florbetapir PET data with corresponding T1-MPRAGE MRI images. Compared to the intensity-only JE-MAP algorithm, the WJE-MAP algorithm resulted in comparable regional mean values to those from the maximum likelihood algorithm while reducing noise. Achieving robust performance in various noise-level simulation and patient studies, the WJE-MAP algorithm demonstrates its potential in clinical quantitative PET imaging.
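
    For orientation, the "maximum likelihood algorithm" the WJE-MAP results are compared against is usually the ML-EM iteration sketched below; the anatomical wavelet-JE prior itself is not reproduced here, and the toy system matrix and counts are assumptions for illustration.

    ```python
    import numpy as np

    # Baseline ML-EM update for emission tomography (no anatomical prior).
    rng = np.random.default_rng(1)
    n_pix, n_bins = 64, 128
    A = rng.random((n_bins, n_pix))            # toy system (projection) matrix
    x_true = rng.random(n_pix)
    y = rng.poisson(A @ x_true * 50)           # noisy projection data

    x = np.ones(n_pix)                         # uniform initial image
    sens = A.T @ np.ones(n_bins)               # sensitivity image A^T 1
    for _ in range(50):
        ratio = y / np.maximum(A @ x, 1e-12)   # measured / expected counts
        x *= (A.T @ ratio) / sens              # multiplicative EM update
    ```

    A MAP variant such as the one described above modifies this update with the derivative of the prior, which in the authors' case is the wavelet-subband joint entropy against the MR image.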

  2. Flow and heat transfer experiments in the turbine airfoil/endwall region

    NASA Astrophysics Data System (ADS)

    Chung, Jin Taek

    An experimental investigation of the three-dimensional flow and heat transfer near the junction between the endwall and suction wall of a gas turbine was performed. A large-scale, two-half-blade facility which simulates a turbine cascade was introduced. The simulator consists of two large half-blade sections, one wall simulating the pressure surface and the other wall simulating the suction surface. The advantage of this configuration is that the features of the secondary flow are large, because of the relatively large test section, and the flow is easily accessible with probes. Qualification of this simulator was by comparison to a multi-blade cascade flow. Various flow visualization techniques--oil and lampblack, ink and oil of wintergreen, a single tuft probe, and a tuft grid--were employed to confirm that the important features of the cascade flow were replicated in this simulator. The triangular region on the suction surface, which was affected by the passage vortex, and the endwall secondary crossflow were observed by shear stress visualization and the liquid crystal measurement techniques. In order to investigate the effects of the turbulence level on the secondary flow in a turbine passage, a turbulence generator, designed to reproduce the characteristics of a combustor exit flow, was built. The generator was designed not only to generate a high turbulence level but to produce three main features of a combustor exit flow. The generator produced a turbulence intensity level of about 10 percent and an integral length scale of 5 centimeters. It was observed that the endwall secondary flow, including the passage vortex, is not significantly influenced by freestream turbulence levels up to 10 percent. A flow management technique using a boundary layer fence designed to reduce some harmful effects of secondary flow in the endwall region of a turbine passage was introduced. The boundary layer fence is effective in changing the passage of the vortex and reducing the influence of the vortex near the suction wall. The fence was even more effective in reducing secondary flows for high levels of freestream turbulence (approximately 10 percent).

  3. Agent Based Modeling of Atherosclerosis: A Concrete Help in Personalized Treatments

    NASA Astrophysics Data System (ADS)

    Pappalardo, Francesco; Cincotti, Alessandro; Motta, Alfredo; Pennisi, Marzio

    Atherosclerosis, a pathology affecting arterial blood vessels, is one of the most common diseases of the developed countries. We present studies of increased atherosclerosis risk using an agent-based model of atherogenesis that has been previously validated against clinical data. It is well known that the major risk factor in atherosclerosis is a persistently high level of low density lipoprotein (LDL) concentration. However, it is not known whether short periods of high LDL concentration can cause irreversible damage, or whether reduction of the LDL concentration (by lifestyle changes or drugs) can drastically or partially reduce the already acquired risk. We simulated four different clinical situations in a large set of virtual patients (200 per clinical scenario). In the first, the patients' lifestyle maintains the LDL concentration in a no-risk range; this is the control simulation. The second case represents patients with high LDL levels and a delay before appropriate treatment is applied. The third scenario is characterized by patients with high LDL levels treated with specific drugs such as statins. Finally, we simulated patients subject to several oxidative events (smoking, a sedentary lifestyle, consumption of alcoholic drinks, and so forth) that effectively increase the risk of LDL oxidation. These preliminary results obviously need to be clinically investigated. It is clear, however, that SimAthero has the power to concretely help medical doctors and clinicians in choosing personalized treatments for the prevention of atherosclerosis damage.

  4. Summary of efficiency testing of standard and high-capacity high-efficiency particulate air filters subjected to simulated tornado depressurization and explosive shock waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, P.R.; Gregory, W.S.

    1985-04-01

    Pressure transients in nuclear facility air cleaning systems can originate from natural phenomena such as tornadoes or from accident-induced explosive blast waves. This study was concerned with the effective efficiency of high-efficiency particulate air (HEPA) filters during pressure surges resulting from simulated tornado and explosion transients. The primary objective of the study was to examine filter efficiencies at pressure levels below the point of structural failure. Both standard and high-capacity 0.61-m by 0.61-m HEPA filters were evaluated, as were several 0.2-m by 0.2-m HEPA filters. For a particular manufacturer, the material release when subjected to tornado transients is the same (per unit area) for both the 0.2-m by 0.2-m and the 0.61-m by 0.61-m filters. For tornado transients, the material release was on the order of micrograms per square meter. When subjecting clean HEPA filters to simulated tornado transients with aerosol entrained in the pressure pulse, all filters tested showed a degradation of filter efficiency. For explosive transients, the material release from preloaded high-capacity filters was as much as 340 g. When preloaded high-capacity filters were subjected to shock waves approximately 50% of the structural limit level, 1 to 2 mg of particulate was released.

  5. Mission Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Pisaich, Gregory; Flueckiger, Lorenzo; Neukom, Christian; Wagner, Mike; Buchanan, Eric; Plice, Laura

    2007-01-01

    The Mission Simulation Toolkit (MST) is a flexible software system for autonomy research. It was developed as part of the Mission Simulation Facility (MSF) project that was started in 2001 to facilitate the development of autonomous planetary robotic missions. Autonomy is a key enabling factor for robotic exploration. There has been a large gap between autonomy software (at the research level), and software that is ready for insertion into near-term space missions. The MST bridges this gap by providing a simulation framework and a suite of tools for supporting research and maturation of autonomy. MST uses a distributed framework based on the High Level Architecture (HLA) standard. A key feature of the MST framework is the ability to plug in new models to replace existing ones with the same services. This enables significant simulation flexibility, particularly the mixing and control of fidelity level. In addition, the MST provides automatic code generation from robot interfaces defined with the Unified Modeling Language (UML), methods for maintaining synchronization across distributed simulation systems, XML-based robot description, and an environment server. Finally, the MSF supports a number of third-party products including dynamic models and terrain databases. Although the communication objects and some of the simulation components that are provided with this toolkit are specifically designed for terrestrial surface rovers, the MST can be applied to any other domain, such as aerial, aquatic, or space.

  6. Uranus: a rapid prototyping tool for FPGA embedded computer vision

    NASA Astrophysics Data System (ADS)

    Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.

    2007-01-01

    The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms, with support for migrating them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded operators in software and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft-operators in a high-level processing chain and code his own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected to a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.

  7. Childbearing impeded education more than education impeded childbearing among Norwegian women.

    PubMed

    Cohen, Joel E; Kravdal, Øystein; Keilman, Nico

    2011-07-19

    In most societies, women at age 39 with higher levels of education have fewer children. To understand this association, we investigated the effects of childbearing on educational attainment and the effects of education on fertility in the 1964 birth cohort of Norwegian women. Using detailed annual data from ages 17 to 39, we estimated the probabilities of an additional birth, a change in educational level, and enrollment in the coming year, conditional on fertility history, educational level, and enrollment history at the beginning of each year. A simple model reproduced a declining gradient of children ever born with increasing educational level at age 39. When a counterfactual simulation assumed no effects of childbearing on educational progression or enrollment (without changing the estimated effects of education on childbearing), the simulated number of children ever born decreased very little with increasing completed educational level, contrary to data. However, when another counterfactual simulation assumed no effects of current educational level and enrollment on childbearing (without changing the estimated effects of childbearing on education), the simulated number of children ever born decreased with increasing completed educational level nearly as much as the decrease in the data. In summary, in these Norwegian data, childbearing impeded education much more than education impeded childbearing. These results suggest that women with advanced degrees have lower completed fertility on the average principally because women who have one or more children early are more likely to leave or not enter long educational tracks and never attain a high educational level.
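
    The counterfactual logic described above can be sketched as an annual micro-simulation in which estimated transition probabilities are applied year by year and one direction of influence is switched off. The sketch below is purely schematic: the probability functions and their dependence on state are placeholders, not the estimated Norwegian models.

    ```python
    import random

    # Schematic annual micro-simulation, ages 17-39 (placeholder probabilities only).
    def p_enroll(children, cut_childbearing_effect):
        base = 0.5
        if not cut_childbearing_effect:
            base *= 0.6 ** children          # assumed: children lower enrolment odds
        return base

    def p_birth(enrolled, cut_education_effect):
        base = 0.10
        if not cut_education_effect and enrolled:
            base *= 0.5                      # assumed: enrolment lowers birth risk
        return base

    def simulate_woman(cut_childbearing_effect=False, cut_education_effect=False):
        children, education = 0, 1
        for age in range(17, 40):
            enrolled = random.random() < p_enroll(children, cut_childbearing_effect)
            if enrolled:
                education += 1               # one level gained per enrolled year
            if random.random() < p_birth(enrolled, cut_education_effect):
                children += 1
        return children, education

    # Counterfactual: remove the effect of childbearing on education, keep the rest.
    cohort = [simulate_woman(cut_childbearing_effect=True) for _ in range(10000)]
    ```

    Comparing the education-fertility gradient in such counterfactual cohorts against the baseline cohort is the structure of the comparison reported in the abstract.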

  8. Childbearing impeded education more than education impeded childbearing among Norwegian women

    PubMed Central

    Cohen, Joel E.; Kravdal, Øystein; Keilman, Nico

    2011-01-01

    In most societies, women at age 39 with higher levels of education have fewer children. To understand this association, we investigated the effects of childbearing on educational attainment and the effects of education on fertility in the 1964 birth cohort of Norwegian women. Using detailed annual data from ages 17 to 39, we estimated the probabilities of an additional birth, a change in educational level, and enrollment in the coming year, conditional on fertility history, educational level, and enrollment history at the beginning of each year. A simple model reproduced a declining gradient of children ever born with increasing educational level at age 39. When a counterfactual simulation assumed no effects of childbearing on educational progression or enrollment (without changing the estimated effects of education on childbearing), the simulated number of children ever born decreased very little with increasing completed educational level, contrary to data. However, when another counterfactual simulation assumed no effects of current educational level and enrollment on childbearing (without changing the estimated effects of childbearing on education), the simulated number of children ever born decreased with increasing completed educational level nearly as much as the decrease in the data. In summary, in these Norwegian data, childbearing impeded education much more than education impeded childbearing. These results suggest that women with advanced degrees have lower completed fertility on the average principally because women who have one or more children early are more likely to leave or not enter long educational tracks and never attain a high educational level. PMID:21730138

  9. A Taxonomy of Delivery and Documentation Deviations During Delivery of High-Fidelity Simulations.

    PubMed

    McIvor, William R; Banerjee, Arna; Boulet, John R; Bekhuis, Tanja; Tseytlin, Eugene; Torsher, Laurence; DeMaria, Samuel; Rask, John P; Shotwell, Matthew S; Burden, Amanda; Cooper, Jeffrey B; Gaba, David M; Levine, Adam; Park, Christine; Sinz, Elizabeth; Steadman, Randolph H; Weinger, Matthew B

    2017-02-01

    We developed a taxonomy of simulation delivery and documentation deviations noted during a multicenter, high-fidelity simulation trial that was conducted to assess practicing physicians' performance. Eight simulation centers sought to implement standardized scenarios over 2 years. Rules, guidelines, and detailed scenario scripts were established to facilitate reproducible scenario delivery; however, pilot trials revealed deviations from those rubrics. A taxonomy with hierarchically arranged terms that define a lack of standardization of simulation scenario delivery was then created to aid educators and researchers in assessing and describing their ability to reproducibly conduct simulations. Thirty-six types of delivery or documentation deviations were identified from the scenario scripts and study rules. Using a Delphi technique and open card sorting, simulation experts formulated a taxonomy of high-fidelity simulation execution and documentation deviations. The taxonomy was iteratively refined and then tested by 2 investigators not involved with its development. The taxonomy has 2 main classes, simulation center deviation and participant deviation, which are further subdivided into as many as 6 subclasses. Inter-rater classification agreement using the taxonomy was 74% or greater for each of the 7 levels of its hierarchy. Cohen kappa calculations confirmed substantial agreement beyond that expected by chance. All deviations were classified within the taxonomy. This is a useful taxonomy that standardizes terms for simulation delivery and documentation deviations, facilitates quality assurance in scenario delivery, and enables quantification of the impact of deviations upon simulation-based performance assessment.
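
    The chance-corrected agreement reported above is Cohen's kappa, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance. A minimal computation with illustrative counts (not the study's data):

    ```python
    import numpy as np

    # Cohen's kappa for two raters classifying the same items into two categories.
    confusion = np.array([[20, 5],    # rater A class 0 vs rater B classes 0, 1
                          [ 3, 22]])  # rater A class 1 vs rater B classes 0, 1
    n = confusion.sum()
    p_o = np.trace(confusion) / n                              # observed agreement
    p_e = (confusion.sum(0) * confusion.sum(1)).sum() / n**2   # chance agreement
    kappa = (p_o - p_e) / (1 - p_e)
    ```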

  10. Efficiently Scheduling Multi-core Guest Virtual Machines on Multi-core Hosts in Network Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoginath, Srikanth B; Perumalla, Kalyan S

    2011-01-01

    Virtual machine (VM)-based simulation is a method used by network simulators to incorporate realistic application behaviors by executing actual VMs as high-fidelity surrogates for simulated end-hosts. A critical requirement in such a method is the simulation time-ordered scheduling and execution of the VMs. Prior approaches such as time dilation are less efficient due to the high degree of multiplexing possible when multiple multi-core VMs are simulated on multi-core host systems. We present a new simulation time-ordered scheduler to efficiently schedule multi-core VMs on multi-core real hosts, with a virtual clock realized on each virtual core. The distinguishing features of our approach are: (1) customizable granularity of the VM scheduling time unit on the simulation time axis, (2) ability to take arbitrary leaps in virtual time by VMs to maximize the utilization of host (real) cores when guest virtual cores idle, and (3) empirically determinable optimality in the tradeoff between total execution (real) time and time-ordering accuracy levels. Experiments show that it is possible to get nearly perfect time-ordered execution, with a slight cost in total run time, relative to optimized non-simulation VM schedulers. Interestingly, with our time-ordered scheduler, it is also possible to reduce the time-ordering error from over 50% with a non-simulation scheduler to less than 1% with our scheduler, with almost the same run time efficiency as that of the highly efficient non-simulation VM schedulers.
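
    The scheduling policy described above (a virtual clock per virtual core, advanced in a configurable simulation-time quantum, with idle cores allowed to leap forward in virtual time) can be sketched with a priority queue ordered by virtual time. This is a schematic illustration of the policy only, not the authors' hypervisor-level scheduler; `quantum` and the workload hook are assumptions.

    ```python
    import heapq

    # Schematic simulation-time-ordered scheduler: always run the virtual core
    # that is furthest behind in virtual time, one quantum at a time.
    quantum = 1.0                      # scheduling time unit on the simulation axis (assumed)
    n_vcores = 8
    ready = [(0.0, core) for core in range(n_vcores)]   # (virtual clock, vcore id)
    heapq.heapify(ready)

    def run_one_quantum(core, until):
        """Placeholder for executing a guest vcore up to virtual time 'until'.
        Returns the extra virtual time the core may leap ahead if it idles."""
        return 0.0                     # 0.0 means the core was busy for the whole quantum

    t_end, trace = 100.0, []
    while ready and ready[0][0] < t_end:
        vclock, core = heapq.heappop(ready)          # least-advanced vcore first
        leap = run_one_quantum(core, vclock + quantum)
        trace.append((vclock, core))
        heapq.heappush(ready, (vclock + quantum + leap, core))
    ```

    Shrinking the quantum tightens time-ordering accuracy at the cost of more scheduling overhead, which is the tradeoff the abstract describes as empirically determinable.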

  11. Exhaust emission survey of an F100 afterburning turbofan engine at simulated altitude flight conditions

    NASA Technical Reports Server (NTRS)

    Moss, J. E.; Cullom, R. R.

    1981-01-01

    Emissions of carbon monoxide, total oxides of nitrogen, unburned hydrocarbons, and carbon dioxide from an F100, afterburning, two spool turbofan engine at simulated flight conditions are reported. For each flight condition emission measurements were made for two or three power levels from intermediate power (nonafterburning) through maximum afterburning. The data showed that emissions vary with flight speed, altitude, power level, and radial position across the nozzle. Carbon monoxide emissions were low for intermediate power (nonafterburning) and partial afterburning, but regions of high carbon monoxide were present downstream of the flame holder at maximum afterburning. Unburned hydrocarbon emissions were low for most of the simulated flight conditions. The local NOX concentrations and their variability with power level increased with increasing flight Mach number at constant altitude, and decreased with increasing altitude at constant Mach number. Carbon dioxide emissions were proportional to local fuel air ratio for all conditions.

  12. Verifying and Validating Simulation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2015-02-23

    This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that "validation" is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the "credibility" of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
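
    The statistical sampling referred to above can be illustrated with a minimal Monte Carlo propagation of input variability through a model; the model and distributions below are hypothetical.

    ```python
    import numpy as np

    # Minimal Monte Carlo propagation of input variability (hypothetical model).
    rng = np.random.default_rng(42)
    n_samples = 10000
    stiffness = rng.normal(1.0e5, 5.0e3, n_samples)   # aleatoric variability in an input
    load = rng.uniform(900.0, 1100.0, n_samples)

    deflection = load / stiffness                     # stand-in for the simulation model
    print(deflection.mean(), deflection.std(), np.percentile(deflection, 95))
    ```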

  13. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Schutte, Paul C.; Trujillo, Anna; Pritchett, Amy R.

    2000-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plug-in' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
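
    The component life cycle described above (register a component, initialize it against the framework, and exchange data through a standard interface) can be sketched as a minimal plug-in registry. This is a schematic Python analogue, not the RFS code base; the class and method names are assumptions.

    ```python
    from abc import ABC, abstractmethod

    class SimComponent(ABC):
        """Assumed minimal interface standard for a plug-in simulation component."""
        @abstractmethod
        def initialize(self, bus): ...
        @abstractmethod
        def step(self, dt): ...

    class Bus:
        """Toy publish/read channel standing in for inter-component communication."""
        def __init__(self):
            self.values = {}
        def publish(self, key, value):
            self.values[key] = value
        def read(self, key, default=None):
            return self.values.get(key, default)

    class Simulator:
        def __init__(self):
            self.bus, self.components = Bus(), []
        def register(self, component: SimComponent):
            self.components.append(component)      # registration
            component.initialize(self.bus)         # initialization against the framework
        def run(self, dt, steps):
            for _ in range(steps):
                for c in self.components:          # components can be swapped freely
                    c.step(dt)

    class ConstantWind(SimComponent):              # example plug-in
        def initialize(self, bus): self.bus = bus
        def step(self, dt): self.bus.publish("wind", 5.0)

    sim = Simulator()
    sim.register(ConstantWind())
    sim.run(dt=0.01, steps=10)
    ```

    Because components only meet at the shared interface and the communication channel, one model can replace another of the same service without touching the rest of the simulation, which is the flexibility the abstract emphasizes.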

  14. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    NASA Technical Reports Server (NTRS)

    Pritchett, Amy R.

    2002-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).

  15. Comparative evaluation of twenty pilot workload assessment measures using a psychomotor task in a moving base aircraft simulator

    NASA Technical Reports Server (NTRS)

    Connor, S. A.; Wierwille, W. W.

    1983-01-01

    A comparison of the sensitivity and intrusion of twenty pilot workload assessment techniques was conducted using a psychomotor loading task in a three-degree-of-freedom moving-base aircraft simulator. The twenty techniques included opinion measures, spare mental capacity measures, physiological measures, eye behavior measures, and primary task performance measures. The primary task was an instrument landing system (ILS) approach and landing. All measures were recorded between the outer marker and the middle marker on the approach. Three levels (low, medium, and high) of psychomotor load were obtained by the combined manipulation of wind-gust disturbance level and simulated aircraft pitch stability. Six instrument-rated pilots participated in four sessions lasting approximately three hours each.

  16. Use of simulation-based learning in undergraduate nurse education: An umbrella systematic review.

    PubMed

    Cant, Robyn P; Cooper, Simon J

    2017-02-01

    To conduct a systematic review to appraise and review evidence on the impact of simulation-based education for undergraduate/pre-licensure nursing students, using existing reviews of literature. An umbrella review (review of reviews). Cumulative Index of Nursing and Allied Health Literature (CINAHL Plus), PubMed, and Google Scholar. Reviews of literature conducted between 2010 and 2015 regarding simulation-based education for pre-licensure nursing students. The Joanna Briggs Institute methodology for the conduct of an umbrella review was used to inform the review process. Twenty-five systematic reviews of literature were included, of which 14 were recent (2013-2015). Most described the level of evidence of component studies as a mix of experimental and quasi-experimental designs. The reviews measured around 14 different main outcome variables, thus limiting the number of primary studies that each individual review could pool to appraise. Many reviews agreed on the key learning outcome of knowledge acquisition, although no overall quantitative effect was derived. Three of four high-quality reviews found that simulation supported psychomotor development; the fourth found too few high-quality studies to make a statistical comparison. Simulation statistically improved self-efficacy in pretest-posttest studies, and in experimental designs self-efficacy was superior to that of other teaching methods, although lower-level research designs limited further comparison. The reviews commonly reported strong student satisfaction with simulation education, and some reported improved confidence and/or critical thinking. This umbrella review took a global view of 25 reviews of simulation research in nursing education, comprising over 700 primary studies. To discern overall outcomes across reviews, statistical comparison of quantitative results (effect size) must be the key comparator. Simulation-based education contributes to students' learning in a number of ways when integrated into pre-licensure nursing curricula. Overall, the use of a constellation of instruments and a lack of high-quality study designs mean that there are still some gaps in the evidence of effects that need to be addressed.

  17. Characterization of the faulted behavior of digital computers and fault tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, Salvatore J.; Miner, Paul S.

    1989-01-01

    A development status evaluation is presented for efforts conducted at NASA-Langley since 1977, toward the characterization of the latent fault in digital fault-tolerant systems. Attention is given to the practical, high speed, generalized gate-level logic system simulator developed, as well as to the validation methodology used for the simulator, on the basis of faultable software and hardware simulations employing a prototype MIL-STD-1750A processor. After validation, latency tests will be performed.

  18. Analysis of a four lamp flash system for calibrating multi-junction solar cells under concentrated light

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schachtner, Michael, E-mail: michael.schachtner@ise.fraunhofer.de; Prado, Marcelo Loyo; Reichmuth, S. Kasimir

    2015-09-28

    It has been known for a long time that the precise characterization of multi-junction solar cells demands spectrally tunable solar simulators. The calibration of innovative multi-junction solar cells for CPV applications now requires tunable solar simulators which provide high irradiation levels. This paper describes the commissioning and calibration of a flash-based four-lamp simulator to be used for the measurement of multi-junction solar cells with up to four subcells under concentrated light.

  19. The impact of constructive feedback on training in gastrointestinal endoscopy using high-fidelity Virtual-Reality simulation: a randomised controlled trial.

    PubMed

    Kruglikova, Irina; Grantcharov, Teodor P; Drewes, Asbjorn M; Funch-Jensen, Peter

    2010-02-01

    Recently, virtual reality computer simulators have been used to enhance traditional endoscopy teaching. Previous studies have demonstrated construct validity of these systems and transfer of virtual skills to the operating room. However, to date no simulator-training curricula have been designed and there is very little evidence on the impact of external feedback on the acquisition of endoscopic skills. The aim of the present study was to assess the impact of external feedback on the learning curves of inexperienced trainees on a VR colonoscopy simulator. Twenty-two trainees without colonoscopy experience were randomised to a group that received structured feedback from an experienced supervisor or to a control group. All participants performed 15 repetitions of task 3 from the Introduction colonoscopy module of the AccuTouch Endoscopy simulator. Retention/transfer tests on the simulator were performed 4-6 weeks after the last repetition. The proficiency levels were based on the performance of eight experienced colonoscopists. All subjects were able to complete the procedure on the simulator. There were no perforations in the feedback group versus seven in the non-feedback group. Subjects in the feedback group reached expert proficiency levels in percentage of mucosa visualised and time to reach the caecum significantly faster than the control group. Neither group demonstrated significant degradation of performance in the simulator retention/transfer tests. Concurrent feedback given by a supervisor confers an advantage in the acquisition of basic colonoscopy skills and in achieving proficiency levels compared with independent training.

  20. A Novel Low-Ringing Monocycle Picosecond Pulse Generator Based on Step Recovery Diode

    PubMed Central

    Zhou, Jianming; Yang, Xiao; Lu, Qiuyuan; Liu, Fan

    2015-01-01

    This paper presents a high-performance, low-ringing ultra-wideband monocycle picosecond pulse generator based on a step recovery diode (SRD), simulated in ADS software and verified experimentally. The pulse generator comprises three parts: a step recovery diode, a field-effect transistor, and a Schottky diode, which are used to eliminate the positive and negative ringing of the pulse. Simulated results validate the design. Measured results indicate an output waveform of 1.88 peak-to-peak amplitude and 307 ps pulse duration with minimal ringing of -22.5 dB, showing good symmetry and a low level of ringing. A high degree of agreement between the simulated and measured results is achieved. PMID:26308450

  1. QuVis interactive simulations: tools to support quantum mechanics instruction

    NASA Astrophysics Data System (ADS)

    Kohnle, Antje

    2015-04-01

    Quantum mechanics holds a fascination for many students, but its mathematical complexity and counterintuitive results can present major barriers. The QuVis Quantum Mechanics Visualization Project (www.st-andrews.ac.uk/physics/quvis) aims to overcome these issues through the development and evaluation of interactive simulations with accompanying activities for the learning and teaching of quantum mechanics. Over 90 simulations are now available on the QuVis website. One collection of simulations is embedded in the Institute of Physics Quantum Physics website (quantumphysics.iop.org), which consists of freely available resources for an introductory course in quantum mechanics starting from two-level systems. Simulations support model-building by reducing complexity, focusing on fundamental ideas and making the invisible visible. They promote engaged exploration, sense-making and linking of multiple representations, and include high levels of interactivity and direct feedback. Simulations are research-based and evaluation with students informs all stages of the development process. Simulations are iteratively refined using student feedback in individual observation sessions and in-class trials. Evaluation has shown that the simulations can help students learn quantum mechanics concepts at both the introductory and advanced undergraduate level and that students perceive simulations to be beneficial to their learning. Recent activity includes the launch of a new collection of HTML5 simulations that run on both desktop and tablet-based devices and the introduction of a goal and reward structure in simulations through the inclusion of challenges. This presentation will give an overview of the QuVis resources, highlight recent work and outline future plans. QuVis is supported by the UK Institute of Physics, the UK Higher Education Academy and the University of St Andrews.

  2. Barriers and enablers to the use of high-fidelity patient simulation manikins in nurse education: an integrative review.

    PubMed

    Al-Ghareeb, Amal Z; Cooper, Simon J

    2016-01-01

    This integrative review identified, critically appraised and synthesised the existing evidence on the barriers and enablers to using high-fidelity human patient simulator manikins (HPSMs) in undergraduate nursing education. In nursing education, specifically at the undergraduate level, a range of low- to high-fidelity simulations have been used as teaching aids. However, nursing educators encounter challenges when introducing new teaching methods or technology, despite the prevalence of high-fidelity HPSMs in nursing education. The integrative review adopted a systematic approach. Medline, CINAHL plus, ERIC, PsycINFO, EMBASE, SCOPUS, Science Direct, Cochrane database, Joanna Briggs Institute, ProQuest, California Simulation Alliance, Simulation Innovative Resources Center and the search engine Google Scholar were searched. Keywords were selected and specific inclusion/exclusion criteria were applied. The review included all research designs for papers published between 2000 and 2015 that identified the barriers and enablers to using high-fidelity HPSMs in undergraduate nursing education. Studies were appraised using the Critical Appraisal Skills Programme criteria. Thematic analysis was undertaken and emergent themes were extracted. Twenty-one studies were included in the review. These studies adopted quasi-experimental, prospective non-experimental and descriptive designs. Ten barriers were identified, including "lack of time," "fear of technology" and "workload issues." Seven enablers were identified, including "faculty training," "administrative support" and a "dedicated simulation coordinator." Barriers to simulation relate specifically to the complex technologies inherent in high-fidelity HPSM approaches. Strategic approaches that support up-skilling and provide dedicated technological support may overcome these barriers. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Rater Training to Support High-Stakes Simulation-Based Assessments

    PubMed Central

    Feldman, Moshe; Lazzara, Elizabeth H.; Vanderbilt, Allison A.; DiazGranados, Deborah

    2013-01-01

    Competency-based assessment and an emphasis on obtaining higher-level outcomes that reflect physicians’ ability to demonstrate their skills has created a need for more advanced assessment practices. Simulation-based assessments provide medical education planners with tools to better evaluate the 6 Accreditation Council for Graduate Medical Education (ACGME) and American Board of Medical Specialties (ABMS) core competencies by affording physicians opportunities to demonstrate their skills within a standardized and replicable testing environment, thus filling a gap in the current state of assessment for regulating the practice of medicine. Observational performance assessments derived from simulated clinical tasks and scenarios enable stronger inferences about the skill level a physician may possess, but also introduce the potential of rater errors into the assessment process. This article reviews the use of simulation-based assessments for certification, credentialing, initial licensure, and relicensing decisions and describes rater training strategies that may be used to reduce rater errors, increase rating accuracy, and enhance the validity of simulation-based observational performance assessments. PMID:23280532

  4. SIGNUM: A Matlab, TIN-based landscape evolution model

    NASA Astrophysics Data System (ADS)

    Refice, A.; Giachetta, E.; Capolongo, D.

    2012-08-01

    Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open source codes. Most are written in efficient programming languages such as Fortran or C, but often require additional code efforts to plug in to more user-friendly data analysis and/or visualization tools to ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift which can be used to simulate changes in base level, thrust and faulting, as well as effects of climate changes. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure, which allows users to easily modify and upgrade the simulated physical processes to suit virtually any user's needs. The code is conceived as an open-source project, and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity among the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for development of new modules and algorithms are proposed.
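
    SIGNUM itself is a Matlab, TIN-based code; as a language-neutral, hedged illustration, the short Python sketch below implements one of the process laws the abstract lists, linear hillslope diffusion (dz/dt = kappa * laplacian(z)), on a regular grid rather than a TIN. Grid size and parameters are arbitrary.

```python
import numpy as np

# Hedged illustration only: SIGNUM itself is a Matlab, TIN-based code. This
# sketch shows linear hillslope diffusion on a regular grid with periodic
# boundaries, one of the process laws mentioned in the abstract above.

def diffuse(z, kappa=0.01, dx=10.0, dt=1.0, steps=1000):
    """Explicit finite-difference update of topography z (m) over time."""
    z = z.copy()
    for _ in range(steps):
        lap = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
               np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4.0 * z) / dx**2
        z += dt * kappa * lap      # stable while dt <= dx**2 / (4 * kappa)
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    topo = 100.0 + rng.normal(scale=5.0, size=(64, 64))   # rough initial surface
    smoothed = diffuse(topo)
    print(f"relief before: {np.ptp(topo):.1f} m, after: {np.ptp(smoothed):.1f} m")
```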

  5. Noise of a model high speed counterrotation propeller at simulated takeoff/approach conditions (F7/A7)

    NASA Technical Reports Server (NTRS)

    Woodward, Richard P.

    1987-01-01

    A high speed advanced counterrotation propeller was tested in the NASA-Lewis 9 x 15 foot Anechoic Wind Tunnel at simulated takeoff/approach conditions of 0.2 Mach number. Acoustic measurements were taken with fixed floor microphones, an axially translating microphone probe, and with a polar microphone probe which was fixed to the propeller nacelle and could take both sideline and circumferential acoustic surveys. Aerodynamic measurements were also made to establish the propeller operating conditions. The propeller was run over a range of blade setting angles from 36.4/36.5 to 41.1/39.4 deg, tip speeds from 165 to 259 m/sec, rotor spacings from 1.56 to 3.63 (the ratio of aerodynamic separation to forward rotor tip chord), and angles of attack up to ±16 deg. First-order rotor-alone tones showed the highest directivity levels near the propeller plane, while interaction tones showed high levels throughout the sideline directivity, especially toward the propeller rotation axis. Interaction tone levels were sensitive to propeller row spacing, while rotor-alone tones showed little spacing effect. There is a decreased noise level associated with higher propeller blade numbers for the same overall propeller thrust.

  6. A proposal for an open source graphical environment for simulating x-ray optics

    NASA Astrophysics Data System (ADS)

    Sanchez del Rio, Manuel; Rebuffi, Luca; Demsar, Janez; Canestrari, Niccolo; Chubar, Oleg

    2014-09-01

    A new graphical environment to drive X-ray optics simulation packages such as SHADOW and SRW is proposed. The aim is to simulate a virtual experiment, including the description of the electron beam, the emitted radiation, the optics, the scattering by the sample, and the radiation detection. Python is chosen as the common interaction language. The ingredients of the new application (a glossary of variables for optical components, the selection of visualization tools, and the integration of all these components in a high-level workflow environment built on Orange) are presented.

  7. NASA Constellation Distributed Simulation Middleware Trade Study

    NASA Technical Reports Server (NTRS)

    Hasan, David; Bowman, James D.; Fisher, Nancy; Cutts, Dannie; Cures, Edwin Z.

    2008-01-01

    This paper presents the results of a trade study designed to assess three distributed simulation middleware technologies for support of the NASA Constellation Distributed Space Exploration Simulation (DSES) project and Test and Verification Distributed System Integration Laboratory (DSIL). The technologies are the High Level Architecture (HLA), the Test and Training Enabling Architecture (TENA), and an XML-based variant of Distributed Interactive Simulation (DIS-XML) coupled with the Extensible Messaging and Presence Protocol (XMPP). According to the criteria and weights determined in this study, HLA scores better than the other two for DSES as well as the DSIL.

  8. A TREETOPS simulation of the Hubble Space Telescope-High Gain Antenna interaction

    NASA Technical Reports Server (NTRS)

    Sharkey, John P.

    1987-01-01

    Virtually any project dealing with the control of a Large Space Structure (LSS) will involve some level of verification by digital computer simulation. While the Hubble Space Telescope might not normally be included in a discussion of LSS, it is presented to highlight a recently developed simulation and analysis program named TREETOPS. TREETOPS provides digital simulation, linearization, and control system interaction of flexible, multibody spacecraft which admit to a point-connected tree topology. The HST application of TREETOPS is intended to familiarize the LSS community with TREETOPS by presenting a user perspective of its key features.

  9. Noise levels in PICU: an evaluative study.

    PubMed

    Bailey, Elizabeth; Timmons, Stephen

    2005-12-01

    High levels of noise in the hospital environment can have an impact on patients and staff, increasing recovery time and stress, respectively. When our seven-bed paediatric intensive care unit (PICU) is full, noise levels seem to increase significantly. This study measured noise levels at various times and places within a PICU using a Tenma sound level meter, which simulates the subjective response of the human ear. Noise levels were often excessive, exceeding international guidelines. Staff conversation was responsible for most of the noise produced; medical equipment, patient interventions, telephones, the doorbell and the air shoot system were also responsible for causing high levels of noise. More can be done to reduce noise and its effects on patients and staff.

  10. Error analysis of high-rate GNSS precise point positioning for seismic wave measurement

    NASA Astrophysics Data System (ADS)

    Shu, Yuanming; Shi, Yun; Xu, Peiliang; Niu, Xiaoji; Liu, Jingnan

    2017-06-01

    High-rate GNSS precise point positioning (PPP) has been playing a more and more important role in providing precise positioning information in fast time-varying environments. Although kinematic PPP is commonly known to have a precision of a few centimeters, the precision of high-rate PPP within a short period of time has been reported recently with experiments to reach a few millimeters in the horizontal components and sub-centimeters in the vertical component to measure seismic motion, which is several times better than the conventional kinematic PPP practice. To fully understand the mechanism behind this surprisingly good short-term performance of high-rate PPP, we have carried out a theoretical error analysis of PPP and conducted the corresponding simulations within a short period of time. The theoretical analysis has clearly indicated that the high-rate PPP errors consist of two types: the residual systematic errors at the starting epoch, which affect high-rate PPP through the change of satellite geometry, and the time-varying systematic errors between the starting epoch and the current epoch. Both the theoretical error analysis and simulated results are fully consistent with and thus have unambiguously confirmed the reported high precision of high-rate PPP, which has been further affirmed here by the real data experiments, indicating that high-rate PPP can indeed achieve the millimeter level of precision in the horizontal components and the sub-centimeter level of precision in the vertical component to measure motion within a short period of time. The simulation results have clearly shown that the random noise of carrier phases and higher order ionospheric errors are two major factors affecting the precision of high-rate PPP within a short period of time. The experiments with real data have also indicated that the precision of PPP solutions can degrade to the cm level in both the horizontal and vertical components, if the geometry of satellites is rather poor with a large DOP value.
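
    As a hedged illustration of the satellite-geometry dependence mentioned above, the sketch below computes dilution-of-precision (DOP) values from receiver-to-satellite unit vectors. The satellite azimuths and elevations are invented for the example and are unrelated to the data used in the study.

```python
import numpy as np

# Hedged illustration: dilution of precision (DOP) quantifies how satellite
# geometry maps ranging errors into position errors, one of the factors the
# abstract identifies. The satellite positions below are invented.

def dop(sat_enu_unit_vectors):
    """Return GDOP, PDOP, HDOP, VDOP from receiver-to-satellite unit vectors (ENU)."""
    u = np.asarray(sat_enu_unit_vectors, dtype=float)
    G = np.hstack([u, np.ones((u.shape[0], 1))])   # geometry matrix incl. clock term
    Q = np.linalg.inv(G.T @ G)                     # cofactor matrix
    gdop = np.sqrt(np.trace(Q))
    pdop = np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2])
    hdop = np.sqrt(Q[0, 0] + Q[1, 1])
    vdop = np.sqrt(Q[2, 2])
    return gdop, pdop, hdop, vdop

def unit_vector(azimuth_deg, elevation_deg):
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    return [np.cos(el) * np.sin(az), np.cos(el) * np.cos(az), np.sin(el)]

if __name__ == "__main__":
    sats = [unit_vector(az, el) for az, el in
            [(0, 60), (90, 30), (180, 45), (270, 20), (45, 75)]]
    gdop, pdop, hdop, vdop = dop(sats)
    print(f"GDOP={gdop:.2f} PDOP={pdop:.2f} HDOP={hdop:.2f} VDOP={vdop:.2f}")
```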

  11. Voice measures of workload in the advanced flight deck: Additional studies

    NASA Technical Reports Server (NTRS)

    Schneider, Sid J.; Alpert, Murray

    1989-01-01

    These studies investigated acoustical analysis of the voice as a measure of workload in individual operators. In the first study, voice samples were recorded from a single operator during high, medium, and low workload conditions. Mean amplitude, frequency, syllable duration, and emphasis all tended to increase as workload increased. In the second study, NASA test pilots performed a laboratory task, and used a flight simulator under differing work conditions. For two of the pilots, high workload in the simulator brought about greater amplitude, peak duration, and stress. In both the laboratory and simulator tasks, high workload tended to be associated with more statistically significant drop-offs in the acoustical measures than were lower workload levels. There was a great deal of intra-subject variability in the acoustical measures. The results suggested that in individual operators, increased workload might be revealed by high initial amplitude and frequency, followed by rapid drop-offs over time.

  12. Radio-Frequency Tank Eigenmode Sensor for Propellant Quantity Gauging

    NASA Technical Reports Server (NTRS)

    Zimmerli, Gregory A.; Buchanan, David A.; Follo, Jeffrey C.; Vaden, Karl R.; Wagner, James D.; Asipauskas, Marius; Herlacher, Michael D.

    2010-01-01

    Although there are several methods for determining liquid level in a tank, there are no proven methods to quickly gauge the amount of propellant in a tank while it is in low gravity or under low-settling thrust conditions where propellant sloshing is an issue. Having the ability to quickly and accurately gauge propellant tanks in low-gravity is an enabling technology that would allow a spacecraft crew or mission control to always know the amount of propellant onboard, thus increasing the chances for a successful mission. The Radio Frequency Mass Gauge (RFMG) technique measures the electromagnetic eigenmodes, or natural resonant frequencies, of a tank containing a dielectric fluid. The essential hardware components consist of an RF network analyzer that measures the reflected power from an antenna probe mounted internal to the tank. At a resonant frequency, there is a drop in the reflected power, and these inverted peaks in the reflected power spectrum are identified as the tank eigenmode frequencies using a peak-detection software algorithm. This information is passed to a pattern-matching algorithm, which compares the measured eigenmode frequencies with a database of simulated eigenmode frequencies at various fill levels. A best match between the simulated and measured frequency values occurs at some fill level, which is then reported as the gauged fill level. The database of simulated eigenmode frequencies is created by using RF simulation software to calculate the tank eigenmodes at various fill levels. The input to the simulations consists of a fairly high-fidelity tank model with proper dimensions and including internal tank hardware, the dielectric properties of the fluid, and a defined liquid/vapor interface. Because of small discrepancies between the model and actual hardware, the measured empty tank spectra and simulations are used to create a set of correction factors for each mode (typically in the range of 0.999 to 1.001), which effectively accounts for the small discrepancies. These correction factors are multiplied to the modes at all fill levels. By comparing several measured modes with the simulations, it is possible to accurately gauge the amount of propellant in the tank. An advantage of the RFMG approach of applying computer simulations and a pattern-matching algorithm is that the ...
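
    The following Python sketch illustrates, with invented numbers, the pattern-matching step described above: measured eigenmode frequencies are compared against a table of simulated frequencies versus fill level, after applying per-mode correction factors, and the best-matching fill level is reported. It is an illustration of the idea only, not the RFMG software.

```python
import numpy as np

# Hedged sketch of the pattern-matching idea described above. The mode
# frequencies and their dependence on fill level are invented.

def gauge_fill_level(measured_hz, fill_levels, simulated_hz, correction=None):
    """Return the fill level whose simulated mode set best matches the measurement.

    simulated_hz has shape (n_fill_levels, n_modes); correction holds the
    per-mode empty-tank correction factors (close to 1.0) the abstract mentions.
    """
    sim = np.asarray(simulated_hz, dtype=float)
    if correction is not None:
        sim = sim * np.asarray(correction)          # apply per-mode corrections
    residual = np.sum((sim - np.asarray(measured_hz))**2, axis=1)
    return fill_levels[int(np.argmin(residual))]

if __name__ == "__main__":
    fills = np.linspace(0.0, 1.0, 11)               # 0 %, 10 %, ... 100 % fill
    base = np.array([350.0, 520.0, 610.0])          # empty-tank modes, MHz (invented)
    sims = base * (1.0 - 0.25 * fills[:, None])     # toy drop of frequency with fill
    meas = base * (1.0 - 0.25 * 0.42) + np.random.default_rng(1).normal(0, 0.3, 3)
    print("gauged fill level:", gauge_fill_level(meas, fills, sims))
```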

  13. Numerical models for fluid-grains interactions: opportunities and limitations

    NASA Astrophysics Data System (ADS)

    Esteghamatian, Amir; Rahmani, Mona; Wachs, Anthony

    2017-06-01

    In the framework of a multi-scale approach, we develop numerical models for suspension flows. At the micro-scale level, we perform particle-resolved numerical simulations using a Distributed Lagrange Multiplier/Fictitious Domain approach. At the meso-scale level, we use a two-way Euler/Lagrange approach with a Gaussian filtering kernel to model fluid-solid momentum transfer. At both the micro- and meso-scale levels, particles are individually tracked in a Lagrangian way and all inter-particle collisions are computed by a Discrete Element/Soft-sphere method. The previous numerical models have been extended to handle particles of arbitrary shape (non-spherical, angular and even non-convex) as well as to treat heat and mass transfer. All simulation tools are fully MPI-parallel with standard domain decomposition and run on supercomputers with satisfactory scalability on up to a few thousand cores. The main asset of multi-scale analysis is the ability to extend our comprehension of the dynamics of suspension flows based on the knowledge acquired from the high-fidelity micro-scale simulations and to use that knowledge to improve the meso-scale model. We illustrate how we can benefit from this strategy for a fluidized bed, where we introduce a stochastic drag force model derived from micro-scale simulations to recover the proper level of particle fluctuations. Conversely, we discuss the limitations of such modelling tools, such as their limited ability to capture lubrication forces and boundary layers in highly inertial flows. We suggest ways to overcome these limitations in order to further enhance the capabilities of the numerical models.

  14. A gyrokinetic perspective on the JET-ILW pedestal

    NASA Astrophysics Data System (ADS)

    Hatch, D. R.; Kotschenreuther, M.; Mahajan, S.; Valanju, P.; Liu, X.

    2017-03-01

    JET has been unable to recover historical confinement levels when operating with an ITER-like wall (ILW) due largely to the inaccessibility of high pedestal temperatures. Finding a path to overcome this challenge is of utmost importance for both a prospective JET DT campaign and for future ITER operation. Gyrokinetic simulations (using the Gene code) quantitatively capture experimental transport levels for a representative experimental discharge and qualitatively recover the major experimental trends. Microtearing turbulence is a major transport mechanism for the low-temperature pedestals characteristic of unseeded JET-ILW discharges. At higher temperatures and/or lower ρ*, we identify electrostatic ITG transport of a type that is strongly shear-suppressed on smaller machines. Consistent with observations, this transport mechanism is strongly reduced by the presence of a low-Z impurity (e.g. carbon or nitrogen at the level of Z_eff ∼ 2), recovering the accessibility of high pedestal temperatures. Notably, simulations based on dimensionless ρ* scans recover historical scaling behavior except in the unique JET-ILW parameter regime where ITG turbulence becomes important. Our simulations also elucidate the observed degradation of confinement caused by gas puffing, emphasizing the important role of the density pedestal structure. This study maps out important regions of parameter space, providing insights that may point to optimal physical regimes that can enable the recovery of high pedestal temperatures on JET.

  15. International Meeting on Simulation in Healthcare

    DTIC Science & Technology

    2010-02-01

    wounds, burns, and injury. Participants will create reusable moulage items using realistic gel effects materials—designed to work seamlessly with ... simulations of injuries and clinical encounters. Such technology provides extremely high levels of perceived realism and encourages suspension of disbelief ... trace. The model gives an estimate of the cerebral flow reduction that occurs during early decelerations, including an estimate for vessel diameter

  16. Geostrophic Vortex Dynamics

    DTIC Science & Technology

    1988-10-01

    Generalized Kirchhoff Vortices; B. The 2-Level Rankine Vortex: Critical Points & Stability; C. Tripolar Coherent Euler Vortices ... spontaneously in spectral simulations. One such example is provided by the tripolar vortex structure, which will be examined in detail in Chapter 6. It ... of the tripolar coherent vortex structures that have recently been observed in very high resolution numerical simulations of two-dimensional

  17. Low salinity and high-level UV-B radiation reduce single-cell activity in antarctic sea ice bacteria.

    PubMed

    Martin, Andrew; Hall, Julie; Ryan, Ken

    2009-12-01

    Experiments simulating the sea ice cycle were conducted by exposing microbes from Antarctic fast ice to saline and irradiance regimens associated with the freeze-thaw process. In contrast to hypersaline conditions (ice formation), the simulated release of bacteria into hyposaline seawater combined with rapid exposure to increased UV-B radiation significantly reduced metabolic activity.

  18. Comparing Simulated and Experimental Data from UCN τ

    NASA Astrophysics Data System (ADS)

    Howard, Dezrick; Holley, Adam

    2017-09-01

    The UCN τ experiment is designed to measure the average lifetime of a free neutron (τn) by trapping ultracold neutrons (UCN) in a magneto-gravitational trap and allowing them to β-decay, with the ultimate goal of minimizing the uncertainty to approximately 0.01% (0.1 s). Understanding the systematics of the experiment at the level necessary to reach this high precision may help to better understand the disparity between measurements from cold neutron beam and UCN bottle experiments (τn ≈ 888 s and τn ≈ 878 s, respectively). To assist in evaluating systematics that might conceivably contribute at this level, a neutron spin-tracking Monte Carlo simulation, which models a UCN population's behavior throughout a run, is currently under development. The simulation will utilize an empirical map of the magnetic field in the trap (see poster by K. Hoffman) by interpolating the field between measured points (see poster by J. Felkins) in order to model the depolarization mechanism with high fidelity. As a preliminary step, I have checked that the Monte Carlo model can reasonably reproduce the observed behavior of the experiment. In particular, I will present a comparison between simulated data and data acquired from the 2016-2017 UCN τ run cycle.

  19. Multi-level methods and approximating distribution functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, D., E-mail: daniel.wilson@dtc.ox.ac.uk; Baker, R. E.

    2016-07-15

    Biochemical reaction networks are often modelled using discrete-state, continuous-time Markov chains. System statistics of these Markov chains usually cannot be calculated analytically and therefore estimates must be generated via simulation techniques. There is a well documented class of simulation techniques known as exact stochastic simulation algorithms, an example of which is Gillespie's direct method. These algorithms often come with high computational costs, therefore approximate stochastic simulation algorithms such as the tau-leap method are used. However, in order to minimise the bias in the estimates generated using them, a relatively small value of tau is needed, rendering the computational costs comparable to Gillespie's direct method. The multi-level Monte Carlo method (Anderson and Higham, Multiscale Model. Simul. 10:146–179, 2012) provides a reduction in computational costs whilst minimising or even eliminating the bias in the estimates of system statistics. This is achieved by first crudely approximating required statistics with many sample paths of low accuracy. Then correction terms are added until a required level of accuracy is reached. Recent literature has primarily focussed on implementing the multi-level method efficiently to estimate a single system statistic. However, it is clearly also of interest to be able to approximate entire probability distributions of species counts. We present two novel methods that combine known techniques for distribution reconstruction with the multi-level method. We demonstrate the potential of our methods using a number of examples.
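
    As a hedged illustration of the simulation algorithms named above, the sketch below implements Gillespie's direct method and a tau-leap approximation for a simple immigration-death process; the multi-level combination of many cheap biased paths with correction terms discussed in the abstract is not reproduced here. Rate constants are arbitrary.

```python
import numpy as np

# Hedged sketch of the two building blocks named in the abstract, for a simple
# immigration-death process (constant birth rate b, per-capita death rate d).
# Gillespie's direct method is exact; tau-leaping trades bias for speed.

def gillespie_direct(b, d, x0, t_end, rng):
    t, x = 0.0, x0
    while True:
        a_birth, a_death = b, d * x
        a_total = a_birth + a_death
        t += rng.exponential(1.0 / a_total)       # time to next reaction
        if t > t_end:
            return x
        x += 1 if rng.random() < a_birth / a_total else -1

def tau_leap(b, d, x0, t_end, tau, rng):
    x = x0
    for _ in range(int(t_end / tau)):
        x += rng.poisson(b * tau) - rng.poisson(d * x * tau)
        x = max(x, 0)                             # guard against negative counts
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    exact = [gillespie_direct(10.0, 0.1, 0, 50.0, rng) for _ in range(500)]
    leap = [tau_leap(10.0, 0.1, 0, 50.0, 0.5, rng) for _ in range(500)]
    print(f"exact mean {np.mean(exact):.1f}, tau-leap mean {np.mean(leap):.1f}")
    # Analytical steady-state mean is b/d = 100 for comparison.
```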

  20. Development of the functional simulator for the Galileo attitude and articulation control system

    NASA Technical Reports Server (NTRS)

    Namiri, M. K.

    1983-01-01

    A simulation program for verifying and checking the performance of the Galileo Spacecraft's Attitude and Articulation Control Subsystem's (AACS) flight software is discussed. The program, which is called Functional Simulator (FUNSIM), provides a simple method of interfacing user-supplied mathematical models coded in FORTRAN which describes spacecraft dynamics, sensors, and actuators; this is done with the AACS flight software, coded in HAL/S (High-level Advanced Language/Shuttle). It is thus able to simulate the AACS flight software accurately to the HAL/S statement level in the environment of a mainframe computer system. FUNSIM also has a command and data subsystem (CDS) simulator. It is noted that the input/output data and timing are simulated with the same precision as the flight microprocessor. FUNSIM uses a variable stepsize numerical integration algorithm complete with individual error bound control on the state variable to solve the equations of motion. The program has been designed to provide both line printer and matrix dot plotting of the variables requested in the run section and to provide error diagnostics.

  1. Review of hardware-in-the-loop simulation and its prospects in the automotive area

    NASA Astrophysics Data System (ADS)

    Fathy, Hosam K.; Filipi, Zoran S.; Hagena, Jonathan; Stein, Jeffrey L.

    2006-05-01

    Hardware-in-the-loop (HIL) simulation is rapidly evolving from a control prototyping tool to a system modeling, simulation, and synthesis paradigm synergistically combining many advantages of both physical and virtual prototyping. This paper provides a brief overview of the key enablers and numerous applications of HIL simulation, focusing on its metamorphosis from a control validation tool into a system development paradigm. It then describes a state-of-the art engine-in-the-loop (EIL) simulation facility that highlights the use of HIL simulation for the system-level experimental evaluation of powertrain interactions and development of strategies for clean and efficient propulsion. The facility comprises a real diesel engine coupled to accurate real-time driver, driveline, and vehicle models through a highly responsive dynamometer. This enables the verification of both performance and fuel economy predictions of different conventional and hybrid powertrains. Furthermore, the facility can both replicate the highly dynamic interactions occurring within a real powertrain and measure their influence on transient emissions and visual signature through state-of-the-art instruments. The viability of this facility for integrated powertrain system development is demonstrated through a case study exploring the development of advanced High Mobility Multipurpose Wheeled Vehicle (HMMWV) powertrains.

  2. Simulation of Turbine Tone Noise Generation Using a Turbomachinery Aerodynamics Solver

    NASA Technical Reports Server (NTRS)

    VanZante, Dale; Envia, Edmane

    2010-01-01

    As turbofan engine bypass ratios continue to increase, the contribution of the turbine to the engine noise signature is receiving more attention. Understanding the relative importance of the various turbine noise generation mechanisms and the characteristics of the turbine acoustic transmission loss are essential ingredients in developing robust reduced-order models for predicting the turbine noise signature. A computationally based investigation has been undertaken to help guide the development of a turbine noise prediction capability that does not rely on empiricism. As proof-of-concept for this approach, two highly detailed numerical simulations of the unsteady flow field inside the first stage of a modern high-pressure turbine were carried out. The simulations were computed using TURBO, which is an unsteady Reynolds-Averaged Navier-Stokes code capable of multi-stage simulations. Spectral and modal analysis of the unsteady pressure data from the numerical simulation of the turbine stage show a circumferential modal distribution that is consistent with the Tyler-Sofrin rule. Within the high-pressure turbine, the interaction of velocity, pressure and temperature fluctuations with the downstream blade rows are all possible tone noise source mechanisms. We have taken the initial step in determining the source strength hierarchy by artificially reducing the level of temperature fluctuations in the turbine flowfield. This was accomplished by changing the vane cooling flow temperature in order to mitigate the vane thermal wake in the second of the two simulations. The results indicated that, despite a dramatic change in the vane cooling flow, the computed modal levels changed very little indicating that the contribution of temperature fluctuations to the overall pressure field is rather small compared with the viscous and potential field interaction mechanisms.
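
    A short, hedged illustration of the Tyler-Sofrin rule referenced above: rotor-stator interaction tones at harmonic n of the blade-passing frequency occupy circumferential modes m = nB + kV. The blade and vane counts in the snippet are assumptions for the example only, not the counts of the simulated turbine stage.

```python
# Hedged illustration of the Tyler-Sofrin rule mentioned in the abstract:
# interaction tones at harmonic n of blade-passing frequency appear in
# circumferential modes m = n*B + k*V, with k = ..., -2, -1, 0, 1, 2, ...
# The blade and vane counts below are invented for the example.

B, V = 36, 48                 # rotor blade and vane counts (assumed)

for n in (1, 2):              # first two blade-passing harmonics
    modes = sorted(n * B + k * V for k in range(-3, 4))
    print(f"harmonic {n}: circumferential modes m = {modes}")
```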

  3. GROMACS: High performance molecular simulations through multi-level parallelism from laptops to supercomputers

    DOE PAGES

    Abraham, Mark James; Murtola, Teemu; Schulz, Roland; ...

    2015-07-15

    GROMACS is one of the most widely used open-source and free software codes in chemistry, used primarily for dynamical simulations of biomolecules. It provides a rich set of calculation types, preparation and analysis tools. Several advanced techniques for free-energy calculations are supported. In version 5, it reaches new performance heights, through several new and enhanced parallelization algorithms. These work on every level: SIMD registers inside cores, multithreading, heterogeneous CPU–GPU acceleration, state-of-the-art 3D domain decomposition, and ensemble-level parallelization through built-in replica exchange and the separate Copernicus framework. Finally, the latest best-in-class compressed trajectory storage format is supported.

  4. GROMACS: High performance molecular simulations through multi-level parallelism from laptops to supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abraham, Mark James; Murtola, Teemu; Schulz, Roland

    GROMACS is one of the most widely used open-source and free software codes in chemistry, used primarily for dynamical simulations of biomolecules. It provides a rich set of calculation types, preparation and analysis tools. Several advanced techniques for free-energy calculations are supported. In version 5, it reaches new performance heights, through several new and enhanced parallelization algorithms. These work on every level: SIMD registers inside cores, multithreading, heterogeneous CPU–GPU acceleration, state-of-the-art 3D domain decomposition, and ensemble-level parallelization through built-in replica exchange and the separate Copernicus framework. Finally, the latest best-in-class compressed trajectory storage format is supported.

  5. [Acquiring skills in malignant hyperthermia crisis management: comparison of high-fidelity simulation versus computer-based case study].

    PubMed

    Mejía, Vilma; Gonzalez, Carlos; Delfino, Alejandro E; Altermatt, Fernando R; Corvetto, Marcia A

    The primary purpose of this study was to compare the effect of high-fidelity simulation versus a computer-based case-solving self-study on malignant hyperthermia skills acquisition in first-year anesthesiology residents. After institutional ethics committee approval, 31 first-year anesthesiology residents were enrolled in this prospective randomized single-blinded study. Participants were randomized to either a High Fidelity Simulation Scenario or a computer-based Case Study about malignant hyperthermia. After the intervention, all subjects' performance was assessed through a high-fidelity simulation scenario using a previously validated assessment rubric. Additionally, knowledge tests and a satisfaction survey were applied. Finally, a semi-structured interview was done to assess self-perception of the reasoning process and decision-making. Twenty-eight first-year residents successfully finished the study. Residents' management skill scores were globally higher for High Fidelity Simulation than for Case Study; differences were significant in 4 of the 8 performance rubric elements: recognizing signs and symptoms (p = 0.025), prioritization of initial management actions (p = 0.003), recognizing complications (p = 0.025), and communication (p = 0.025). Average scores from pre- and post-test knowledge questionnaires improved from 74% to 85% in the High Fidelity Simulation group, and decreased from 78% to 75% in the Case Study group (p = 0.032). Regarding the qualitative analysis, there was no difference in factors influencing the students' process of reasoning and decision-making between the two teaching strategies. Simulation-based training with a malignant hyperthermia high-fidelity scenario was superior to a computer-based case study, improving knowledge and skills in malignant hyperthermia crisis management, with a very good satisfaction level among anesthesia residents. Copyright © 2018 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.

  6. Tsunami hazard assessment in the Hudson River Estuary based on dynamic tsunami-tide simulations

    NASA Astrophysics Data System (ADS)

    Shelby, Michael; Grilli, Stéphan T.; Grilli, Annette R.

    2016-12-01

    This work is part of a tsunami inundation mapping activity carried out along the US East Coast since 2010, under the auspice of the National Tsunami Hazard Mitigation program (NTHMP). The US East Coast features two main estuaries with significant tidal forcing, which are bordered by numerous critical facilities (power plants, major harbors,...) as well as densely built low-level areas: Chesapeake Bay and the Hudson River Estuary (HRE). HRE is the object of this work, with specific focus on assessing tsunami hazard in Manhattan, the Hudson and East River areas. In the NTHMP work, inundation maps are computed as envelopes of maximum surface elevation along the coast and inland, by simulating the impact of selected probable maximum tsunamis (PMT) in the Atlantic ocean margin and basin. At present, such simulations assume a static reference level near shore equal to the local mean high water (MHW) level. Here, instead we simulate maximum inundation in the HRE resulting from dynamic interactions between the incident PMTs and a tide, which is calibrated to achieve MHW at its maximum level. To identify conditions leading to maximum tsunami inundation, each PMT is simulated for four different phases of the tide and results are compared to those obtained for a static reference level. We first separately simulate the tide and the three PMTs that were found to be most significant for the HRE. These are caused by: (1) a flank collapse of the Cumbre Vieja Volcano (CVV) in the Canary Islands (with a 80 km3 volume representing the most likely extreme scenario); (2) an M9 coseismic source in the Puerto Rico Trench (PRT); and (3) a large submarine mass failure (SMF) in the Hudson River canyon of parameters similar to the 165 km3 historical Currituck slide, which is used as a local proxy for the maximum possible SMF. Simulations are performed with the nonlinear and dispersive long wave model FUNWAVE-TVD, in a series of nested grids of increasing resolution towards the coast, by one-way coupling. Four levels of nested grids are used, from a 1 arc-min spherical coordinate grid in the deep ocean down to a 39-m Cartesian grid in the HRE. Bottom friction coefficients in the finer grids are calibrated for the tide to achieve the local spatially averaged MHW level at high tide in the HRE. Combined tsunami-tide simulations are then performed for four phases of the tide corresponding to each tsunami arriving at Sandy Hook (NJ): 1.5 h ahead, concurrent with, 1.5 h after, and 3 h after the local high tide. These simulations are forced along the offshore boundary of the third-level grid by linearly superposing time series of surface elevation and horizontal currents of the calibrated tide and each tsunami wave train; this is done in deep enough water for a linear superposition to be accurate. Combined tsunami-tide simulations are then performed with FUNWAVE-TVD in this and the finest nested grids. Results show that, for the 3 PMTs, depending on the tide phase, the dynamic simulations lead to no or to a slightly increased inundation in the HRE (by up to 0.15 m depending on location), and to larger currents than for the simulations over a static level; the CRT SMF proxy tsunami is the PMT leading to maximum inundation in the HRE. 
For all tide phases, nonlinear interactions between tide and tsunami currents modify the elevation, current, and celerity of tsunami wave trains, mostly in the shallower water areas of the HRE where bottom friction dominates, as compared to a linear superposition of wave elevations and currents. We note that, while dynamic simulations predict a slight increase in inundation, this increase may be on the same order as, or even less than sources of uncertainty in the modeling of tsunami sources, such as their initial water elevation, and in bottom friction and bathymetry used in tsunami grids. Nevertheless, results in this paper provide insight into the magnitude and spatial variability of tsunami propagation and impact in the complex inland waterways surrounding New York City, and of their modification by dynamic tidal effects. We conclude that changes in inundation resulting from the inclusion of a dynamic tide in the specific case of the HRE, although of scientific interest, are not significant for tsunami hazard assessment and that the standard approach of specifying a static reference level equal to MHW is conservative. However, in other estuaries with similarly complex bathymetry/topography and stronger tidal currents, a simplified static approach might not be appropriate.
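
    As a hedged sketch of the boundary forcing strategy described above (linear superposition of tide and tsunami time series at a chosen tide phase), the snippet below combines two synthetic elevation series; the waveforms and amplitudes are placeholders, not FUNWAVE-TVD output.

```python
import numpy as np

# Hedged sketch of the boundary forcing described above: linearly superpose
# tide and tsunami surface-elevation time series, with the tide shifted so
# that the tsunami arrives at a chosen phase of the tide. All waveforms
# below are synthetic placeholders.

t = np.arange(0.0, 12 * 3600.0, 60.0)                    # 12 h, 1-min samples

tsunami = 1.5 * np.exp(-((t - 6 * 3600.0) / 900.0)**2)   # idealized wave packet, m

def combined_forcing(phase_shift_s):
    """Superpose the tsunami on a semi-diurnal tide shifted by phase_shift_s seconds."""
    shifted_tide = 0.7 * np.sin(2 * np.pi * (t + phase_shift_s) / (12.42 * 3600.0))
    return shifted_tide + tsunami

for shift_h in (-1.5, 0.0, 1.5, 3.0):                    # the four phases used in the study
    eta = combined_forcing(shift_h * 3600.0)
    print(f"shift {shift_h:+.1f} h: max boundary elevation {eta.max():.2f} m")
```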

  7. ZERO-VALENT IRON FOR HIGH-LEVEL ARSENITE REMOVAL

    EPA Science Inventory

    This study conducted by flow through column systems was aimed at investigating the feasibility of using zero-valent iron for arsenic remediation in groundwater. A high concentration arsenic solution (50 mg l-1) was prepared by using sodium arsenite (arsenic (III)) to simulate gr...

  8. A percolation approach to study the high electric field effect on electrical conductivity of insulating polymer

    NASA Astrophysics Data System (ADS)

    Benallou, Amina; Hadri, Baghdad; Martinez-Vega, Juan; El Islam Boukortt, Nour

    2018-04-01

    The effect of the percolation threshold on the behaviour of the electrical conductivity of insulating polymers at high electric field has been only briefly investigated in the literature, and dead-end links are sometimes not taken into account when studying the effect of the electric field on the electrical properties. In this work, we present a theoretical framework and a Monte Carlo simulation of the behaviour of the electrical conductivity at high electric field, based on percolation theory and using trap energy levels distributed according to a chosen distribution law (uniform, Gaussian, or power-law). When a solid insulating material is subjected to a high electric field, the dead ends of the trap network act during the trapping mechanism to decrease the electrical conductivity, depending on the trap energy levels, the correlation length of the clusters, the length of the dead ends, and the concentration of positions accessible to electrons. A reasonably good agreement is obtained between the simulation results and the theoretical framework.
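
    The toy Python sketch below illustrates, with invented thresholds and lattice size, the percolation picture invoked above: trap energies are sampled from a chosen distribution, sites are marked accessible when their trap energy falls below a field-dependent level, and the lattice is tested for a spanning (conducting) cluster. It is an illustration of the concept only, not the authors' model.

```python
import numpy as np
from collections import deque

# Hedged toy sketch of the percolation picture: sample trap energies, mark
# sites accessible below a field-dependent threshold, and test whether the
# accessible sites form a spanning cluster. Numbers are invented.

def spans(accessible):
    """True if accessible sites connect the left and right edges (4-neighbour BFS)."""
    n, m = accessible.shape
    seen = np.zeros_like(accessible, dtype=bool)
    queue = deque((i, 0) for i in range(n) if accessible[i, 0])
    for i, _ in queue:
        seen[i, 0] = True
    while queue:
        i, j = queue.popleft()
        if j == m - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < m and accessible[a, b] and not seen[a, b]:
                seen[a, b] = True
                queue.append((a, b))
    return False

rng = np.random.default_rng(0)
energies = rng.uniform(0.0, 1.0, size=(100, 100))     # toy trap energy levels

for threshold in (0.45, 0.55, 0.65):                  # rises with applied field (toy)
    accessible = energies < threshold
    print(f"threshold {threshold:.2f}: spanning cluster -> {spans(accessible)}")
```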

  9. Composite Load Spectra for Select Space Propulsion Structural Components

    NASA Technical Reports Server (NTRS)

    Ho, Hing W.; Newell, James F.

    1994-01-01

    Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models of the individual load sources (dynamic, acoustic, high-pressure, high-rotational-speed, etc.) using statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components in conjunction with the PSAM (Probabilistic Structural Analysis Method) to perform probabilistic load evaluation and life prediction evaluations are also described to illustrate the effectiveness of the coupled model approach.
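
    As a hedged illustration of the coupled-model idea, the sketch below combines deterministic component load shapes with statistically varying coefficients and samples them by Monte Carlo to build a composite peak-load distribution; all shapes and coefficient distributions are invented and do not represent the CLS code.

```python
import numpy as np

# Hedged sketch: deterministic component load time histories are combined
# with statistically varying coefficients, and Monte Carlo sampling of those
# coefficients produces a composite load spectrum. Everything here is invented.

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)                        # one second of operation

# Deterministic component load shapes (normalized)
steady   = np.ones_like(t)                             # mean chamber-pressure load
sine_1n  = np.sin(2 * np.pi * 50.0 * t)                # synchronous vibration (toy frequency)
acoustic = rng.standard_normal(t.size)                 # broadband acoustic proxy (fixed shape)

peak_loads = []
for _ in range(5000):                                  # Monte Carlo samples
    c_steady = rng.normal(1.00, 0.02)                  # statistically varying
    c_sine   = rng.lognormal(mean=-2.0, sigma=0.3)     # coefficients (assumed
    c_acous  = rng.lognormal(mean=-3.0, sigma=0.5)     # distributions)
    composite = c_steady * steady + c_sine * sine_1n + c_acous * acoustic
    peak_loads.append(composite.max())

print(f"median peak load {np.median(peak_loads):.3f}, "
      f"99th percentile {np.percentile(peak_loads, 99):.3f} (normalized)")
```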

  10. Taxiing, Take-Off, and Landing Simulation of the High Speed Civil Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Reaves, Mercedes C.; Horta, Lucas G.

    1999-01-01

    The aircraft industry jointly with NASA is studying enabling technologies for higher speed, longer range aircraft configurations. Higher speeds, higher temperatures, and aerodynamics are driving these newer aircraft configurations towards long, slender, flexible fuselages. Aircraft response during ground operations, although often overlooked, is a concern due to the increased fuselage flexibility. This paper discusses modeling and simulation of the High Speed Civil Transport aircraft during taxiing, take-off, and landing. Finite element models of the airframe for various configurations are used and combined with nonlinear landing gear models to provide a simulation tool to study responses to different ground input conditions. A commercial computer simulation program is used to numerically integrate the equations of motion and to compute estimates of the responses using an existing runway profile. Results show aircraft responses exceeding safe acceptable human response levels.

  11. Transient rotordynamic analysis for the space-shuttle main engine high-pressure oxygen turbopump

    NASA Technical Reports Server (NTRS)

    Childs, D. W.

    1974-01-01

    A simulation study was conducted to examine the transient rotordynamics of the space shuttle main engine (SSME) high pressure oxygen turbopump (HPOTP) with the objective of identifying, anticipating, and avoiding rotordynamic problem areas. Simulations were performed for steady-state operation at emergency power levels and for critical-speed transitions. No problems are indicated in steady-state operation of the HPOTP at emergency power levels, although the results indicated that a rubbing condition will be experienced at the turbine floating-ring seals during the critical-speed transition at shutdown, depending in particular on the rotor deceleration rate and the imbalance distribution. The condition is correctable either by reducing the imbalance at the HPOTP hot-gas turbine wheels, or by a more rapid deceleration of the rotor through its critical speed.

  12. SIMULANT DEVELOPMENT FOR SAVANNAH RIVER SITE HIGH LEVEL WASTE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, M; Russell Eibling, R; David Koopman, D

    2007-09-04

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site vitrifies High Level Waste (HLW) for repository internment. The process consists of three major steps: waste pretreatment, vitrification, and canister decontamination/sealing. The HLW consists of insoluble metal hydroxides (primarily iron, aluminum, magnesium, manganese, and uranium) and soluble sodium salts (carbonate, hydroxide, nitrite, nitrate, and sulfate). The HLW is processed in large batches through DWPF; DWPF has recently completed processing Sludge Batch 3 (SB3) and is currently processing Sludge Batch 4 (SB4). The composition of metal species in SB4 is shown in Table 1 as a function of the ratio of a metal to iron. Simulants remove radioactive species and renormalize the remaining species. Supernate composition is shown in Table 2.

  13. Extended interaction oversized coaxial relativistic klystron amplifier with gigawatt-level output at Ka band

    NASA Astrophysics Data System (ADS)

    Li, Shifeng; Duan, Zhaoyun; Huang, Hua; Liu, Zhenbang; He, Hu; Wang, Fei; Wang, Zhanliang; Gong, Yubin

    2018-04-01

    In this paper, an extended interaction oversized coaxial relativistic klystron amplifier (EIOC-RKA) with gigawatt-level output at Ka band is proposed. We introduce oversized coaxial, multi-gap resonant cavities to increase the power capacity and investigate a non-uniform extended interaction output cavity to improve the electronic efficiency of the EIOC-RKA. We adopt a high-order-mode gap in the input and output cavities to simplify the design and fabrication of the input and output couplers. The EIOC-RKA is designed using particle-in-cell simulation. In the simulations, we use an electron beam with a current of 6 kA and a voltage of 525 kV, focused by a low magnetic flux density of 0.5 T. The simulation results demonstrate that the saturated output power is 1.17 GW, the electronic efficiency is 37.1%, and the saturated gain is 57 dB at 30 GHz. Self-oscillation is suppressed by adopting absorbing materials. The proposed EIOC-RKA has many advantages, such as large power capacity, high electronic efficiency, a low focusing magnetic field, high gain, and a simple structure.

  14. Petascale computation of multi-physics seismic simulations

    NASA Astrophysics Data System (ADS)

    Gabriel, Alice-Agnes; Madden, Elizabeth H.; Ulrich, Thomas; Wollherr, Stephanie; Duru, Kenneth C.

    2017-04-01

    Capturing the observed complexity of earthquake sources in concurrence with seismic wave propagation simulations is an inherently multi-scale, multi-physics problem. In this presentation, we show simulations of earthquake scenarios resolving high-detail dynamic rupture evolution and high frequency ground motion. The simulations combine a multitude of representations of model complexity, such as non-linear fault friction, thermal and fluid effects, heterogeneous fault stress and fault strength initial conditions, fault curvature and roughness, and on- and off-fault non-elastic failure to capture dynamic rupture behavior at the source; and seismic wave attenuation, 3D subsurface structure and bathymetry impacting seismic wave propagation. Performing such scenarios at the necessary spatio-temporal resolution requires highly optimized and massively parallel simulation tools which can efficiently exploit HPC facilities. Our up to multi-PetaFLOP simulations are performed with SeisSol (www.seissol.org), an open-source software package based on an ADER-Discontinuous Galerkin (DG) scheme solving the seismic wave equations in velocity-stress formulation in elastic, viscoelastic, and viscoplastic media with high-order accuracy in time and space. Our flux-based implementation of frictional failure remains free of spurious oscillations. Tetrahedral unstructured meshes allow for complicated model geometry. SeisSol has been optimized on all software levels, including: assembler-level DG kernels which obtain 50% peak performance on some of the largest supercomputers worldwide; an overlapping MPI-OpenMP parallelization shadowing the multiphysics computations; usage of local time stepping; parallel input and output schemes and direct interfaces to community standard data formats. All these factors help to minimise the time-to-solution. The results presented highlight the fact that modern numerical methods and hardware-aware optimization for modern supercomputers are essential to further our understanding of earthquake source physics and complement both physics-based ground motion research and empirical approaches in seismic hazard analysis. Lastly, we will conclude with an outlook on future exascale ADER-DG solvers for seismological applications.

  15. Simulation-based assessment of the impact of fertiliser and herbicide application on freshwater ecosystems at the Three Gorges Reservoir in China.

    PubMed

    Scholz-Starke, Björn; Bo, Li; Holbach, Andreas; Norra, Stefan; Floehr, Tilman; Hollert, Henner; Roß-Nickoll, Martina; Schäffer, Andreas; Ottermanns, Richard

    2018-05-20

    Dams have profound impacts on river ecosystems, amongst them the inundation of land, altered dynamics of the water body, and rising reservoir backwaters influencing tributary or upstream river sections. Along the ecologically outstanding Yangtze River in China, the Three Gorges Reservoir (TGR) is the largest project, covering an area of 1080 km². From the beginning, the dam project drew criticism for increasing environmental risks by submerging former industrial and urban areas. We simulated dynamics of biotic and abiotic components of the TGR ecosystem (trophic guilds of aquatic organisms, hydrodynamics, nutrients), as well as the behaviour of the herbicidal substance propanil and its metabolites 3,4-dichloroaniline (DCA) and 3,3',4,4'-tetrachloroazoxybenzene (TCAB). A modelling environment, provided by the AQUATOX software, was adapted to the specific situation at the Daning River, a tributary reach of the Yangtze River. As the simulated food web contained several interconnected trophic levels, a significant biomagnification of metabolites was demonstrated by our simulation studies. In particular, newly emerging stagnant downstream sections of tributaries exhibited high probabilities of accumulating pesticides from upstream sources. The common problem of algal blooms in the TGR region was addressed by dose-response simulation experiments with essential nutrients. Impacts on the structure and abundance of populations of aquatic organisms were shown. However, even high nutrient loads resulted in only slight changes in the densities of organisms at all trophic levels. Nevertheless, the probabilities for large-scale algal blooms affecting drinking water quality were considered low because of high flow velocities and discharge rates towards the Yangtze River. We see high potential in simulation-based assessments that provide information for risk managers dealing with whole catchment areas. They are put in the position to differentiate the magnitude of impacts of various factors and decide about the most effective remediation measures. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Modeling Materials: Design for Planetary Entry, Electric Aircraft, and Beyond

    NASA Technical Reports Server (NTRS)

    Thompson, Alexander; Lawson, John W.

    2014-01-01

    NASA missions push the limits of what is possible. The development of high-performance materials must keep pace with the agency's demanding, cutting-edge applications. Researchers at NASA's Ames Research Center are performing multiscale computational modeling to accelerate development times and further the design of next-generation aerospace materials. Multiscale modeling combines several computationally intensive techniques ranging from the atomic level to the macroscale, passing output from one level as input to the next level. These methods are applicable to a wide variety of materials systems. For example: (a) Ultra-high-temperature ceramics for hypersonic aircraft-we utilized the full range of multiscale modeling to characterize thermal protection materials for faster, safer air- and spacecraft, (b) Planetary entry heat shields for space vehicles-we computed thermal and mechanical properties of ablative composites by combining several methods, from atomistic simulations to macroscale computations, (c) Advanced batteries for electric aircraft-we performed large-scale molecular dynamics simulations of advanced electrolytes for ultra-high-energy capacity batteries to enable long-distance electric aircraft service; and (d) Shape-memory alloys for high-efficiency aircraft-we used high-fidelity electronic structure calculations to determine phase diagrams in shape-memory transformations. Advances in high-performance computing have been critical to the development of multiscale materials modeling. We used nearly one million processor hours on NASA's Pleiades supercomputer to characterize electrolytes with a fidelity that would be otherwise impossible. For this and other projects, Pleiades enables us to push the physics and accuracy of our calculations to new levels.

  17. Event-driven simulation in SELMON: An overview of EDSE

    NASA Technical Reports Server (NTRS)

    Rouquette, Nicolas F.; Chien, Steve A.; Charest, Leonard, Jr.

    1992-01-01

    EDSE (event-driven simulation engine), a model-based event-driven simulator implemented for SELMON, a tool for sensor selection and anomaly detection in real-time monitoring, is described. The simulator is used in conjunction with a causal model to predict future behavior of the model from observed data. The behavior of the causal model is interpreted as equivalent to the behavior of the physical system being modeled. An overview of the functionality of the simulator and the model-based event-driven simulation paradigm on which it is based is provided. Included are high-level descriptions of the following key properties: event consumption and event creation, iterative simulation, and synchronization and filtering of monitoring data from the physical system. Finally, how EDSE stands with respect to the relevant open issues of discrete-event and model-based simulation is discussed.
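
    The event-driven paradigm summarized above (events are consumed in time order, and consuming an event may create new future events) can be illustrated with a minimal sketch; this is a generic example, not the EDSE implementation, and all names in it are hypothetical.

        import heapq

        class EventDrivenSimulator:
            """Minimal event-driven engine: a time-ordered queue of (time, action) pairs."""

            def __init__(self):
                self._queue = []      # heap of (time, sequence, action)
                self._seq = 0         # tie-breaker so equal-time events stay ordered
                self.now = 0.0

            def schedule(self, delay, action):
                """Event creation: enqueue an action 'delay' time units from the current clock."""
                heapq.heappush(self._queue, (self.now + delay, self._seq, action))
                self._seq += 1

            def run(self, until):
                """Event consumption: process events in time order up to 'until'."""
                while self._queue and self._queue[0][0] <= until:
                    self.now, _, action = heapq.heappop(self._queue)
                    action(self)      # an action may schedule further events

        # Hypothetical example: a sensor reading predicted to cross a threshold repeatedly.
        def threshold_crossing(sim):
            print(f"t={sim.now:.1f}: predicted threshold crossing")
            sim.schedule(5.0, threshold_crossing)   # event creation from event consumption

        sim = EventDrivenSimulator()
        sim.schedule(2.0, threshold_crossing)
        sim.run(until=12.0)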

  18. Cold Gas in High-z Galaxies: The CO Gas Excitation Ladder and the need for the ngVLA

    NASA Astrophysics Data System (ADS)

    Casey, Caitlin M.; Champagne, Jaclyn; Narayanan, Desika; Davé, Romeel; Hung, Chao-Ling; Carilli, Chris; Murphy, Eric Joseph; Decarli, Roberto; Popping, Gergo; Riechers, Dominik A.; Somerville, Rachel; Walter, Fabian

    2018-01-01

    We will present updated results on a community study led to understand the observable molecular gas properties of high-z galaxies. This work uses a series of high-resolution, hydrodynamic, cosmological zoom-in simulations from MUFASA, the Despotic radiative transfer code that uses simultaneous thermal and statistical equilibrium in calculating molecular and atomic level populations, and a CASA simulator which generates mock ngVLA and ALMA observations. Our work reveals a stark contrast in gas characteristics (geometry and kinematics) as measured from low-J transitions of CO to high-J transitions, demonstrating the need for the ngVLA in probing the cold gas reservoir in the highest-redshift galaxies.

  19. What have we learned from the German consortium project STORM aiming at high-resolution climate simulations?

    NASA Astrophysics Data System (ADS)

    von Storch, Jin-Song

    2014-05-01

    The German consortium STORM was built to explore high-resolution climate simulations using the high-performance computer installed at the German Climate Computing Center (DKRZ). One of the primary goals is to quantify the effect of unresolved (and parametrized) processes on climate sensitivity. We use ECHAM6/MPIOM, the coupled atmosphere-ocean model developed at the Max Planck Institute for Meteorology. The resolution is T255L95 for the atmosphere and 1/10 degree with 80 vertical levels for the ocean. We discuss results of stand-alone runs, i.e. the ocean-only simulation driven by the NCEP/NCAR reanalysis and the atmosphere-only AMIP-type simulation. Increasing resolution leads to a redistribution of biases, even though some improvements, both in the atmosphere and in the ocean, can clearly be attributed to the increase in resolution. We also present new insights on ocean meso-scale eddies, in particular their effects on the ocean's energetics. Finally, we discuss the status and problems of the coupled high-resolution runs.

  20. Local Structures of High-Entropy Alloys (HEAs) on Atomic Scales: An Overview

    DOE PAGES

    Diao, Haoyan; Santodonato, Louis J.; Tang, Zhi; ...

    2015-08-29

    The high-entropy alloys (HEAs), containing several elements mixed in equimolar or near-equimolar ratios, have shown exceptional engineering properties. Local structures at the atomic level are essential to understand the mechanical behaviors and related mechanisms. In this paper, the local structure and stress at the atomic level are reviewed using the pair-distribution function (PDF) of neutron-diffraction data, ab initio molecular dynamics (AIMD) simulations, and atom probe tomography (APT).

  1. Stayin Alive: What are Persistent Synthetic Environments

    DTIC Science & Technology

    2014-10-01

    simulated entities are discussed in the context of their persistence and requirements. 1. Background A common high-level requirement that shows up...perspective this high-level requirement is problematic. As with M&S terminology of 'simulation', 'model' or 'terrain', the word...under consideration be used. Any sort of complete treatment of PSE's is clearly beyond the scope of this paper, however, to illustrate the technique

  2. Numerical simulation and characterization of trapping noise in InGaP-GaAs heterojunctions devices at high injection

    NASA Astrophysics Data System (ADS)

    Nallatamby, Jean-Christophe; Abdelhadi, Khaled; Jacquet, Jean-Claude; Prigent, Michel; Floriot, Didier; Delage, Sylvain; Obregon, Juan

    2013-03-01

    Commercially available simulators present considerable advantages in performing accurate DC, AC and transient simulations of semiconductor devices, including many fundamental and parasitic effects which are not generally taken into account in in-house simulators. Nevertheless, while the public-domain TCAD simulators we have tested give accurate results for the simulation of diffusion noise, none of them handles trap-assisted generation-recombination (GR) noise accurately. In order to overcome this problem, we propose a robust solution to accurately simulate GR noise due to traps. It is based on numerical processing of the output data of one of the simulators available in the public domain, namely SENTAURUS (from Synopsys). We have linked together, through a dedicated Data Access Component (DAC), the deterministic output data available from SENTAURUS and a powerful, customizable post-processing tool developed on the mathematical SCILAB software package. Thus, robust simulations of GR noise in semiconductor devices can be performed by using GR Langevin sources associated with the scalar Green function responses of the device. Our method takes advantage of the accuracy of the deterministic simulations of electronic devices obtained with SENTAURUS. A comparison between 2-D simulations and measurements of low-frequency noise on InGaP-GaAs heterojunctions, at low as well as high injection levels, demonstrates the validity of the proposed simulation tool.
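
    For orientation only: for a single trap level with occupancy f_t and time constant τ, trap-assisted GR number fluctuations follow the textbook Lorentzian spectrum S_N(f) = 4 N_t f_t (1 - f_t) τ / (1 + (2πfτ)²). The sketch below evaluates this standard form; it is not the SENTAURUS/SCILAB Green-function processing chain described above, and the parameter values are placeholders.

        import numpy as np

        def gr_noise_spectrum(freq, n_traps, occupancy, tau):
            """Lorentzian number-fluctuation spectrum of a single trap level.

            freq      : array of frequencies [Hz]
            n_traps   : number of traps contributing
            occupancy : steady-state trap occupation probability f_t (0..1)
            tau       : trap time constant [s]
            """
            return 4.0 * n_traps * occupancy * (1.0 - occupancy) * tau / (
                1.0 + (2.0 * np.pi * freq * tau) ** 2)

        # Placeholder parameters purely for illustration
        f = np.logspace(0, 8, 200)                       # 1 Hz .. 100 MHz
        s = gr_noise_spectrum(f, n_traps=1e4, occupancy=0.5, tau=1e-6)
        print(f"plateau level: {s[0]:.3e}, corner frequency ~ {1/(2*np.pi*1e-6):.3e} Hz")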

  3. Generation of a large volume of clinically relevant nanometre-sized ultra-high-molecular-weight polyethylene wear particles for cell culture studies

    PubMed Central

    Ingham, Eileen; Fisher, John; Tipper, Joanne L

    2014-01-01

    It has recently been shown that the wear of ultra-high-molecular-weight polyethylene in hip and knee prostheses leads to the generation of nanometre-sized particles, in addition to micron-sized particles. The biological activity of nanometre-sized ultra-high-molecular-weight polyethylene wear particles has not, however, previously been studied due to difficulties in generating sufficient volumes of nanometre-sized ultra-high-molecular-weight polyethylene wear particles suitable for cell culture studies. In this study, wear simulation methods were investigated to generate a large volume of endotoxin-free clinically relevant nanometre-sized ultra-high-molecular-weight polyethylene wear particles. Both single-station and six-station multidirectional pin-on-plate wear simulators were used to generate ultra-high-molecular-weight polyethylene wear particles under sterile and non-sterile conditions. Microbial contamination and endotoxin levels in the lubricants were determined. The results indicated that microbial contamination was absent and endotoxin levels were low and within acceptable limits for the pharmaceutical industry, when a six-station pin-on-plate wear simulator was used to generate ultra-high-molecular-weight polyethylene wear particles in a non-sterile environment. Different pore-sized polycarbonate filters were investigated to isolate nanometre-sized ultra-high-molecular-weight polyethylene wear particles from the wear test lubricants. The use of the filter sequence of 10, 1, 0.1, 0.1 and 0.015 µm pore sizes allowed successful isolation of ultra-high-molecular-weight polyethylene wear particles with a size range of < 100 nm, which was suitable for cell culture studies. PMID:24658586

  4. Generation of a large volume of clinically relevant nanometre-sized ultra-high-molecular-weight polyethylene wear particles for cell culture studies.

    PubMed

    Liu, Aiqin; Ingham, Eileen; Fisher, John; Tipper, Joanne L

    2014-04-01

    It has recently been shown that the wear of ultra-high-molecular-weight polyethylene in hip and knee prostheses leads to the generation of nanometre-sized particles, in addition to micron-sized particles. The biological activity of nanometre-sized ultra-high-molecular-weight polyethylene wear particles has not, however, previously been studied due to difficulties in generating sufficient volumes of nanometre-sized ultra-high-molecular-weight polyethylene wear particles suitable for cell culture studies. In this study, wear simulation methods were investigated to generate a large volume of endotoxin-free clinically relevant nanometre-sized ultra-high-molecular-weight polyethylene wear particles. Both single-station and six-station multidirectional pin-on-plate wear simulators were used to generate ultra-high-molecular-weight polyethylene wear particles under sterile and non-sterile conditions. Microbial contamination and endotoxin levels in the lubricants were determined. The results indicated that microbial contamination was absent and endotoxin levels were low and within acceptable limits for the pharmaceutical industry, when a six-station pin-on-plate wear simulator was used to generate ultra-high-molecular-weight polyethylene wear particles in a non-sterile environment. Different pore-sized polycarbonate filters were investigated to isolate nanometre-sized ultra-high-molecular-weight polyethylene wear particles from the wear test lubricants. The use of the filter sequence of 10, 1, 0.1, 0.1 and 0.015 µm pore sizes allowed successful isolation of ultra-high-molecular-weight polyethylene wear particles with a size range of < 100 nm, which was suitable for cell culture studies.

  5. Turbulence modeling and combustion simulation in porous media under high Peclet number

    NASA Astrophysics Data System (ADS)

    Moiseev, Andrey A.; Savin, Andrey V.

    2018-05-01

    Turbulence modelling of flow and combustion in porous media is still not fully understood. Undoubtedly, conventional turbulence models must work well at high Peclet numbers when the shape of the porous channels is resolved in detail. Nevertheless, true turbulent mixing takes place at micro-scales only, while dispersion mixing acts at macro-scales almost independently of the true turbulence. The dispersion mechanism is characterized by a definite space scale (the scale of the porous structure) and a definite velocity scale (the filtration velocity). The porous structure is usually stochastic, which allows an analogy to be drawn between space-time-stochastic true turbulence and the dispersion flow, which is stochastic in space only, when the porous flow is simulated at the macro-scale level. This analogy also allows well-known turbulent combustion models to be applied to simulations of porous combustion at high Peclet numbers.

  6. An equation-of-state-meter of quantum chromodynamics transition from deep learning.

    PubMed

    Pang, Long-Gang; Zhou, Kai; Su, Nan; Petersen, Hannah; Stöcker, Horst; Wang, Xin-Nian

    2018-01-15

    A primordial state of matter consisting of free quarks and gluons that existed in the early universe a few microseconds after the Big Bang is also expected to form in high-energy heavy-ion collisions. Determining the equation of state (EoS) of such primordial matter is the ultimate goal of high-energy heavy-ion experiments. Here we use supervised learning with a deep convolutional neural network to identify the EoS employed in relativistic hydrodynamic simulations of heavy-ion collisions. High-level correlations of particle spectra in transverse momentum and azimuthal angle learned by the network act as an effective EoS-meter in deciphering the nature of the phase transition in quantum chromodynamics. Such an EoS-meter is model independent and insensitive to other simulation inputs, including the initial conditions for hydrodynamic simulations.
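
    As a rough, purely illustrative sketch of this kind of classifier (not the authors' network, data, or training setup), a small convolutional model mapping a two-dimensional particle spectrum ρ(p_T, φ) onto two EoS classes could look as follows in Keras; the input binning (15 x 48), layer sizes, and class labels are assumptions.

        import tensorflow as tf

        # Hypothetical input: spectra binned into 15 p_T bins x 48 azimuthal bins, 1 channel.
        model = tf.keras.Sequential([
            tf.keras.Input(shape=(15, 48, 1)),
            tf.keras.layers.Conv2D(16, (3, 3), activation="relu", padding="same"),
            tf.keras.layers.MaxPooling2D((2, 2)),
            tf.keras.layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(2, activation="softmax"),  # e.g. crossover vs first-order EoS
        ])
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        model.summary()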

  7. Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running

    PubMed Central

    Zhu, Hao; Sun, Yan; Rajagopal, Gunaretnam; Mondry, Adrian; Dhar, Pawan

    2004-01-01

    Background Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level. Methods We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG. Results We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given. Conclusions Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods described. PMID:15339335
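
    As a toy illustration of the cellular-automaton idea (not the authors' quantitative whole-heart model), a Greenberg-Hastings-style excitable-medium automaton on a 2-D grid can be written as follows: cells cycle through resting, excited, and refractory states, and excitation spreads to resting neighbours.

        import numpy as np

        REST, EXCITED = 0, 1
        REFRACTORY_STEPS = 3          # states 2..1+REFRACTORY_STEPS are refractory

        def step(grid):
            """One synchronous update of a Greenberg-Hastings excitable medium."""
            new = grid.copy()
            # excited and refractory cells advance through the cycle, then return to rest
            new[grid >= EXCITED] += 1
            new[new > EXCITED + REFRACTORY_STEPS] = REST
            # resting cells become excited if any 4-neighbour is excited (periodic boundaries)
            excited = (grid == EXCITED)
            neighbour_excited = (np.roll(excited, 1, 0) | np.roll(excited, -1, 0) |
                                 np.roll(excited, 1, 1) | np.roll(excited, -1, 1))
            new[(grid == REST) & neighbour_excited] = EXCITED
            return new

        grid = np.zeros((50, 50), dtype=int)
        grid[25, 25] = EXCITED                      # single ectopic stimulus
        for _ in range(20):
            grid = step(grid)
        print("excited cells after 20 steps:", int((grid == EXCITED).sum()))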

  8. Simulating Effects of High Angle of Attack on Turbofan Engine Performance

    NASA Technical Reports Server (NTRS)

    Liu, Yuan; Claus, Russell W.; Litt, Jonathan S.; Guo, Ten-Huei

    2013-01-01

    A method of investigating the effects of high angle of attack (AOA) flight on turbofan engine performance is presented. The methodology involves combining a suite of diverse simulation tools. Three-dimensional, steady-state computational fluid dynamics (CFD) software is used to model the change in performance of a commercial aircraft-type inlet and fan geometry due to various levels of AOA. Parallel compressor theory is then applied to assimilate the CFD data with a zero-dimensional, nonlinear, dynamic turbofan engine model. The combined model shows that high AOA operation degrades fan performance and, thus, negatively impacts compressor stability margins and engine thrust. In addition, the engine response to high AOA conditions is shown to be highly dependent upon the type of control system employed.

  9. Effect of train vibration on settlement of soil: A numerical analysis

    NASA Astrophysics Data System (ADS)

    Tiong, Kah-Yong; Ling, Felix Ngee-Leh; Talib, Zaihasra Abu

    2017-10-01

    The rapid development of transit systems has made ground-borne vibrations induced by trains, and their influence on ground settlement, a growing concern. The purpose of this study is to investigate soil settlement caused by train vibration. To facilitate this study, the dynamic response of the soil was simulated with the commercial finite element package PLAXIS 2D, modelling the track-subgrade system together with the dynamic train load under three different conditions. The simulations showed that soil deformation increased with rising water level. This happens because a higher water level not only induces greater excess pore water pressure but also reduces the stiffness of the soil. Furthermore, the simulations indicated that soil settlement was reduced by placing a material of high stiffness between the subgrade and the ballast layer, since a stiff material is able to dissipate energy efficiently owing to its high bearing capacity, thus protecting the subgrade from deterioration. The simulations also showed that the soil dynamic response increased with train speed, and a noticeable amplification in soil deformation occurred as the train speed approached the Rayleigh wave velocity of the track-subgrade system. This is because the dynamic train load depends on both the self-weight of the train and the dynamic component due to inertial effects associated with the train speed. Thus, keeping train speeds below the critical velocity of the track-subgrade system helps ensure the safety of train operation, as it prevents track-ground resonance and dramatic ground deformation.
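
    Because the critical speed discussed above is tied to the Rayleigh wave velocity of the track-subgrade system, a rough estimate of that velocity from basic soil properties can be illustrative. The sketch below uses the commonly cited Bergmann-Viktorov approximation c_R ≈ c_S (0.862 + 1.14ν)/(1 + ν); the soil parameters are assumptions for illustration, not values from this study.

        import math

        def rayleigh_wave_speed(shear_modulus, density, poisson_ratio):
            """Approximate Rayleigh wave speed from soil properties.

            Uses c_S = sqrt(G / rho) and the Bergmann-Viktorov approximation
            c_R ~= c_S * (0.862 + 1.14*nu) / (1 + nu).
            """
            c_s = math.sqrt(shear_modulus / density)
            return c_s * (0.862 + 1.14 * poisson_ratio) / (1.0 + poisson_ratio)

        # Illustrative soft-soil parameters (assumptions, not from the study)
        G = 20e6        # shear modulus [Pa]
        rho = 1800.0    # bulk density [kg/m^3]
        nu = 0.33       # Poisson's ratio

        c_r = rayleigh_wave_speed(G, rho, nu)
        print(f"Rayleigh wave speed ~ {c_r:.0f} m/s ({c_r * 3.6:.0f} km/h)")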

  10. A new low voltage level-shifted FVF current mirror with enhanced bandwidth and output resistance

    NASA Astrophysics Data System (ADS)

    Aggarwal, Bhawna; Gupta, Maneesha; Gupta, Anil Kumar; Sangal, Ankur

    2016-10-01

    This paper proposes a new high-performance level-shifted flipped voltage follower (LSFVF) based low-voltage current mirror (CM). The proposed CM utilises the low-supply voltage and low-input resistance characteristics of a flipped voltage follower (FVF) CM. In the proposed CM, level-shifting configuration is used to obtain a wide operating current range and resistive compensation technique is employed to increase the operating bandwidth. The peaking in frequency response is reduced by using an additional large MOSFET. Moreover, a very high output resistance (in GΩ range) along with low-current transfer error is achieved through super-cascode configuration for a wide current range (0-440 µA). Small signal analysis is carried out to show the improvements achieved at each step. The proposed CM is simulated by Mentor Graphics Eldospice in TSMC 0.18 µm CMOS, BSIM3 and Level 53 technology. In the proposed CM, a bandwidth of 6.1799 GHz, 1% settling time of 0.719 ns, input and output resistances of 21.43 Ω and 1.14 GΩ, respectively, are obtained with a single supply voltage of 1 V. The layout of the proposed CM has been designed and post-layout simulation results have been shown. The post-layout simulation results for Monte Carlo and temperature analysis have also been included to show the reliability of the CM against the variations in process parameters and temperature changes.

  11. Clouds and the extratropical circulation response to global warming in a hierarchy of global atmosphere models

    NASA Astrophysics Data System (ADS)

    Voigt, A.

    2017-12-01

    Climate models project that global warming will lead to substantial changes in extratropical jet streams. Yet, many quantitative aspects of warming-induced jet stream changes remain uncertain, and recent work has indicated an important role of clouds and their radiative interactions. Here, I will investigate how cloud-radiative changes impact the zonal-mean extratropical circulation response under global warming using a hierarchy of global atmosphere models. I will first focus on aquaplanet setups with prescribed sea-surface temperatures (SSTs), which reproduce the model spread found in realistic simulations with interactive SSTs. Simulations with two CMIP5 models MPI-ESM and IPSL-CM5A and prescribed clouds show that half of the circulation response can be attributed to cloud changes. The rise of tropical high-level clouds and the upward and poleward movement of midlatitude high-level clouds lead to poleward jet shifts. High-latitude low-level cloud changes shift the jet poleward in one model but not in the other. The impact of clouds on the jet operates via the atmospheric radiative forcing that is created by the cloud changes and is qualitatively reproduced in a dry Held-Suarez model, although the latter is too sensitive because of its simplified treatment of diabatic processes. I will then show that the aquaplanet results also hold when the models are used in a realistic setup that includes continents and seasonality. I will further juxtapose these prescribed-SST simulations with interactive-SST simulations and show that atmospheric and surface cloud-radiative interactions impact the poleward jet shifts in about equal measure. Finally, I will discuss the cloud impact on regional and seasonal circulation changes.

  12. Castable thermal insulation for use as heat shields

    NASA Technical Reports Server (NTRS)

    Mountvala, A. J.; Nakamura, H. H.; Rechter, H. L.

    1974-01-01

    Structural members supporting the afterburners of high thrust rocket engines are subjected to extreme heating, along with severe vibration and high acceleration levels during early lift-off. Chemically-bonded, castable, zircon composite foams were developed and successfully tested to meet specific, laboratory simulated lift-off conditions.

  13. Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling

    NASA Astrophysics Data System (ADS)

    Schum, William K.; Doolittle, Christina M.; Boyarko, George A.

    2006-05-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics and engineering-level modeling to mission and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DOD standards such as Distributed Interactive Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.

  14. Research on simulation system with the wide range and high-precision laser energy characteristics

    NASA Astrophysics Data System (ADS)

    Dong, Ke-yan; Lou, Yan; He, Jing-yi; Tong, Shou-feng; Jiang, Hui-lin

    2012-10-01

    The Hardware-in-the-loop (HWIL) simulation test is one of the important parts of the development and performance testing of semi-active laser-guided weapons. In order to obtain accurate results, a target environment with a high confidence level should be provided to the seeker during the HWIL simulation test of semi-active laser-guided weapons, and one of the important simulation parameters is the laser energy characteristic. In this paper, based on the guidance principles of semi-active laser-guided weapons, an important confidence parameter of the simulation, which affects the energy characteristics in HWIL performance tests, is analyzed. Based on the principle that the seeker should receive the same energy in the HWIL simulation as in practical application, an HWIL energy-characteristics simulation system with a crystal absorption structure was designed. On this basis, the optimal design of the optical system was also analyzed. The measured results show that the dynamic attenuation range of the system energy is greater than 50 dB, the dynamic attenuation stability is less than 5%, and the maximum energy changing rate driven by the servo motor is greater than 20 dB/s.

  15. ProjectQ Software Framework

    NASA Astrophysics Data System (ADS)

    Steiger, Damian S.; Haener, Thomas; Troyer, Matthias

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. A high-level quantum programming language and optimizing compilers are essential components to achieve scalable quantum computation. In order to address this, we introduce the ProjectQ software framework - an open source effort to support both theorists and experimentalists by providing intuitive tools to implement and run quantum algorithms. Here, we present our ProjectQ quantum compiler, which compiles a quantum algorithm from our high-level Python-embedded language down to low-level quantum gates available on the target system. We demonstrate how this compiler can be used to control actual hardware and to run high-performance simulations.
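
    A minimal example of the high-level, Python-embedded syntax is the standard ProjectQ "hello world", which allocates a qubit, applies a Hadamard gate, and measures it on the default simulator backend:

        from projectq import MainEngine
        from projectq.ops import H, Measure

        eng = MainEngine()              # default compiler engine with simulator backend
        qubit = eng.allocate_qubit()    # allocate a single qubit

        H | qubit                       # apply a Hadamard gate (gate | qubit syntax)
        Measure | qubit                 # measure the qubit

        eng.flush()                     # push all gates through the compiler/backend
        print("Measured:", int(qubit))  # 0 or 1 with equal probability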

  16. Creation and Delphi-method refinement of pediatric disaster triage simulations.

    PubMed

    Cicero, Mark X; Brown, Linda; Overly, Frank; Yarzebski, Jorge; Meckler, Garth; Fuchs, Susan; Tomassoni, Anthony; Aghababian, Richard; Chung, Sarita; Garrett, Andrew; Fagbuyi, Daniel; Adelgais, Kathleen; Goldman, Ran; Parker, James; Auerbach, Marc; Riera, Antonio; Cone, David; Baum, Carl R

    2014-01-01

    There is a need for rigorously designed pediatric disaster triage (PDT) training simulations for paramedics. First, we sought to design three multiple patient incidents for EMS provider training simulations. Our second objective was to determine the appropriate interventions and triage level for each victim in each of the simulations and develop evaluation instruments for each simulation. The final objective was to ensure that each simulation and evaluation tool was free of bias toward any specific PDT strategy. We created mixed-methods disaster simulation scenarios with pediatric victims: a school shooting, a school bus crash, and a multiple-victim house fire. Standardized patients, high-fidelity manikins, and low-fidelity manikins were used to portray the victims. Each simulation had similar acuity of injuries and 10 victims. Examples include children with special health-care needs, gunshot wounds, and smoke inhalation. Checklist-based evaluation tools and behaviorally anchored global assessments of function were created for each simulation. Eight physicians and paramedics from areas with differing PDT strategies were recruited as Subject Matter Experts (SMEs) for a modified Delphi iterative critique of the simulations and evaluation tools. The modified Delphi was managed with an online survey tool. The SMEs provided an expected triage category for each patient. The target for modified Delphi consensus was ≥85%. Using Likert scales and free text, the SMEs assessed the validity of the simulations, including instances of bias toward a specific PDT strategy, clarity of learning objectives, and the correlation of the evaluation tools to the learning objectives and scenarios. After two rounds of the modified Delphi, consensus for expected triage level was >85% for 28 of 30 victims, with the remaining two achieving >85% consensus after three Delphi iterations. To achieve consensus, we amended 11 instances of bias toward a specific PDT strategy and corrected 10 instances of noncorrelation between evaluations and simulation. The modified Delphi process, used to derive novel PDT simulation and evaluation tools, yielded a high degree of consensus among the SMEs, and eliminated biases toward specific PDT strategies in the evaluations. The simulations and evaluation tools may now be tested for reliability and validity as part of a prehospital PDT curriculum.

  17. 1D-3D hybrid modeling-from multi-compartment models to full resolution models in space and time.

    PubMed

    Grein, Stephan; Stepniewski, Martin; Reiter, Sebastian; Knodel, Markus M; Queisser, Gillian

    2014-01-01

    Investigation of cellular and network dynamics in the brain by means of modeling and simulation has evolved into a highly interdisciplinary field that uses sophisticated modeling and simulation approaches to understand distinct areas of brain function. Depending on the underlying complexity, these models vary in their level of detail in order to cope with the attached computational cost. Hence, for large network simulations, single neurons are typically reduced to time-dependent signal processors, dismissing the spatial aspect of each cell. For single cells or networks with relatively small numbers of neurons, general-purpose simulators allow for space- and time-dependent simulations of electrical signal processing, based on cable equation theory. An emerging field in Computational Neuroscience encompasses a new level of detail by incorporating the full three-dimensional morphology of cells and organelles into three-dimensional, space- and time-dependent simulations. While every approach has its advantages and limitations, such as computational cost, integrated and methods-spanning simulation approaches, depending on the network size, could establish new ways to investigate the brain. In this paper we present a hybrid simulation approach that makes use of reduced 1D models (using, e.g., the NEURON simulator), which couple to fully resolved models for simulating cellular and sub-cellular dynamics, including the detailed three-dimensional morphology of neurons and organelles. In order to couple 1D and 3D simulations, we present a geometry, membrane potential and intracellular concentration mapping framework, with which graph-based morphologies, e.g., in the swc or hoc format, are mapped to full surface and volume representations of the neuron, and computational data from 1D simulations can be used as boundary conditions for full 3D simulations and vice versa. Thus, established models and data based on general-purpose 1D simulators can be directly coupled to the emerging field of fully resolved, highly detailed 3D modeling approaches. We present the developed general framework for 1D/3D hybrid modeling and apply it to investigate electrically active neurons and their intracellular spatio-temporal calcium dynamics.
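
    Graph-based morphology files such as SWC have a simple columnar layout (sample id, type, x, y, z, radius, parent id), which is what makes the mapping to surface and volume representations possible. The sketch below only illustrates reading that format into a tree; it is not the authors' mapping framework, and the file name in the usage comment is hypothetical.

        from collections import namedtuple

        SWCPoint = namedtuple("SWCPoint", "id type x y z radius parent")

        def read_swc(path):
            """Read an SWC morphology file into a dict of id -> SWCPoint."""
            points = {}
            with open(path) as fh:
                for line in fh:
                    line = line.strip()
                    if not line or line.startswith("#"):   # skip comments and blank lines
                        continue
                    i, t, x, y, z, r, parent = line.split()[:7]
                    points[int(i)] = SWCPoint(int(i), int(t), float(x), float(y),
                                              float(z), float(r), int(parent))
            return points

        def children(points):
            """Build the tree structure: parent id -> list of child ids (-1 marks the root)."""
            tree = {}
            for p in points.values():
                tree.setdefault(p.parent, []).append(p.id)
            return tree

        # Usage (hypothetical file name):
        # morphology = read_swc("neuron.swc")
        # print(len(morphology), "sample points, root children:", children(morphology).get(-1))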

  18. SWOT Oceanography and Hydrology Data Product Simulators

    NASA Technical Reports Server (NTRS)

    Peral, Eva; Rodriguez, Ernesto; Fernandez, Daniel Esteban; Johnson, Michael P.; Blumstein, Denis

    2013-01-01

    The proposed Surface Water and Ocean Topography (SWOT) mission would demonstrate a new measurement technique using radar interferometry to obtain wide-swath measurements of water elevation at high resolution over ocean and land, addressing the needs of both the hydrology and oceanography science communities. To accurately evaluate the performance of the proposed SWOT mission, we have developed several data product simulators at different levels of fidelity and complexity.

  19. Massive quantum regions for simulations on bio-nanomaterials: synthetic ferritin nanocages.

    PubMed

    Torras, Juan; Alemán, Carlos

    2018-02-22

    QM/MM molecular dynamics simulations on the 4His-ΔC* protein cage have been performed using multiple active zones (up to 86 quantum regions). The regulation and nanocage stability exerted by the divalent transition metal ions in the monomer-to-cage conversion have been understood by comparing high-level quantum trajectories obtained using Cu²⁺ and Ni²⁺ coordination ions.

  20. Fluid-structure interaction simulation of floating structures interacting with complex, large-scale ocean waves and atmospheric turbulence with application to floating offshore wind turbines

    NASA Astrophysics Data System (ADS)

    Calderer, Antoni; Guo, Xin; Shen, Lian; Sotiropoulos, Fotis

    2018-02-01

    We develop a numerical method for simulating coupled interactions of complex floating structures with large-scale ocean waves and atmospheric turbulence. We employ an efficient large-scale model to develop offshore wind and wave environmental conditions, which are then incorporated into a high resolution two-phase flow solver with fluid-structure interaction (FSI). The large-scale wind-wave interaction model is based on a two-fluid dynamically-coupled approach that employs a high-order spectral method for simulating the water motion and a viscous solver with undulatory boundaries for the air motion. The two-phase flow FSI solver is based on the level set method and is capable of simulating the coupled dynamic interaction of arbitrarily complex bodies with airflow and waves. The large-scale wave field solver is coupled with the near-field FSI solver with a one-way coupling approach by feeding into the latter waves via a pressure-forcing method combined with the level set method. We validate the model for both simple wave trains and three-dimensional directional waves and compare the results with experimental and theoretical solutions. Finally, we demonstrate the capabilities of the new computational framework by carrying out large-eddy simulation of a floating offshore wind turbine interacting with realistic ocean wind and waves.

  1. Use of Baby Isao Simulator and Standardized Parents in Hearing Screening and Parent Counseling Education.

    PubMed

    Alanazi, Ahmad A; Nicholson, Nannette; Atcherson, Samuel R; Franklin, Clifford; Anders, Michael; Nagaraj, Naveen; Franklin, Jennifer; Highley, Patricia

    2016-09-01

    The primary purpose of this study was to test the effect of the combined use of trained standardized parents and a baby simulator on students' hearing screening and parental counseling knowledge and skills. A one-group pretest-posttest quasi-experimental study design was used to assess self-ratings of confidence in knowledge and skills and satisfaction of the educational experience with standardized parents and a baby simulator. The mean age of the 14 audiology students participating in this study was 24.79 years (SD = 1.58). Participants completed a pre- and postevent questionnaire in which they rated their level of confidence for specific knowledge and skills. Six students (2 students in each scenario) volunteered to participate in the infant hearing screening and counseling scenarios, whereas others participated as observers. All participants participated in the briefing and debriefing sessions immediately before and after each of 3 scenarios. After the last scenario, participants were asked to complete a satisfaction survey of their learning experience using simulation and standardized parents. Overall, the pre- and post-simulation event questionnaire revealed a significant improvement in the participants' self-rated confidence levels regarding knowledge and skills. The mean difference between pre- and postevent scores was 0.52 (p < .01). The mean satisfaction level was 4.71 (range = 3.91-5.00; SD = 0.30) based on a Likert scale, where 1 = not satisfied and 5 = very satisfied. The results of this novel educational activity demonstrate the value of using infant hearing screening and parental counseling simulation sessions to enhance student learning. In addition, this study demonstrates the use of simulation and standardized parents as an important pedagogical tool for audiology students. Students experienced a high level of satisfaction with the learning experience.

  2. EEG and ECG changes during simulator operation reflect mental workload and vigilance.

    PubMed

    Dussault, Caroline; Jouanin, Jean-Claude; Philippe, Matthieu; Guezennec, Charles-Yannick

    2005-04-01

    Performing mission tasks in a simulator influences many neurophysiological measures. Quantitative assessments of electroencephalography (EEG) and electrocardiography (ECG) have made it possible to develop indicators of mental workload and to estimate relative physiological responses to cognitive requirements. To evaluate the effects of mental workload without actual physical risk, we studied the cortical and cardiovascular changes that occurred during simulated flight. There were 12 pilots (8 novices and 4 experts) who simulated a flight composed of 10 sequences that induced several different mental workload levels. EEG was recorded at 12 electrode sites during rest and flight sequences; ECG activity was also recorded. Subjective tests were used to evaluate anxiety and vigilance levels. Theta band activity was lower during the two simulated flight rest sequences than during visual and instrument flight sequences at central, parietal, and occipital sites (p < 0.05). On the other hand, rest sequences resulted in higher beta (at the C4 site; p < 0.05) and gamma (at the central, parietal, and occipital sites; p < 0.05) power than active segments. The mean heart rate (HR) was not significantly different during any simulated flight sequence, but HR was lower for expert subjects than for novices. The subjective tests revealed no significant anxiety and high values for vigilance levels before and during flight. The different flight sequences performed on the simulator resulted in electrophysiological changes that expressed variations in mental workload. These results corroborate those found during study of real flights, particularly during sequences requiring the heaviest mental workload.

  3. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers and enables scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  4. Cascaded Converters for Integration and Management of Grid Level Energy Storage Systems

    NASA Astrophysics Data System (ADS)

    Alaas, Zuhair

    This research work proposes two cascaded multilevel inverter structures for battery energy storage systems (BESS). The gating and switching control of the switching devices in both inverter topologies is done using a phase-shifted PWM scheme. The first proposed isolated multilevel inverter is made up of three-phase six-switch inverter blocks with a reduced number of power components compared with a traditional isolated cascaded H-bridge (CHB). The suggested isolated converter has only one battery string for the three-phase system and can be used for high-voltage, high-power applications such as grid-connected BESS and alternative energy systems. The isolated inverter enables simple dq-frame-based control and eliminates the issue of single-phase pulsating power, which can cause detrimental impacts on certain dc sources. Simulation studies have been carried out to compare the proposed isolated multilevel inverter with an H-bridge cascaded transformer inverter, and the results verified the performance of the isolated inverter. The second proposed topology is a Hierarchical Cascaded Multilevel Converter (HCMC) with phase-to-phase state-of-charge (SOC) balancing capability, also intended for high-voltage, high-power battery energy storage systems. The HCMC has a hybrid structure of half-bridge converters and H-bridge inverters, and the voltage can be hierarchically cascaded to reach the desired value at the half-bridge and H-bridge levels. Uniform SOC battery management is achieved by controlling the half-bridge converters that are connected to individual battery modules/cells. Simulation studies and experiments have been carried out on a large-scale battery system under different operating conditions to verify the effectiveness of the proposed inverters. Moreover, this dissertation presents a new three-phase SOC equalizing circuit, called the six-switch energy-level balancing circuit (SSBC), which can be used to realize uniform SOC operation for full utilization of the battery capacity in the proposed HCMC or any cascaded multilevel inverter while keeping balanced three-phase operation. A sinusoidal PWM modulation technique is used to control power transfer between phases. Simulations have been carried out to verify the performance of the proposed SSBC circuit for uniform three-phase SOC balancing.
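
    The phase-shifted PWM scheme mentioned above compares a common sinusoidal reference against N triangular carriers, one per cascaded cell, each offset by 1/N of the carrier period. A minimal sketch of generating such gating signals is shown below; the frequencies, modulation index, and cell count are illustrative assumptions.

        import numpy as np

        def phase_shifted_pwm(t, f_ref, f_carrier, n_cells, modulation_index=0.9):
            """Return an (n_cells, len(t)) array of gating signals (0/1), one row per cell.

            Each cell compares the same sinusoidal reference against a triangular
            carrier shifted by 1/(n_cells * f_carrier) relative to its neighbour.
            """
            reference = modulation_index * np.sin(2 * np.pi * f_ref * t)
            gates = np.empty((n_cells, t.size))
            for k in range(n_cells):
                shift = k / (n_cells * f_carrier)                    # carrier phase shift
                phase = ((t - shift) * f_carrier) % 1.0
                carrier = 4 * np.abs(phase - 0.5) - 1.0              # triangle wave in [-1, 1]
                gates[k] = (reference > carrier).astype(float)
            return gates

        # Illustrative numbers: 50 Hz reference, 2 kHz carriers, 3 cascaded cells
        t = np.linspace(0.0, 0.02, 20000, endpoint=False)            # one fundamental period
        g = phase_shifted_pwm(t, f_ref=50.0, f_carrier=2000.0, n_cells=3)
        print("duty cycles per cell:", g.mean(axis=1).round(3))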

  5. High Severity Wildfire Effect On Rainfall Infiltration And Runoff: A Cellular Automata Based Simulation

    NASA Astrophysics Data System (ADS)

    Vergara-Blanco, J. E.; Leboeuf-Pasquier, J.; Benavides-Solorio, J. D. D.

    2017-12-01

    Simulation software that reproduces rainfall infiltration and runoff for a storm event in a particular forest area is presented. A cellular automaton is utilized to represent space and time. On the time scale, the simulation is composed of a sequence of discrete time steps. On the space scale, the simulation is composed of forest surface cells. The software takes into consideration rain intensity and duration, the evolution of the soil absorption capacity of each forest cell, and the surface angle of inclination. The software is developed in the C++ programming language. The simulation is executed on a 100 ha area within La Primavera Forest in Jalisco, Mexico. Real soil textures for unburned terrain and terrain affected by high-severity wildfire are employed to recreate the specific infiltration profiles. Historical rainfall data of a 92 minute event are used. The Horton infiltration equation is utilized for the infiltration capacity calculation. A Digital Elevation Model (DEM) is employed to reproduce the surface topography. The DEM is displayed with a 3D mesh graph where individual surface cells can be observed. The plot colouring renders the development of water content at the cell level throughout the storm event. The simulation shows that the cumulative infiltration and runoff that take place at the surface cell level depend on the specific storm intensity, fluctuation and length, the overall terrain topography, the cell slope, and the soil texture. Rainfall cumulative infiltration for unburned and high-severity wildfire terrain are compared: unburned terrain exhibits a significantly higher amount of rainfall infiltration. It is concluded that a cellular automaton can be utilized with a C++ program to reproduce rainfall infiltration and runoff under diverse soil texture, topographic and rainfall conditions in a forest setting. This simulation is geared towards an optimization program to pinpoint the locations of a series of forest land remediation efforts to support reforestation or to minimize runoff.
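
    The per-cell bookkeeping around the Horton equation, f(t) = f_c + (f_0 - f_c) e^(-kt), can be sketched as below. This is a simplified illustration in Python (the study itself was implemented in C++), and the rainfall and soil parameters are generic placeholders, not the calibrated values for La Primavera.

        import math

        def horton_capacity(t, f0, fc, k):
            """Horton infiltration capacity [mm/h] at time t [h] since rainfall began."""
            return fc + (f0 - fc) * math.exp(-k * t)

        def simulate_cell(rain_intensity, duration_h, dt_h, f0, fc, k):
            """Split rainfall on one surface cell into cumulative infiltration and runoff."""
            infiltrated = runoff = 0.0
            t = 0.0
            while t < duration_h:
                capacity = horton_capacity(t, f0, fc, k)
                infil_rate = min(rain_intensity, capacity)      # cell absorbs up to capacity
                infiltrated += infil_rate * dt_h
                runoff += (rain_intensity - infil_rate) * dt_h  # excess becomes surface runoff
                t += dt_h
            return infiltrated, runoff

        # Illustrative parameters: 40 mm/h rain for 92 minutes on two soil states
        for label, f0, fc in [("unburned", 80.0, 12.0), ("burned", 30.0, 4.0)]:
            inf, run = simulate_cell(40.0, 92 / 60.0, 1 / 60.0, f0, fc, k=2.0)
            print(f"{label:9s} infiltration={inf:6.1f} mm  runoff={run:6.1f} mm")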

  6. Sensitivity of Hurricane Storm Surge to Land Cover and Topography Under Various Sea Level Rise Scenarios Along the Mississippi Coast

    NASA Astrophysics Data System (ADS)

    Bilskie, M. V.; Hagen, S. C.; Medeiros, S. C.

    2013-12-01

    Major Gulf hurricanes have a high probability of impacting the northern Gulf of Mexico, especially coastal Mississippi (Resio, 2007). Due to the wide and flat continental shelf, this area provides near-perfect geometry for high water levels under tropical cyclone conditions. The literature suggests with 'very high confidence' that global sea level will rise at least 0.2 m and no more than 2.0 m by 2100 (Donoghue, 2011; Parris et al., 2012). Further, it is recognized that the Mississippi barrier islands are highly susceptible to a westward migration and retreating shoreline. With predictions for less frequent, more intense tropical storms, rising sea levels, and a changing landscape, it is important to understand how these changes may affect inundation extent and flooding due to hurricane storm surge. A state-of-the-art SWAN+ADCIRC hydrodynamic model of coastal Mississippi was utilized to simulate Hurricane Katrina with present-day sea level conditions. Using present day as a base scenario, past (1960) and future (2050) sea level changes were simulated. In addition to altering the initial sea state, land use land cover (LULC) was modified for 1960 and 2050 based on historic data and future projections. LULC datasets are used to derive surface roughness characteristics, such as Manning's n, and wind reduction factors. The topography along the barrier islands and near the Pascagoula River, MS was also altered to reflect the 1960 landscape. Storm surge sensitivity to topographic change was addressed by comparing model results between two 1960 storm surge simulations, one with current topography and a second with changes to the barrier islands. In addition, model responses to changes in LULC are compared. The results will be used to gain insight into adapting present-day storm surge models for future conditions. References Donoghue, J. (2011). Sea level history of the northern Gulf of Mexico coast and sea level rise scenarios for the near future. Climatic Change, 107(1-2), 17-33. doi: 10.1007/s10584-011-0077-x Parris, A., Bromirski, P., Burkett, V., Cayan, D., Culver, M., Hall, J., . . . Weiss, J. (2012). Global Sea Level Rise Scenarios for the United States National Climate Assessment. NOAA Tech Memo OAR CPO-1 (pp. 37). Resio, D. T. (2007). White paper on estimating hurricane inundation probabilities (pp. 125). Vicksburg, MS: U.S. Army Engineering Research and Development Center.
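
    Deriving surface roughness from LULC, as mentioned above, amounts to applying a class-to-coefficient lookup over the land-cover raster. The sketch below illustrates this step only; the class codes and Manning's n values are hypothetical placeholders, not the coefficients used in this study.

        import numpy as np

        # Illustrative Manning's n lookup by land-cover class code (hypothetical values)
        MANNING_N = {
            11: 0.020,   # open water
            21: 0.050,   # developed, open space
            41: 0.150,   # deciduous forest
            71: 0.035,   # grassland
            90: 0.110,   # woody wetland
        }
        DEFAULT_N = 0.035

        def manning_field(lulc_raster):
            """Map a 2-D array of land-cover codes to a 2-D array of Manning's n values."""
            lookup = np.vectorize(lambda code: MANNING_N.get(int(code), DEFAULT_N))
            return lookup(lulc_raster)

        # Tiny example raster (codes chosen arbitrarily)
        lulc = np.array([[11, 21, 41],
                         [71, 90, 41]])
        print(manning_field(lulc))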

  7. GOCE gravity field simulation based on actual mission scenario

    NASA Astrophysics Data System (ADS)

    Pail, R.; Goiginger, H.; Mayrhofer, R.; Höck, E.; Schuh, W.-D.; Brockmann, J. M.; Krasbutter, I.; Fecher, T.; Gruber, T.

    2009-04-01

    In the framework of the ESA-funded project "GOCE High-level Processing Facility", an operational hardware and software system for the scientific processing (Level 1B to Level 2) of GOCE data has been set up by the European GOCE Gravity Consortium EGG-C. One key component of this software system is the processing of a spherical harmonic Earth's gravity field model and the corresponding full variance-covariance matrix from the precise GOCE orbit and calibrated and corrected satellite gravity gradiometry (SGG) data. In the framework of the time-wise approach a combination of several processing strategies for the optimum exploitation of the information content of the GOCE data has been set up: The Quick-Look Gravity Field Analysis is applied to derive a fast diagnosis of the GOCE system performance and to monitor the quality of the input data. In the Core Solver processing a rigorous high-precision solution of the very large normal equation systems is derived by applying parallel processing techniques on a PC cluster. Before the availability of real GOCE data, by means of a realistic numerical case study, which is based on the actual GOCE orbit and mission scenario and simulation data stemming from the most recent ESA end-to-end simulation, the expected GOCE gravity field performance is evaluated. Results from this simulation as well as recently developed features of the software system are presented. Additionally some aspects on data combination with complementary data sources are addressed.

  8. Integration of High-resolution Data for Temporal Bone Surgical Simulations

    PubMed Central

    Wiet, Gregory J.; Stredney, Don; Powell, Kimerly; Hittle, Brad; Kerwin, Thomas

    2016-01-01

    Purpose To report on the state of the art in obtaining high-resolution 3D data of the microanatomy of the temporal bone and to process that data for integration into a surgical simulator. Specifically, we report on our experience in this area and discuss the issues involved to further the field. Data Sources Current temporal bone image acquisition and image processing established in the literature as well as in house methodological development. Review Methods We reviewed the current English literature for the techniques used in computer-based temporal bone simulation systems to obtain and process anatomical data for use within the simulation. Search terms included “temporal bone simulation, surgical simulation, temporal bone.” Articles were chosen and reviewed that directly addressed data acquisition and processing/segmentation and enhancement with emphasis given to computer based systems. We present the results from this review in relationship to our approach. Conclusions High-resolution CT imaging (≤100μm voxel resolution), along with unique image processing and rendering algorithms, and structure specific enhancement are needed for high-level training and assessment using temporal bone surgical simulators. Higher resolution clinical scanning and automated processes that run in efficient time frames are needed before these systems can routinely support pre-surgical planning. Additionally, protocols such as that provided in this manuscript need to be disseminated to increase the number and variety of virtual temporal bones available for training and performance assessment. PMID:26762105

  9. Computer simulation of multiple pilots flying a modern high performance helicopter

    NASA Technical Reports Server (NTRS)

    Zipf, Mark E.; Vogt, William G.; Mickle, Marlin H.; Hoelzeman, Ronald G.; Kai, Fei; Mihaloew, James R.

    1988-01-01

    A computer simulation of a human response pilot mechanism within the flight control loop of a high-performance modern helicopter is presented. A human response mechanism, implemented by a low order, linear transfer function, is used in a decoupled single variable configuration that exploits the dominant vehicle characteristics by associating cockpit controls and instrumentation with specific vehicle dynamics. Low order helicopter models obtained from evaluations of the time and frequency domain responses of a nonlinear simulation model, provided by NASA Lewis Research Center, are presented and considered in the discussion of the pilot development. Pilot responses and reactions to test maneuvers are presented and discussed. Higher level implementation, using the pilot mechanisms, are discussed and considered for their use in a comprehensive control structure.

  10. Simulations of exercise and brain effects of acute exposure to carbon monoxide in normal and vascular-diseased persons.

    EPA Science Inventory

    At some level, carboxyhemoglobin (HbCO) due to inhalation of carbon monoxide (CO) reduces maximum exercise duration in normal and ischemic heart patients. At high HbCO levels in normal subjects, brain function is also affected and behavioral performance is impaired. These are fin...

  11. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, F.; Kramer, S.; Koch, T.; Pfützner, B.

    2017-12-01

    Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, which caused severe problems for buildings and infrastructure, especially in urban areas near rivers. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations that takes sustainable and environmentally sound solutions into consideration, mainly on the basis of passive measures.

  12. Enhanced Oceanic Operations Human-In-The-Loop In-Trail Procedure Validation Simulation Study

    NASA Technical Reports Server (NTRS)

    Murdoch, Jennifer L.; Bussink, Frank J. L.; Chamberlain, James P.; Chartrand, Ryan C.; Palmer, Michael T.; Palmer, Susan O.

    2008-01-01

    The Enhanced Oceanic Operations Human-In-The-Loop In-Trail Procedure (ITP) Validation Simulation Study investigated the viability of an ITP designed to enable oceanic flight level changes that would not otherwise be possible. Twelve commercial airline pilots with current oceanic experience flew a series of simulated scenarios involving either standard or ITP flight level change maneuvers and provided subjective workload ratings, assessments of ITP validity and acceptability, and objective performance measures associated with the appropriate selection, request, and execution of ITP flight level change maneuvers. In the majority of scenarios, subject pilots correctly assessed the traffic situation, selected an appropriate response (i.e., either a standard flight level change request, an ITP request, or no request), and executed their selected flight level change procedure, if any, without error. Workload ratings for ITP maneuvers were acceptable and not substantially higher than for standard flight level change maneuvers, and, for the majority of scenarios and subject pilots, subjective acceptability ratings and comments for ITP were generally high and positive. Qualitatively, the ITP was found to be valid and acceptable. However, the error rates for ITP maneuvers were higher than for standard flight level changes, and these errors may have design implications for both the ITP and the study's prototype traffic display. These errors and their implications are discussed.

  13. Simulation of local convective rainfall over metropolitan area on 16 August 2015 using high resolution model

    NASA Astrophysics Data System (ADS)

    Lee, Y. H.; Min, K. H.

    2017-12-01

    We investigated the ability of a high-resolution numerical weather prediction (NWP) model (nested grid spacing of 500 m) to simulate a convective precipitation event over the Seoul metropolitan area on 16 August 2015. Intense rainfall occurred from 0930 UTC to 1030 UTC and subsequent trailing precipitation lasted until 1400 UTC. The synoptic condition for the convective event was characterized by a large value of convective available potential energy (CAPE) at the outer edge of a meso-high pressure system. Observational analysis showed that the triggering mechanism for the convective rainfall was the convergence of a northeasterly wind driven by a cold pool in northeastern Kyonggi province. The cold pool formed after heavy rain occurred in northeastern Kyonggi province at 0500 UTC. Several experiments were performed to evaluate the sensitivity to different initial conditions (IC12, IC18, IC00, IC06) and the impact of data assimilation (IC06A) on simulating the convective event. The quantitative precipitation forecasts (QPF) varied widely among the experiments, depending on the timing of the chosen ICs. The QPF amount was underestimated in all experiments when data assimilation was not performed. Among the four experiments, QPF amounts and locations were better simulated in the 1200 UTC 15 August (IC12) run due to large values of CAPE in the late afternoon and the presence of a low-level convergence zone in the metropolitan area. Although the 0600 UTC 16 August (IC06) run simulated the largest CAPE in the late afternoon, the location and amount of heavy rainfall differed significantly from observations; IC06 did not simulate the convergence of the low-level wind associated with the mesoscale cold pool. However, when assimilation of surface observations and radar data at 0600 UTC was performed (IC06A), the simulation reproduced the location and amount of rainfall reasonably well, indicating that a high-resolution NWP model with data assimilation can effectively predict a short-lived (1-3 hours) local convective precipitation event within 6 hours.

  14. Measurement and numerical simulation of high intensity focused ultrasound field in water

    NASA Astrophysics Data System (ADS)

    Lee, Kang Il

    2017-11-01

    In the present study, the acoustic field of a high intensity focused ultrasound (HIFU) transducer in water was measured by using a commercially available needle hydrophone intended for HIFU use. To validate the results of hydrophone measurements, numerical simulations of HIFU fields were performed by integrating the axisymmetric Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation from the frequency-domain perspective with the help of a MATLAB-based software package developed for HIFU simulation. Quantitative values for the focal waveforms, the peak pressures, and the size of the focal spot were obtained in various regimes of linear, quasilinear, and nonlinear propagation up to the source pressure levels when the shock front was formed in the waveform. The numerical results with the HIFU simulator solving the KZK equation were compared with the experimental data and found to be in good agreement. This confirms that the numerical simulation based on the KZK equation is capable of capturing the nonlinear pressure field of therapeutic HIFU transducers well enough to make it suitable for HIFU treatment planning.
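
    The abstract does not include the solver itself; as a hedged illustration of the nonlinear harmonic growth that KZK-type codes track, the sketch below evaluates the classical Fubini solution for a lossless plane wave before shock formation, B_n = 2/(n sigma) J_n(n sigma) for sigma <= 1. It is not the paper's HIFU simulator.

    ```python
    import numpy as np
    from scipy.special import jv

    def fubini_harmonics(sigma, n_harmonics=5):
        """Relative harmonic amplitudes B_n = 2/(n*sigma) * J_n(n*sigma) for a
        lossless plane wave, valid up to the shock-formation distance (sigma <= 1)."""
        n = np.arange(1, n_harmonics + 1)
        return 2.0 / (n * sigma) * jv(n, n * sigma)

    for sigma in (0.25, 0.5, 1.0):
        amps = fubini_harmonics(sigma)
        print(sigma, np.round(amps, 3))   # fundamental decays, higher harmonics grow
    ```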

  15. Flood simulation and verification with IoT sensors

    NASA Astrophysics Data System (ADS)

    Chang, Che-Hao; Hsu, Chih-Tsung; Wu, Shiang-Jen; Huang, Sue-Wei

    2017-04-01

    2D dynamic flood simulation is a vivid tool for demonstrating the area likely to be exposed to the impacts of a high rise in water level. With progress in high-resolution digital terrain models, the simulation results look convincing, yet they are rarely verified against what actually happened. Because of the dynamic and uncertain nature of flooding, the exposed area usually cannot be well delineated during a flood event. Recent developments in IoT sensors provide low-power, long-distance communication that helps us collect flood depths in real time. With these time series of flood depths at different locations, we are able to verify the simulation results corresponding to a flood event. Sixteen flood gauges with IoT specifications and two flood events in Annan district, Tainan city, Taiwan are examined in this study. During the event of 11 June 2016, 12 flood gauges worked well and 8 of them provided observations that matched the simulation.
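
    As a hedged illustration of this kind of point-wise verification (not the study's actual data), the sketch below compares simulated depths with gauge observations at matching times using placeholder values.

    ```python
    import numpy as np

    # Hypothetical time series of flood depth (m) at one gauge location
    observed  = np.array([0.05, 0.12, 0.30, 0.42, 0.38, 0.21])   # IoT gauge readings
    simulated = np.array([0.02, 0.10, 0.26, 0.47, 0.41, 0.18])   # 2D model output at same times

    rmse = np.sqrt(np.mean((simulated - observed) ** 2))
    bias = np.mean(simulated - observed)
    print(f"RMSE = {rmse:.3f} m, bias = {bias:+.3f} m")
    ```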

  16. A Simulation Model Articulation of the REA Ontology

    NASA Astrophysics Data System (ADS)

    Laurier, Wim; Poels, Geert

    This paper demonstrates how the REA enterprise ontology can be used to construct simulation models for business processes, value chains and collaboration spaces in supply chains. These models support various high-level and operational management simulation applications, e.g. the analysis of enterprise sustainability and day-to-day planning. First, the basic constructs of the REA ontology and the ExSpect modelling language for simulation are introduced. Second, collaboration space, value chain and business process models and their conceptual dependencies are shown, using the ExSpect language. Third, an exhibit demonstrates the use of value chain models in predicting the financial performance of an enterprise.

  17. Development of capability for microtopography-resolving simulations of hydrologic processes in permafrost affected regions

    NASA Astrophysics Data System (ADS)

    Painter, S.; Moulton, J. D.; Berndt, M.; Coon, E.; Garimella, R.; Lewis, K. C.; Manzini, G.; Mishra, P.; Travis, B. J.; Wilson, C. J.

    2012-12-01

    The frozen soils of the Arctic and subarctic regions contain vast amounts of stored organic carbon. This carbon is vulnerable to release to the atmosphere as temperatures warm and permafrost degrades. Understanding the response of the subsurface and surface hydrologic system to degrading permafrost is key to understanding the rate, timing, and chemical form of potential carbon releases to the atmosphere. Simulating the hydrologic system in degrading permafrost regions is challenging because of the potential for topographic evolution and associated drainage network reorganization as permafrost thaws and massive ground ice melts. The critical process models required for simulating hydrology include subsurface thermal hydrology of freezing/thawing soils, thermal processes within ice wedges, mechanical deformation processes, overland flow, and surface energy balances including snow dynamics. A new simulation tool, the Arctic Terrestrial Simulator (ATS), is being developed to simulate these coupled processes. The computational infrastructure must accommodate fully unstructured grids that track evolving topography, allow accurate solutions on distorted grids, provide robust and efficient solutions on highly parallel computer architectures, and enable flexibility in the strategies for coupling among the various processes. The ATS is based on Amanzi (Moulton et al. 2012), an object-oriented multi-process simulator written in C++ that provides much of the necessary computational infrastructure. Status and plans for the ATS including major hydrologic process models and validation strategies will be presented. Highly parallel simulations of overland flow using high-resolution digital elevation maps of polygonal patterned ground landscapes demonstrate the feasibility of the approach. Simulations coupling three-phase subsurface thermal hydrology with a simple thaw-induced subsidence model illustrate the strong feedbacks among the processes. D. Moulton, M. Berndt, M. Day, J. Meza, et al., High-Level Design of Amanzi, the Multi-Process High Performance Computing Simulator, Technical Report ASCEM-HPC-2011-03-1, DOE Environmental Management, 2012.

  18. A Multi-Institutional Simulation Boot Camp for Pediatric Cardiac Critical Care Nurse Practitioners.

    PubMed

    Brown, Kristen M; Mudd, Shawna S; Hunt, Elizabeth A; Perretta, Julianne S; Shilkofski, Nicole A; Diddle, J Wesley; Yurasek, Gregory; Bembea, Melania; Duval-Arnould, Jordan; Nelson McMillan, Kristen

    2018-06-01

    Assess the effect of a simulation "boot camp" on the ability of pediatric nurse practitioners to identify and treat a low cardiac output state in postoperative patients with congenital heart disease. Additionally, assess the pediatric nurse practitioners' confidence and satisfaction with simulation training. Prospective pre/post interventional pilot study. University simulation center. Thirty acute care pediatric nurse practitioners from 13 academic medical centers in North America. We conducted an expert opinion survey to guide curriculum development. The curriculum included didactic sessions, case studies, and high-fidelity simulation, based on high-complexity cases, congenital heart disease benchmark procedures, and a mix of lesion-specific postoperative complications. To cover multiple, high-complexity cases, we implemented the Rapid Cycle Deliberate Practice method of teaching for selected simulation scenarios using an expert-driven checklist. Knowledge was assessed with a pre-/posttest format (maximum score, 100%). A paired-sample t test showed a statistically significant increase in the posttest scores (mean [SD], pretest, 36.8% [14.3%] vs posttest, 56.0% [15.8%]; p < 0.001). Time to recognize and treat an acute deterioration was evaluated through the use of selected high-fidelity simulation scenarios. Median "time to task" improved overall across these scenarios. There was a significant increase in the proportion of clinically time-sensitive tasks completed within 5 minutes (pre, 60% [30/50] vs post, 86% [43/50]; p = 0.003). Confidence and satisfaction were evaluated with a validated tool ("Student Satisfaction and Self-Confidence in Learning"). Using a five-point Likert scale, the participants reported a high level of satisfaction (4.7 ± 0.30) and performance confidence (4.8 ± 0.31) with the simulation experience. Although simulation boot camps have been used effectively for training physicians and educating critical care providers, this was a novel approach to educating pediatric nurse practitioners from multiple academic centers. The course improved overall knowledge, and the pediatric nurse practitioners reported satisfaction and confidence in the simulation experience.

  19. Examining Road Traffic Mortality Status in China: A Simulation Study

    PubMed Central

    Schwebel, David C.; Li, Li; Hu, Guoqing

    2016-01-01

    Background Data from the Chinese police service suggest substantial reductions in road traffic injuries since 2002, but critics have questioned the accuracy of those data, especially considering conflicting data reported by the health department. Methods To address the gap between police and health department data and to determine which may be more accurate, we conducted a simulation study based on the modified Smeed equation, which delineates a non-linear relation between road traffic mortality and the level of motorization in a country or region. Our goal was to simulate trends in road traffic mortality in China and compare performance in road traffic safety management between China and 13 other countries. Results Chinese police data indicate a peak in road traffic mortality in 2002 and a significant, gradual decrease in population-based road traffic mortality since 2002. Health department data show that road traffic mortality peaked in 2012. In addition, police data suggest China’s road traffic mortality peaked at a much lower motorization level (0.061 motor vehicles per person) in 2002, followed by a reduction in mortality to a level comparable to that of developed countries. Simulation results based on health department data suggest high road traffic mortality, with a mortality peak in 2012 at a moderate motorization level (0.174 motor vehicles per person). Comparisons with the other 13 countries suggest the health data from China may be more valid than the police data. Conclusion Our simulation data indicate China is still at a stage of high road traffic mortality, as suggested by health data, rather than a stage of low road traffic mortality, as suggested by police data. More efforts are needed to integrate safety into road design, improve road traffic management, improve data quality, and alter unsafe behaviors of pedestrians, drivers and passengers in China. PMID:27071008
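
    The authors use a modified Smeed equation whose exact form is not given in the abstract; as a hedged point of reference only, the sketch below evaluates the classical Smeed relation D = 0.0003 (N P^2)^(1/3), which links annual road deaths D to registered vehicles N and population P, at the two motorization levels cited above.

    ```python
    def smeed_deaths(vehicles, population):
        """Classical Smeed (1949) estimate of annual road deaths."""
        return 0.0003 * (vehicles * population ** 2) ** (1.0 / 3.0)

    population = 1.35e9                          # illustrative population
    for veh_per_person in (0.061, 0.174):        # motorization levels cited in the abstract
        deaths = smeed_deaths(veh_per_person * population, population)
        print(f"{veh_per_person:.3f} vehicles/person -> ~{deaths:,.0f} deaths/yr (classical Smeed)")
    ```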

  20. A Survey of Instructional Support for Undergraduate Research Programs

    ERIC Educational Resources Information Center

    Hensley, Merinda Kaye

    2015-01-01

    Undergraduate research and other high-impact educational practices simulate real-world learning environments and present an opportunity for high-level information literacy teaching to be better incorporated into the curriculum. The purpose of this survey is to examine efforts of libraries currently offering IL instruction to undergraduate research…

  1. A functional language approach in high-speed digital simulation

    NASA Technical Reports Server (NTRS)

    Ercegovac, M. D.; Lu, S.-L.

    1983-01-01

    A functional programming approach for a multi-microprocessor architecture is presented. The language, based on Backus FP, its intermediate form and the translation process are discussed and illustrated with an example. The approach allows performance analysis to be performed at a high level as an aid in program partitioning.

  2. Middleware Trade Study for NASA Domain

    NASA Technical Reports Server (NTRS)

    Bowman, Dan

    2007-01-01

    This presentation gives preliminary results of a trade study designed to assess three distributed simulation middleware technologies for support of the NASA Constellation Distributed Space Exploration Simulation (DSES) project and the Test and Verification Distributed System Integration Laboratory (DSIL). The technologies are: the High Level Architecture (HLA), the Test and Training Enabling Architecture (TENA), and an XML-based variant of Distributed Interactive Simulation (DIS-XML) coupled with the Extensible Messaging and Presence Protocol (XMPP). According to the criteria and weights determined in this study, HLA scores better than the other two for DSES as well as for the DSIL.

  3. Study of storm surge trends in typhoon-prone coastal areas based on observations and surge-wave coupled simulations

    NASA Astrophysics Data System (ADS)

    Feng, Xingru; Li, Mingjie; Yin, Baoshu; Yang, Dezhou; Yang, Hongwei

    2018-06-01

    This is a study of the storm surge trends in some of the typhoon-prone coastal areas of China. An unstructured-grid, storm surge-wave-tide coupled model was established for the coastal areas of Zhejiang, Fujian and Guangdong provinces. The coupled model has a high resolution in coastal areas, and the simulated results compared well with the in situ observations and satellite altimeter data. The typhoon-induced storm surges along the coast of the study areas were simulated based on the established coupled model for the past 20 years (1997-2016). The simulated results were used to analyze the trends of the storm surges in the study area. The extreme storm surge trends along the central coast of Fujian Province reached up to 0.06 m/y, significant at the 90% confidence level. The duration of the storm surges greater than 1.0 and 0.7 m had an increasing trend along the coastal area of northern Fujian Province, significant at confidence levels of 70%-91%. The simulated trends of the extreme storm surges were also validated by observations from two tide gauge stations. Further studies show that the correlation coefficient (RTE) between the duration of the storm surge greater than 1 m and the annual ENSO index can reach as high as 0.62, significant at the 99% confidence level. This occurred in a location where the storm surge trend was not significant. For the areas with significant increasing storm surge trends, RTE was small and not significant. This study identified the storm surge trends for the full complex coastline of the study area. These results are useful both for coastal management by the government and for coastal engineering design.
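
    Not the authors' exact procedure; as a hedged sketch, the trend-and-significance estimate described above can be reproduced on an annual series of extreme surge heights with an ordinary least-squares regression, here on synthetic placeholder values.

    ```python
    import numpy as np
    from scipy import stats

    years = np.arange(1997, 2017)
    rng = np.random.default_rng(1)
    # synthetic annual extreme surge heights (m) with an imposed trend plus noise
    extreme_surge = 1.0 + 0.06 * (years - years[0]) + rng.normal(0.0, 0.15, years.size)

    res = stats.linregress(years, extreme_surge)
    print(f"trend = {res.slope:.3f} m/yr, p-value = {res.pvalue:.3f}")
    # significance at, e.g., the 90% confidence level corresponds to p < 0.10
    ```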

  4. Franck-Condon Simulations including Anharmonicity of the Ã(1)A''-X̃(1)A' Absorption and Single Vibronic Level Emission Spectra of HSiCl and DSiCl.

    PubMed

    Mok, Daniel W K; Lee, Edmond P F; Chau, Foo-Tim; Dyke, John M

    2009-03-10

    RCCSD(T) and/or CASSCF/MRCI calculations have been carried out on the X̃(1)A' and Ã(1)A'' states of HSiCl employing basis sets of up to the aug-cc-pV5Z quality. Contributions from core correlation and extrapolation to the complete basis set limit were included in determining the computed equilibrium geometrical parameters and relative electronic energy of these two states of HSiCl. Franck-Condon factors which include allowance for anharmonicity and Duschinsky rotation between these two states of HSiCl and DSiCl were calculated employing RCCSD(T) and CASSCF/MRCI potential energy functions, and were used to simulate the Ã(1)A'' ← X̃(1)A' absorption and Ã(1)A'' → X̃(1)A' single vibronic level (SVL) emission spectra of HSiCl and DSiCl. Simulated absorption and experimental LIF spectra, and simulated and observed Ã(1)A''(0,0,0) → X̃(1)A' SVL emission spectra, of HSiCl and DSiCl are in very good agreement. However, agreement between simulated and observed Ã(1)A''(0,1,0) → X̃(1)A' and Ã(1)A''(0,2,1) → X̃(1)A' SVL emission spectra of DSiCl is not as good. Preliminary calculations on low-lying excited states of HSiCl suggest that vibronic interaction between low-lying vibrational levels of the Ã(1)A'' state and highly excited vibrational levels of the ã(3)A'' is possible. Such vibronic interaction may change the character of the low-lying vibrational levels of the Ã(1)A'' state, which would lead to perturbation in the SVL emission spectra from these vibrational levels.

  5. Simulation of ground-water flow in the Cedar River alluvium, northwest Black Hawk County and southwest Bremer County, Iowa

    USGS Publications Warehouse

    Schaap, Bryan D.; Savoca, Mark E.; Turco, Michael J.

    2003-01-01

    In general, once high ground-water levels occur, either because of high Cedar River water levels or above-normal local precipitation or both, ground water in the central part of the study area along Highway 218 flows toward the south rather than following shorter flow paths to the Cedar River. Intermittent streams in the study area discharge substantial amounts of water from the ground-water flow system.

  6. A spatial panel ordered-response model with application to the analysis of urban land-use development intensity patterns

    NASA Astrophysics Data System (ADS)

    Ferdous, Nazneen; Bhat, Chandra R.

    2013-01-01

    This paper proposes and estimates a spatial panel ordered-response probit model with temporal autoregressive error terms to analyze changes in urban land development intensity levels over time. Such a model structure maintains a close linkage between the land owner's decision (unobserved to the analyst) and the land development intensity level (observed by the analyst) and accommodates spatial interactions between land owners that lead to spatial spillover effects. In addition, the model structure incorporates spatial heterogeneity as well as spatial heteroscedasticity. The resulting model is estimated using a composite marginal likelihood (CML) approach that does not require any simulation machinery and that can be applied to data sets of any size. A simulation exercise indicates that the CML approach recovers the model parameters very well, even in the presence of high spatial and temporal dependence. In addition, the simulation results demonstrate that ignoring spatial dependency and spatial heterogeneity when both are actually present will lead to bias in parameter estimation. A demonstration exercise applies the proposed model to examine urban land development intensity levels using parcel-level data from Austin, Texas.

  7. 1D-3D hybrid modeling—from multi-compartment models to full resolution models in space and time

    PubMed Central

    Grein, Stephan; Stepniewski, Martin; Reiter, Sebastian; Knodel, Markus M.; Queisser, Gillian

    2014-01-01

    Investigation of cellular and network dynamics in the brain by means of modeling and simulation has evolved into a highly interdisciplinary field that uses sophisticated modeling and simulation approaches to understand distinct areas of brain function. Depending on the underlying complexity, these models vary in their level of detail in order to cope with the attached computational cost. Hence, for large network simulations, single neurons are typically reduced to time-dependent signal processors, dismissing the spatial aspect of each cell. For single cells or networks with relatively small numbers of neurons, general-purpose simulators allow for space- and time-dependent simulations of electrical signal processing, based on cable equation theory. An emerging field in Computational Neuroscience encompasses a new level of detail by incorporating the full three-dimensional morphology of cells and organelles into three-dimensional, space- and time-dependent simulations. While every approach has its advantages and limitations, such as computational cost, integrated and methods-spanning simulation approaches, depending on the network size, could establish new ways to investigate the brain. In this paper we present a hybrid simulation approach that makes use of reduced 1D models (using, e.g., the NEURON simulator) coupled to fully resolved models for simulating cellular and sub-cellular dynamics, including the detailed three-dimensional morphology of neurons and organelles. In order to couple 1D- and 3D-simulations, we present a geometry-, membrane potential- and intracellular concentration mapping framework, with which graph-based morphologies, e.g., in the swc- or hoc-format, are mapped to full surface and volume representations of the neuron, and computational data from 1D-simulations can be used as boundary conditions for full 3D simulations and vice versa. Thus, established models and data, based on general-purpose 1D-simulators, can be directly coupled to the emerging field of fully resolved, highly detailed 3D-modeling approaches. We present the developed general framework for 1D/3D hybrid modeling and apply it to investigate electrically active neurons and their intracellular spatio-temporal calcium dynamics. PMID:25120463

  8. Data flow modeling techniques

    NASA Technical Reports Server (NTRS)

    Kavi, K. M.

    1984-01-01

    There have been a number of simulation packages developed for the purpose of designing, testing and validating computer systems, digital systems and software systems. Complex analytical tools based on Markov and semi-Markov processes have been designed to estimate the reliability and performance of simulated systems. Petri nets have received wide acceptance for modeling complex and highly parallel computers. In this research, data flow models for computer systems are investigated. Data flow models can be used to simulate both software and hardware in a uniform manner. Data flow simulation techniques provide the computer systems designer with a CAD environment which enables highly parallel complex systems to be defined, evaluated at all levels and finally implemented in either hardware or software. Inherent in the data flow concept is the hierarchical handling of complex systems. In this paper we describe how data flow can be used to model computer systems.

  9. Crystal MD: The massively parallel molecular dynamics software for metal with BCC structure

    NASA Astrophysics Data System (ADS)

    Hu, Changjun; Bai, He; He, Xinfu; Zhang, Boyao; Nie, Ningming; Wang, Xianmeng; Ren, Yingwen

    2017-02-01

    Material irradiation effects are one of the most important issues for the use of nuclear power. However, the lack of high-throughput irradiation facilities and of knowledge about the evolution process leads to a limited understanding of the addressed issues. With the help of high-performance computing, we can gain a deeper understanding of materials at the micro level. In this paper, a new data structure is proposed for the massively parallel simulation of the evolution of metal materials under an irradiation environment. Based on the proposed data structure, we developed new molecular dynamics software named Crystal MD. Simulations with Crystal MD achieved over 90% parallel efficiency in test cases, and it takes more than 25% less memory on multi-core clusters than LAMMPS and IMD, which are two popular molecular dynamics simulation packages. Using Crystal MD, a two-trillion-particle simulation has been performed on the Tianhe-2 cluster.

  10. Highly efficient simulation environment for HDTV video decoder in VLSI design

    NASA Astrophysics Data System (ADS)

    Mao, Xun; Wang, Wei; Gong, Huimin; He, Yan L.; Lou, Jian; Yu, Lu; Yao, Qingdong; Pirsch, Peter

    2002-01-01

    With the increasing complexity of VLSI designs, especially SoCs (Systems on Chip) such as an MPEG-2 video decoder with HDTV scalability, simulation and verification of the full design, even at the behavioral level in an HDL, often prove to be very slow and costly, and it is difficult to perform full verification until late in the design process. They therefore become a bottleneck in the HDTV video decoder design flow and strongly influence its time-to-market. In this paper, the architecture of the hardware/software interface of an HDTV video decoder is studied, and a Hardware-Software Mixed Simulation (HSMS) platform is proposed to check and correct errors in the early design stage, based on the MPEG-2 video decoding algorithm. The application of HSMS to the target system is achieved by employing several introduced approaches. Those approaches speed up the simulation and verification task without decreasing performance.

  11. Simulating the Effects of Sea Level Rise on the Resilience and Migration of Tidal Wetlands along the Hudson River

    PubMed Central

    Tabak, Nava M.; Laba, Magdeline; Spector, Sacha

    2016-01-01

    Sea Level Rise (SLR) caused by climate change is impacting coastal wetlands around the globe. Due to their distinctive biophysical characteristics and unique plant communities, freshwater tidal wetlands are expected to exhibit a different response to SLR as compared with the better studied salt marshes. In this study we employed the Sea Level Affecting Marshes Model (SLAMM), which simulates regional- or local-scale changes in tidal wetland habitats in response to SLR, and adapted it for application in a freshwater-dominated tidal river system, the Hudson River Estuary. Using regionally-specific estimated ranges of SLR and accretion rates, we produced simulations for a spectrum of possible future wetland distributions and quantified the projected wetland resilience, migration or loss in the HRE through the end of the 21st century. Projections of total wetland extent and migration were more strongly determined by the rate of SLR than the rate of accretion. Surprisingly, an increase in net tidal wetland area was projected under all scenarios, with newly-formed tidal wetlands expected to comprise at least 33% of the HRE’s wetland area by year 2100. Model simulations with high rates of SLR and/or low rates of accretion resulted in broad shifts in wetland composition with widespread conversion of high marsh habitat to low marsh, tidal flat or permanent inundation. Wetland expansion and resilience were not equally distributed through the estuary, with just three of 48 primary wetland areas encompassing >50% of projected new wetland by the year 2100. Our results open an avenue for improving predictive models of the response of freshwater tidal wetlands to sea level rise, and broadly inform the planning of conservation measures of this critical resource in the Hudson River Estuary. PMID:27043136

  12. Simulating the Effects of Sea Level Rise on the Resilience and Migration of Tidal Wetlands along the Hudson River.

    PubMed

    Tabak, Nava M; Laba, Magdeline; Spector, Sacha

    2016-01-01

    Sea Level Rise (SLR) caused by climate change is impacting coastal wetlands around the globe. Due to their distinctive biophysical characteristics and unique plant communities, freshwater tidal wetlands are expected to exhibit a different response to SLR as compared with the better studied salt marshes. In this study we employed the Sea Level Affecting Marshes Model (SLAMM), which simulates regional- or local-scale changes in tidal wetland habitats in response to SLR, and adapted it for application in a freshwater-dominated tidal river system, the Hudson River Estuary. Using regionally-specific estimated ranges of SLR and accretion rates, we produced simulations for a spectrum of possible future wetland distributions and quantified the projected wetland resilience, migration or loss in the HRE through the end of the 21st century. Projections of total wetland extent and migration were more strongly determined by the rate of SLR than the rate of accretion. Surprisingly, an increase in net tidal wetland area was projected under all scenarios, with newly-formed tidal wetlands expected to comprise at least 33% of the HRE's wetland area by year 2100. Model simulations with high rates of SLR and/or low rates of accretion resulted in broad shifts in wetland composition with widespread conversion of high marsh habitat to low marsh, tidal flat or permanent inundation. Wetland expansion and resilience were not equally distributed through the estuary, with just three of 48 primary wetland areas encompassing >50% of projected new wetland by the year 2100. Our results open an avenue for improving predictive models of the response of freshwater tidal wetlands to sea level rise, and broadly inform the planning of conservation measures of this critical resource in the Hudson River Estuary.

  13. A comparison of autonomic responses in humans induced by two simulation models of weightlessness: lower body positive pressure and 6 degrees head-down tilt.

    PubMed

    Fu, Q; Sugiyama, Y; Kamiya, A; Mano, T

    2000-04-12

    Six-degree head-down tilt (HDT) is well accepted as an effective weightlessness model in humans. However, some researchers utilized lower body positive pressure (LBPP) to simulate the cardiovascular and renal effects of a decreased gravitational stress. In order to determine whether LBPP was a suitable model for simulated weightlessness, we compared the differences between these two methods. Ten healthy males, aged 21-41 years, were subjected to graded LBPP at 10, 20 and 30 mmHg, as well as 6 degrees HDT. Muscle sympathetic nerve activity (MSNA) was microneurographically recorded from the tibial nerve along with cardiovascular variables. We found that MSNA decreased by 27% to a similar extent both at low levels of LBPP (10 and 20 mmHg) and HDT. However, at a high level of LBPP (30 mmHg), MSNA tended to increase. Mean arterial pressure was elevated significantly by 11% (10 mmHg) at 30 mmHg LBPP, but remained unchanged at low levels of LBPP and HDT. Heart rate did not change during the entire LBPP and HDT procedures. Total peripheral resistance markedly increased by 36% at 30 mmHg LBPP, but decreased by 9% at HDT. Both stroke volume and cardiac output tended to decrease at 30 mmHg LBPP, but increased at HDT. These results suggest that although both LBPP and HDT induce fluid shifts from the lower body toward the thoracic compartment, autonomic responses are different, especially at LBPP greater than 20 mmHg. We note that high levels of LBPP (>20 mmHg) activate not only cardiopulmonary and arterial baroreflexes, but also intramuscular mechanoreflexes, while 6 degrees HDT only activates cardiopulmonary baroreflexes. We conclude that LBPP is not a suitable model for simulated weightlessness in humans.

  14. High resolution simulations of aerosol microphysics in a global and regionally nested chemical transport model

    NASA Astrophysics Data System (ADS)

    Adams, P. J.; Marks, M.

    2015-12-01

    The aerosol indirect effect is the largest source of forcing uncertainty in current climate models. This effect arises from the influence of aerosols on the reflective properties and lifetimes of clouds, and its magnitude depends on how many particles can serve as cloud droplet formation sites. Assessing levels of this subset of particles (cloud condensation nuclei, or CCN) requires knowledge of aerosol levels and their global distribution, size distributions, and composition. A key tool necessary to advance our understanding of CCN is the use of global aerosol microphysical models, which simulate the processes that control aerosol size distributions: nucleation, condensation/evaporation, and coagulation. Previous studies have found important differences in CO (Chen, D. et al., 2009) and ozone (Jang, J., 1995) modeled at different spatial resolutions, and it is reasonable to believe that short-lived, spatially variable aerosol species will be similarly - or more - susceptible to model resolution effects. The goal of this study is to determine how CCN levels and spatial distributions change as simulations are run at higher spatial resolution - specifically, to evaluate how sensitive the model is to grid size, and how this affects comparisons against observations. Higher-resolution simulations are necessary supports for model/measurement synergy. Simulations were performed using the global chemical transport model GEOS-Chem (v9-02). The years 2008 and 2009 were simulated at 4°×5° and 2°×2.5° globally and at 0.5°×0.667° over Europe and North America. Results were evaluated against surface-based particle size distribution measurements from the European Supersites for Atmospheric Aerosol Research project. The fine-resolution model simulates more spatial and temporal variability in ultrafine levels, and better resolves topography. Results suggest that the coarse model predicts systematically lower ultrafine levels than does the fine-resolution model. Significant differences are also evident with respect to model-measurement comparisons, and will be discussed.

  15. Open-source framework for power system transmission and distribution dynamics co-simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Renke; Fan, Rui; Daily, Jeff

    The promise of the smart grid entails more interactions between the transmission and distribution networks, and there is an immediate need for tools to provide the comprehensive modelling and simulation required to integrate operations at both transmission and distribution levels. Existing electromagnetic transient simulators can perform simulations with integration of transmission and distribution systems, but the computational burden is high for large-scale system analysis. For transient stability analysis, currently there are only separate tools for simulating transient dynamics of the transmission and distribution systems. In this paper, we introduce an open source co-simulation framework, the “Framework for Network Co-Simulation” (FNCS), together with the decoupled simulation approach that links existing transmission and distribution dynamic simulators through FNCS. FNCS is a middleware interface and framework that manages the interaction and synchronization of the transmission and distribution simulators. Preliminary testing results show the validity and capability of the proposed open-source co-simulation framework and the decoupled co-simulation methodology.

  16. Collaborative modeling: the missing piece of distributed simulation

    NASA Astrophysics Data System (ADS)

    Sarjoughian, Hessam S.; Zeigler, Bernard P.

    1999-06-01

    The Department of Defense's overarching goal of performing distributed simulation by overcoming geographic and time constraints has brought the problem of distributed modeling to the forefront. The High Level Architecture standard is primarily intended for simulation interoperability. However, as indicated, the existence of a distributed modeling infrastructure plays a fundamental and central role in supporting the development of distributed simulations. In this paper, we describe some fundamental distributed modeling concepts and their implications for constructing successful distributed simulations. In addition, we discuss the Collaborative DEVS Modeling environment that has been devised to enable geographically dispersed modelers to collaborate and synthesize modular and hierarchical models. We provide an actual example of the use of Collaborative DEVS Modeler in application to a project involving corporate partners developing an HLA-compliant distributed simulation exercise.

  17. An Effective Construction Method of Modular Manipulator 3D Virtual Simulation Platform

    NASA Astrophysics Data System (ADS)

    Li, Xianhua; Lv, Lei; Sheng, Rui; Sun, Qing; Zhang, Leigang

    2018-06-01

    This work discusses a fast and efficient method for constructing an open 3D manipulator virtual simulation platform that makes it easier for teachers and students to learn about the forward and inverse kinematics of a robot manipulator. The method was implemented in MATLAB, in which the Robotics Toolbox, the MATLAB GUI and 3D animation, supported by SolidWorks models, were fully applied to produce a good visualization of the system. The advantages of this rapid-construction approach are its powerful input and output functions and its ability to simulate a 3D manipulator realistically. In this article, a Schunk six-DOF modular manipulator assembled by the authors' research group is used as an example. The implementation steps of the method are described in detail, and a high-level, open and realistic manipulator 3D virtual simulation platform is thereby achieved. With the graphs obtained from simulation, the test results show that the manipulator 3D virtual simulation platform can be constructed quickly, with good usability and high maneuverability, and that it can meet the needs of scientific research and teaching.
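
    The platform itself is built in MATLAB and is not reproduced here; as a hedged illustration of the forward/inverse kinematics it visualizes, the sketch below works through a planar two-link arm with purely illustrative link lengths.

    ```python
    import numpy as np

    L1, L2 = 0.5, 0.4    # link lengths (m), illustrative

    def forward(q1, q2):
        """End-effector position of a planar 2-link arm."""
        x = L1 * np.cos(q1) + L2 * np.cos(q1 + q2)
        y = L1 * np.sin(q1) + L2 * np.sin(q1 + q2)
        return x, y

    def inverse(x, y):
        """One inverse-kinematics solution (positive elbow angle)."""
        c2 = (x**2 + y**2 - L1**2 - L2**2) / (2 * L1 * L2)
        q2 = np.arccos(np.clip(c2, -1.0, 1.0))
        q1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(q2), L1 + L2 * np.cos(q2))
        return q1, q2

    q = inverse(*forward(0.3, 0.7))
    print(np.round(q, 3))    # recovers the joint angles (0.3, 0.7)
    ```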

  18. Comparison of Seasonal Terrestrial Water Storage Variations from GRACE with Groundwater-level Measurements from the High Plains Aquifer (USA)

    NASA Technical Reports Server (NTRS)

    Strassberg, Gil; Scanlon, Bridget R.; Rodell, Matthew

    2007-01-01

    This study presents the first direct comparison of variations in seasonal GWS derived from GRACE TWS and simulated SM with GW-level measurements in a semiarid region. Results showed that variations in GWS and SM are the main sources controlling TWS changes over the High Plains, with negligible storage changes from surface water, snow, and biomass. Seasonal variations in GRACE TWS compare favorably with combined GWS from GW-level measurements (total 2,700 wells, average 1,050 GW-level measurements per season) and simulated SM from the Noah land surface model (R = 0.82, RMSD = 33 mm). Estimated uncertainty in seasonal GRACE-derived TWS is 8 mm, and estimated uncertainty in TWS changes is 11 mm. Estimated uncertainty in SM changes is 11 mm and combined uncertainty for TWS-SM changes is 15 mm. Seasonal TWS changes are detectable in 7 out of 9 monitored periods and maximum changes within a year (e.g., between winter and summer) are detectable in all 5 monitored periods. GRACE-derived GWS calculated from TWS-SM generally agrees with estimates based on GW-level measurements (R = 0.58, RMSD = 33 mm). Seasonal TWS-SM changes are detectable in 5 out of the 9 monitored periods and maximum changes are detectable in all 5 monitored periods. Good correspondence between GRACE data and GW-level measurements from the intensively monitored High Plains aquifer validates the potential for using GRACE TWS and simulated SM to monitor GWS changes and aquifer depletion in semiarid regions subjected to intensive irrigation pumpage. This method can be used to monitor regions where large-scale aquifer depletion is ongoing and in situ measurements are limited, such as the North China Plain or western India. This potential should be enhanced by future advances in GRACE processing, which will improve the spatial and temporal resolution of TWS changes and will further increase the applicability of GRACE data for monitoring GWS.
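
    As a hedged sketch of the water-budget separation used above (GWS = TWS - SM) and of the agreement metrics reported (R, RMSD), the following uses placeholder seasonal anomalies rather than the study's data.

    ```python
    import numpy as np

    # Hypothetical seasonal anomalies in mm of equivalent water height
    tws_grace = np.array([ 40., 10., -35., -20.,  30.,  5., -25., -15.,  20.])  # GRACE TWS
    sm_model  = np.array([ 25.,  5., -20., -10.,  15.,  0., -10.,  -5.,  10.])  # modeled soil moisture
    gws_wells = np.array([ 18.,  6., -12., -12.,  17.,  4., -18.,  -8.,  12.])  # from GW-level data

    gws_grace = tws_grace - sm_model                      # GRACE-derived groundwater storage
    r = np.corrcoef(gws_grace, gws_wells)[0, 1]
    rmsd = np.sqrt(np.mean((gws_grace - gws_wells) ** 2))
    print(f"R = {r:.2f}, RMSD = {rmsd:.1f} mm")
    ```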

  19. A High-Fidelity Simulation of a Generic Commercial Aircraft Engine and Controller

    NASA Technical Reports Server (NTRS)

    May, Ryan D.; Csank, Jeffrey; Lavelle, Thomas M.; Litt, Jonathan S.; Guo, Ten-Huei

    2010-01-01

    A new high-fidelity simulation of a generic 40,000 lb thrust class commercial turbofan engine with a representative controller, known as C-MAPSS40k, has been developed. Based on dynamic flight test data of a highly instrumented engine and previous engine simulations developed at NASA Glenn Research Center, this non-proprietary simulation was created especially for use in the development of new engine control strategies. C-MAPSS40k is a highly detailed, component-level engine model written in MATLAB/Simulink (The MathWorks, Inc.). Because the model is built in Simulink, users have the ability to use any of the MATLAB tools for analysis and control system design. The engine components are modeled in C code, which is then compiled to allow faster-than-real-time execution. The engine controller is based on common industry architecture and techniques to produce realistic closed-loop transient responses while ensuring that no safety or operability limits are violated. A significant feature not found in other non-proprietary models is the inclusion of transient stall margin debits. These debits provide an accurate accounting of the compressor surge margin, which is critical in the design of an engine controller. This paper discusses the development, characteristics, and capabilities of the C-MAPSS40k simulation.

  20. Modeling study of PM2.5 concentration change in Beijing-Tianjin-Hebei and Chengdu-Chongqing urban clusters

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Chen, H.; Wu, Q.; Wang, Z.

    2016-12-01

    With rapid economic development, air pollution is becoming more and more serious in China. Fine particulate matter (PM2.5) is one of the major air pollutants, affecting visibility and human health. In this study, a nested-grid air quality model system (NAQPMS) was used to simulate PM2.5 during 2013-2015 at a high resolution of 5 km over China. Comparison with observations showed that NAQPMS was able to reproduce the temporal and spatial variation of pollutants in China reasonably well. The simulation showed that high levels of PM2.5 concentrated in mid-eastern China and the Sichuan Basin, where the concentration reached 120 μg/m³ during the most polluted period. This study focused on the Beijing-Tianjin-Hebei (BTH) and Chengdu-Chongqing urban clusters. The simulated concentrations were slightly low in the BTH region, where high PM2.5 concentrations were found in southern Hebei; PM2.5 concentrations in this region have gradually decreased over the past three years, and the best agreement was obtained for Beijing in 2014 (R=0.75). In contrast, the simulated concentrations were slightly high in the Chengdu-Chongqing urban cluster, where high concentrations were found in the mid-eastern Sichuan Basin and R increased markedly in 2015 (0.60). More detailed information and a possible cause for this discrepancy will be discussed.

  1. Use of the quasi-geostrophic dynamical framework to reconstruct the 3-D ocean state in a high-resolution realistic simulation of North Atlantic.

    NASA Astrophysics Data System (ADS)

    Fresnay, Simon; Ponte, Aurélien

    2017-04-01

    The quasi-geostrophic (QG) framework has been, and will remain for years to come, a cornerstone method linking observations with estimates of the ocean circulation and state. Here we have used the QG framework to reconstruct dynamical variables of the 3-D ocean in a state-of-the-art high-resolution (1/60 deg, 300 vertical levels) numerical simulation of the North Atlantic (NATL60). The work was carried out in three boxes of the simulation: Gulf Stream, Azores and Reykjanes Ridge. In a first part, general diagnostics describing the eddying dynamics were performed and show that the QG scaling holds in general at depths away from the mixed layer and from strong bathymetric gradients. Correlations with surface observable variables (e.g., temperature, sea level) were computed, and estimates of quasi-geostrophic potential vorticity (QGPV) were reconstructed by means of regression laws. It is shown that the reconstruction of QGPV exhibits valuable skill over a restricted scale range, mainly when sea level is used as the regression variable. Additional discussion is given, based on the flow balanced with QGPV. This work is part of the DIMUP project, which aims to improve our ability to operationally estimate the ocean state.
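
    Not the authors' code; as a hedged sketch of the regression-based reconstruction described above, the following fits a single linear law between co-located sea level anomaly and QGPV samples (placeholders) and applies it to reconstruct QGPV from sea level alone.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    sla = rng.normal(0.0, 0.1, 500)                       # sea level anomaly (m), placeholder
    qgpv = 3.0e-5 * sla + rng.normal(0.0, 5.0e-7, 500)    # QGPV at one depth (1/s), placeholder

    slope, intercept = np.polyfit(sla, qgpv, 1)           # regression law: QGPV ~ a*SLA + b
    qgpv_rec = slope * sla + intercept                    # reconstructed QGPV from sea level only
    skill = np.corrcoef(qgpv, qgpv_rec)[0, 1] ** 2
    print(f"explained variance of the reconstruction: {skill:.2f}")
    ```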

  2. Surgical model-view-controller simulation software framework for local and collaborative applications

    PubMed Central

    Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2010-01-01

    Purpose Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. Methods A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. Results The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. Conclusion A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users. PMID:20714933

  3. Surgical model-view-controller simulation software framework for local and collaborative applications.

    PubMed

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
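
    The framework's API is not shown in the record; the sketch below only illustrates the decoupling idea, a fast haptics-style loop and a slower viewer loop running in separate threads over shared state, with rates and update rules chosen purely for illustration.

    ```python
    import threading
    import time

    state = {"force": 0.0}
    lock = threading.Lock()
    stop = threading.Event()

    def haptics_loop(rate_hz=1000):
        """Fast controller loop: updates the shared state at ~1 kHz."""
        dt = 1.0 / rate_hz
        while not stop.is_set():
            with lock:
                state["force"] += 0.001          # stand-in for a physics/haptics update
            time.sleep(dt)

    def viewer_loop(rate_hz=30):
        """Slow viewer loop: reads the latest state at ~30 Hz."""
        dt = 1.0 / rate_hz
        while not stop.is_set():
            with lock:
                f = state["force"]
            print(f"render frame, force = {f:.3f}")
            time.sleep(dt)

    threading.Thread(target=haptics_loop, daemon=True).start()
    threading.Thread(target=viewer_loop, daemon=True).start()
    time.sleep(0.2)
    stop.set()
    ```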

  4. Numerical Simulation of Ground-Water Salinization in the Arkansas River Corridor, Southwest Kansas

    NASA Astrophysics Data System (ADS)

    Whittemore, D. O.; Perkins, S.; Tsou, M.; McElwee, C. D.; Zhan, X.; Young, D. P.

    2001-12-01

    The salinity of ground water in the High Plains aquifer underlying the upper Arkansas River corridor in southwest Kansas has greatly increased during the last few decades. The source of the salinization is infiltration of Arkansas River water along the river channel and in areas irrigated with diverted river water. The saline river water is derived from southeastern Colorado where consumptive losses of water in irrigation systems substantially concentrate dissolved solids in the residual water. Before development of surface- and ground-water resources, the Arkansas River gained flow along nearly all of its length in southwest Kansas. Since the 1970's, ground-water levels have declined in the High Plains aquifer from consumptive use of ground water. The water-level declines have now changed the river to a generally losing rather than gaining system. We simulated ground-water flow in the aquifers underlying 126 miles of the river corridor using MODFLOW integrated with the GIS software ArcView (Tsou and Whittemore, 2001). There are two layers in the model, one for the Quaternary alluvial aquifer and the other for the underlying High Plains aquifer. We prepared a simulation for circa 1940 that represented conditions prior to substantial ground-water development, and simulations for 40 years into the future that were based on holding constant either average water use or average ground-water levels for the 1990's. Streamflows along the river computed from the model results illustrated the flow gains from ground-water discharge for circa 1940 and losses during the 1990's. We modeled the movement of salinity as particle tracks generated by MODPATH based on the MODFLOW solutions. The results indicate that during the next 40 years, saline water will move a substantial distance in the High Plains aquifer on the south side of the central portion of the river valley. The differences between the circa 1940 and 1990's simulations fit the observed data that show large increases in the dissolved solids of ground waters in the High Plains aquifer in portions of the river corridor. The modeling indicates that management of water use in the aquifers on a large scale would be necessary to achieve significant changes in the rate and direction of saline water migration over a time scale of decades. http://www.kgs.ukans.edu/Hydro/UARC/index.html

  5. High Current Density Scandate Cathodes for Future Vacuum Electronics Applications

    DTIC Science & Technology

    2008-05-30

    HFSS: Ansoft Corporation's High Frequency Structure Simulator. TWT: Traveling Wave Tube, a device for generating high levels of RF power. ...cathodes are practical for high-power RF sources. Typical thermionic cathodes consist of a tungsten matrix impregnated with a mixture of barium oxide... electron beam with the largest possible diameter, consistent with high gain, bandwidth, and efficiency at W-Band. The research concentrated on photonic

  6. Comparative study of signalling methods for high-speed backplane transceiver

    NASA Astrophysics Data System (ADS)

    Wu, Kejun

    2017-11-01

    A combined analysis of transient simulation and statistical methods is proposed for a comparative study of signalling methods applied to high-speed backplane transceivers. This method enables fast and accurate signal-to-noise ratio and symbol error rate estimation of a serial link based on a four-dimensional design space, including channel characteristics, noise scenarios, equalisation schemes, and signalling methods. The proposed combined analysis method chooses an efficient sampling size for performance evaluation. A comparative study of non-return-to-zero (NRZ), PAM-4, and four-phase shifted sinusoid symbol (PSS-4) signalling using parameterised behaviour-level simulation shows that PAM-4 and PSS-4 have substantial advantages over conventional NRZ in most cases. A comparison between PAM-4 and PSS-4 shows that PAM-4 suffers significant bit error rate degradation when the noise level increases.
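
    Not the paper's combined transient/statistical method; as a hedged point of reference, the sketch below compares Gaussian-noise error rates of NRZ and PAM-4 at equal peak-to-peak swing using the standard Q-function expressions, with illustrative voltage and noise values.

    ```python
    import numpy as np
    from scipy.stats import norm

    def q(x):
        return norm.sf(x)                        # Q-function (Gaussian tail probability)

    A = 0.4            # half of the peak-to-peak swing (V), illustrative
    sigma = 0.04       # RMS Gaussian noise (V), illustrative

    ber_nrz = q(A / sigma)                       # NRZ: eye half-opening A
    ser_pam4 = 1.5 * q(A / (3 * sigma))          # 4-PAM: adjacent levels spaced 2A/3
    ber_pam4 = ser_pam4 / 2                      # approximation with Gray coding
    print(f"NRZ BER ~ {ber_nrz:.2e}, PAM-4 BER ~ {ber_pam4:.2e}")
    ```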

  7. Development of simulation-based learning programme for improving adherence to time-out protocol on high-risk invasive procedures outside of operating room.

    PubMed

    Jeong, Eun Ju; Chung, Hyun Soo; Choi, Jeong Yun; Kim, In Sook; Hong, Seong Hee; Yoo, Kyung Sook; Kim, Mi Kyoung; Won, Mi Yeol; Eum, So Yeon; Cho, Young Soon

    2017-06-01

    The aim of this study was to develop a simulation-based time-out learning programme targeted at nurses participating in high-risk invasive procedures and to determine the effect of the programme on nurses' acceptance. The study used a pre- and post-intervention simulation-based learning design to determine the effects of implementing the programme. It targeted 48 registered nurses working in the general ward and the emergency department of a tertiary teaching hospital. Differences between acceptance and performance rates were assessed using means, standard deviations, and the Wilcoxon signed-rank test. The perception survey and score sheet were validated through the content validity index, and the reliability of the evaluators was verified using the intraclass correlation coefficient. Results showed a high level of acceptance for high-risk invasive procedures (P<.01). Further, improvement was consistent regardless of clinical experience, workplace, or experience in simulation-based learning. The face validity of the programme scored over 4.0 out of 5.0. This simulation-based learning programme was effective in improving recognition of the time-out protocol and gave the participants the opportunity to become proactive in cases of high-risk invasive procedures performed outside of the operating room. © 2017 John Wiley & Sons Australia, Ltd.

  8. Membrane Properties and the Balance between Excitation and Inhibition Control Gamma-Frequency Oscillations Arising from Feedback Inhibition

    PubMed Central

    Economo, Michael N.; White, John A.

    2012-01-01

    Computational studies as well as in vivo and in vitro results have shown that many cortical neurons fire in a highly irregular manner and at low average firing rates. These patterns seem to persist even when highly rhythmic signals are recorded by local field potential electrodes or other methods that quantify the summed behavior of a local population. Models of the 30–80 Hz gamma rhythm in which network oscillations arise through ‘stochastic synchrony’ capture the variability observed in the spike output of single cells while preserving network-level organization. We extend upon these results by constructing model networks constrained by experimental measurements and using them to probe the effect of biophysical parameters on network-level activity. We find in simulations that gamma-frequency oscillations are enabled by a high level of incoherent synaptic conductance input, similar to the barrage of noisy synaptic input that cortical neurons have been shown to receive in vivo. This incoherent synaptic input increases the emergent network frequency by shortening the time scale of the membrane in excitatory neurons and by reducing the temporal separation between excitation and inhibition due to decreased spike latency in inhibitory neurons. These mechanisms are demonstrated in simulations and in vitro current-clamp and dynamic-clamp experiments. Simulation results further indicate that the membrane potential noise amplitude has a large impact on network frequency and that the balance between excitatory and inhibitory currents controls network stability and sensitivity to external inputs. PMID:22275859
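
    The membrane-time-scale mechanism described above can be seen in a single-compartment sketch (hypothetical parameters, not the authors' network model): the effective time constant is tau_eff = C / (g_leak + g_syn), so adding incoherent synaptic conductance speeds the membrane, as in the following Python sketch.

        # Single-neuron sketch of how incoherent synaptic conductance shortens the
        # effective membrane time constant (hypothetical parameters, not the paper's model).
        import numpy as np

        C, g_leak = 200.0, 10.0                       # pF, nS (assumed)
        for g_syn in (0.0, 10.0, 30.0):               # added incoherent conductance, nS
            tau_eff = C / (g_leak + g_syn)            # ms, since pF/nS = ms
            print(f"g_syn={g_syn:5.1f} nS -> tau_eff={tau_eff:5.1f} ms")

        # Noisy-conductance LIF: the larger the total conductance, the faster the membrane.
        rng = np.random.default_rng(0)
        dt, T, E_L, E_syn, V_th, V_reset = 0.1, 1000.0, -70.0, 0.0, -50.0, -60.0
        V, g_syn, spikes = E_L, 0.0, 0
        for _ in range(int(T / dt)):
            g_syn += -g_syn * dt / 5.0 + 0.8 * rng.poisson(2.0 * dt)   # noisy barrage (assumed rates)
            V += (-g_leak * (V - E_L) - g_syn * (V - E_syn)) / C * dt
            if V > V_th:
                V, spikes = V_reset, spikes + 1
        print("spikes in 1 s:", spikes)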

  9. Early dynamical evolution of young substructured clusters

    NASA Astrophysics Data System (ADS)

    Dorval, Julien; Boily, Christian

    2017-03-01

    Stellar clusters form with a high level of substructure, inherited from the molecular cloud and the star formation process. Evidence from observations and simulations also indicates that the stars in such young clusters form a subvirial system. The subsequent dynamical evolution can cause substantial mass loss, ejecting a large part of the birth population into the field. It can also imprint the stellar population in ways that can still be inferred from observations of evolved clusters. N-body simulations allow a better understanding of these early twists and turns, given realistic initial conditions. Substructured, clumpy young clusters are usually obtained through pseudo-fractal growth and velocity inheritance. We introduce a new way to create clumpy initial conditions through a 'Hubble expansion' which naturally produces clumps with self-consistent velocities. In-depth analysis of the resulting clumps shows consistency with hydrodynamical simulations of young star clusters. We use these initial conditions to investigate the dynamical evolution of young subvirial clusters. We find the collapse to be soft, with hierarchical merging leading to a high level of mass segregation. The subsequent evolution is less pronounced than that following a cold-collapse formation scenario.
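
    A minimal version of such 'Hubble expansion' initial conditions can be sketched as follows (hypothetical particle number, radius, and expansion rate; not the authors' production code): stars drawn uniformly in a sphere receive purely radial velocities v = H r, so any density clump automatically carries a velocity field consistent with its position.

        # Hedged sketch of "Hubble expansion" initial conditions for an N-body cluster:
        # uniform sphere of stars with radial velocities v = H * r.
        # Hypothetical N, radius, and expansion rate; not the authors' production setup.
        import numpy as np

        rng = np.random.default_rng(1)
        N, R, H = 10_000, 1.0, 1.5                       # stars, initial radius, expansion rate (assumed units)

        # Uniform positions inside a sphere of radius R
        r = R * rng.random(N) ** (1.0 / 3.0)             # radius CDF of a uniform ball
        costheta = rng.uniform(-1.0, 1.0, N)
        phi = rng.uniform(0.0, 2.0 * np.pi, N)
        sintheta = np.sqrt(1.0 - costheta**2)
        pos = np.column_stack((r * sintheta * np.cos(phi),
                               r * sintheta * np.sin(phi),
                               r * costheta))

        vel = H * pos                                    # purely radial, consistent with any clump
        print("mean |v| =", np.linalg.norm(vel, axis=1).mean())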

  10. Influence of land use on rainfall simulation results in the Souss basin, Morocco

    NASA Astrophysics Data System (ADS)

    Peter, Klaus Daniel; Ries, Johannes B.; Hssaine, Ali Ait

    2013-04-01

    Situated between the High Atlas and the Anti-Atlas, the Souss basin is characterized by dynamic land use change. It is one of the fastest growing agricultural regions of Morocco. Traditional mixed agriculture is being replaced by extensive monocropped plantations of citrus fruits, bananas and vegetables, mainly for the European market. To implement this land use change and to further expand the plantations into marginal land formerly unsuitable for agriculture, land levelling by heavy machinery is used to plane the fields and close the widespread gullies. These gully systems cut deep between the plantations and other arable land. Their development started over 400 years ago with the introduction of sugar production. Heavy rainfall events lead to further strong soil and gully erosion in this normally arid region, which receives a mean annual precipitation of 200 mm. Gullies are cutting into the arable land or are re-excavating their old stream courses. On the test sites around the city of Taroudant, a total of 122 rainfall simulations were conducted to analyze the susceptibility of soils to surface runoff and soil erosion under different land use. A small portable nozzle rainfall simulator is used for the rainfall simulation experiments, quantifying runoff and erosion rates on micro-plots with a size of 0.28 m2. A motor pump feeds water, regulated by a flow meter, into a commercial full-cone nozzle at a height of 2 m. The rainfall intensity is maintained at about 40 mm h-1 for each of the 30-minute experiments. Ten categories of land use are classified for different stages of levelling, fallow land, cultivation and rangeland. Results show that mean runoff coefficients and mean sediment loads are significantly higher (1.4 and 3.5 times, respectively) on levelled study sites compared to undisturbed sites. However, the runoff coefficients of all land use types are relatively similar and reach high median values from 39 to 56 %. Only the rainfall simulations beneath mandarin trees in a plantation show low coefficients of about 10 %. The sediment loads are more strongly differentiated. On levelled areas, the simulations reach median sediment loads of 41 and 61 g m-2, respectively. In spite of high runoff coefficients, the lowest sediment loads of around 4.5 g m-2 are measured on old fallow land (>5 y.) and rangeland, both of which are protected by biological crusts. The same low result is found on the mandarin plantation. On younger fallow land (1-2 and 2-5 y.), as well as on stone-covered badlands and other anthropogenically influenced soils, medium soil losses between 18 and 25 g m-2 are reached. On sparsely vegetated grain fields, soil erosion remains high at 30 g m-2 because of incipient crusting, despite lower runoff coefficients. Land-levelling measures have the greatest influence on rainfall simulation results. Although runoff coefficients are similar on almost all land use types, clear differences in soil erosion due to different land use can be identified.
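
    For reference, the derived quantities follow from simple arithmetic on each 30-minute, 40 mm h-1 run; a hedged sketch with illustrative (not measured) runoff and sediment values:

        # Illustrative arithmetic only (not measured values from the study):
        # runoff coefficient and sediment load for one rainfall-simulation run.
        plot_area = 0.28            # m^2
        intensity = 40.0            # mm/h
        duration = 0.5              # h
        rain_depth = intensity * duration            # 20 mm applied
        rain_volume = rain_depth * plot_area         # litres, since 1 mm over 1 m^2 = 1 L

        runoff_volume = 2.5                          # litres collected (assumed)
        sediment_mass = 12.0                         # grams in the runoff (assumed)

        runoff_coefficient = runoff_volume / rain_volume          # dimensionless
        sediment_load = sediment_mass / plot_area                 # g m^-2

        print(f"runoff coefficient = {runoff_coefficient:.0%}")   # ~45 %
        print(f"sediment load      = {sediment_load:.1f} g m^-2") # ~42.9 g m^-2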

  11. Development of a High-Fidelity Simulation Environment for Shadow-Mode Assessments of Air Traffic Concepts

    NASA Technical Reports Server (NTRS)

    Robinson, John E., III; Lee, Alan; Lai, Chok Fung

    2017-01-01

    This paper describes the Shadow-Mode Assessment Using Realistic Technologies for the National Airspace System (SMART-NAS) Test Bed. The SMART-NAS Test Bed is an air traffic simulation platform being developed by the National Aeronautics and Space Administration (NASA). The SMART-NAS Test Bed's core purpose is to conduct high-fidelity, real-time, human-in-the-loop and automation-in-the-loop simulations of current and proposed future air traffic concepts for the United States' Next Generation Air Transportation System, called NextGen. The setup, configuration, coordination, and execution of real-time, human-in-the-loop air traffic management simulations are complex, tedious, time-intensive, and expensive. The SMART-NAS Test Bed framework is an alternative to the current approach and will provide services throughout the simulation workflow pipeline to help alleviate these shortcomings. The principal concepts to be simulated include advanced gate-to-gate, trajectory-based operations; widespread integration of novel aircraft such as unmanned vehicles; and real-time safety assurance technologies to enable autonomous operations. To make this possible, the SMART-NAS Test Bed will utilize Web-based technologies, cloud resources, and real-time, scalable communication middleware. This paper describes the SMART-NAS Test Bed's vision, purpose, concept of use, potential benefits, key capabilities, high-level requirements, architecture, software design, and usage.

  12. An equation-of-state-meter of quantum chromodynamics transition from deep learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pang, Long-Gang; Zhou, Kai; Su, Nan

    A primordial state of matter consisting of free quarks and gluons that existed in the early universe a few microseconds after the Big Bang is also expected to form in high-energy heavy-ion collisions. Determining the equation of state (EoS) of such primordial matter is the ultimate goal of high-energy heavy-ion experiments. Here we use supervised learning with a deep convolutional neural network to identify the EoS employed in the relativistic hydrodynamic simulations of heavy ion collisions. High-level correlations of particle spectra in transverse momentum and azimuthal angle learned by the network act as an effective EoS-meter in deciphering the nature of the phase transition in quantum chromodynamics. Finally, such an EoS-meter is model independent and insensitive to other simulation inputs, including the initial conditions for hydrodynamic simulations.
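
    A hedged sketch of such an EoS-meter, assuming a PyTorch implementation and a 15 x 48 (transverse momentum x azimuthal angle) spectrum per event with two EoS classes; the architecture and input binning here are illustrative and are not the published network:

        # Hedged sketch of a CNN "EoS-meter": classify a 2D particle spectrum
        # rho(pT, phi) into one of two equations of state. Illustrative PyTorch
        # architecture and input shape, not the published network.
        import torch
        import torch.nn as nn

        class EoSMeter(nn.Module):
            def __init__(self, n_classes=2):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d((4, 4)),
                )
                self.classifier = nn.Linear(32 * 4 * 4, n_classes)

            def forward(self, x):                 # x: (batch, 1, n_pT, n_phi)
                return self.classifier(self.features(x).flatten(1))

        model = EoSMeter()
        spectra = torch.rand(8, 1, 15, 48)        # 8 fake events (assumed binning)
        labels = torch.randint(0, 2, (8,))
        loss = nn.CrossEntropyLoss()(model(spectra), labels)
        loss.backward()                           # an optimizer step would follow in training
        print("logits shape:", model(spectra).shape)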

  13. An equation-of-state-meter of quantum chromodynamics transition from deep learning

    DOE PAGES

    Pang, Long-Gang; Zhou, Kai; Su, Nan; ...

    2018-01-15

    A primordial state of matter consisting of free quarks and gluons that existed in the early universe a few microseconds after the Big Bang is also expected to form in high-energy heavy-ion collisions. Determining the equation of state (EoS) of such primordial matter is the ultimate goal of high-energy heavy-ion experiments. Here we use supervised learning with a deep convolutional neural network to identify the EoS employed in the relativistic hydrodynamic simulations of heavy ion collisions. High-level correlations of particle spectra in transverse momentum and azimuthal angle learned by the network act as an effective EoS-meter in deciphering the nature of the phase transition in quantum chromodynamics. Finally, such an EoS-meter is model independent and insensitive to other simulation inputs, including the initial conditions for hydrodynamic simulations.

  14. Faunus: An object oriented framework for molecular simulation

    PubMed Central

    Lund, Mikael; Trulsson, Martin; Persson, Björn

    2008-01-01

    Background We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented by extensive use of code comments that – subsequently – are collected to automatically build a web-based manual. Results We show how an object oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high level coding abstraction. Conclusion C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained. PMID:18241331

  15. Radio Measurements of Air Showers with LOPES

    NASA Astrophysics Data System (ADS)

    Schröder, F. G.; Apel, W. D.; Arteaga-Velazquez, J. C.; Bähren, L.; Bekk, K.; Bertaina, M.; Biermann, P. L.; Blümer, J.; Bozdog, H.; Brancus, I. M.; Cantoni, E.; Chiavassa, A.; Daumiller, K.; de Souza, V.; Di Pierro, F.; Doll, P.; Engel, R.; Falcke, H.; Fuchs, B.; Fuhrmann, D.; Gemmeke, H.; Grupen, C.; Haungs, A.; Heck, D.; Hörandel, J. R.; Horneffer, A.; Huber, D.; Huege, T.; Isar, P. G.; Kampert, K.-H.; Kang, D.; Krömer, O.; Kuijpers, J.; Link, K.; Łuczak, P.; Ludwig, M.; Mathes, H. J.; Melissas, M.; Morello, C.; Oehlschläger, J.; Palmieri, N.; Pierog, T.; Rautenberg, J.; Rebel, H.; Roth, M.; Rühle, C.; Saftoiu, A.; Schieler, H.; Schmidt, A.; Sima, O.; Toma, G.; Trinchero, G. C.; Weindl, A.; Wochele, J.; Zabierowski, J.; Zensus, J. A.

    2013-02-01

    LOPES is a digital antenna array for the radio measurement of cosmic-ray air showers at energies around 1017 eV. It is triggered by the KASCADE-Grande air-shower array at the Karlsruhe Institute of Technology (KIT), Germany. Because of an absolute amplitude calibration and a sophisticated data analysis, LOPES can test models for the radio emission to an up-to-now unachieved level, thus improving our understanding of the radio emission mechanisms. Recent REAS simulations of the air-shower radio emission come closer to the measurements than any previously tested simulations. We have determined the radio-reconstruction precision of interesting air-shower parameters by comparing LOPES reconstructions to both REAS simulations and KASCADE-Grande measurements, and present our latest results for the angular resolution, the energy and the Xmax reconstruction based on the radio measurement of about 500 air showers. Although the precision of LOPES is limited by the high level of anthropogenic noise at KIT, it opens a promising perspective for next-generation radio arrays in regions with a lower ambient noise level.

  16. Ab Initio Simulations and Electronic Structure of Lithium-Doped Ionic Liquids: Structure, Transport, and Electrochemical Stability.

    PubMed

    Haskins, Justin B; Bauschlicher, Charles W; Lawson, John W

    2015-11-19

    Density functional theory (DFT), density functional theory molecular dynamics (DFT-MD), and classical molecular dynamics using polarizable force fields (PFF-MD) are employed to evaluate the influence of Li(+) on the structure, transport, and electrochemical stability of three potential ionic liquid electrolytes: N-methyl-N-butylpyrrolidinium bis(trifluoromethanesulfonyl)imide ([pyr14][TFSI]), N-methyl-N-propylpyrrolidinium bis(fluorosulfonyl)imide ([pyr13][FSI]), and 1-ethyl-3-methylimidazolium boron tetrafluoride ([EMIM][BF4]). We characterize the Li(+) solvation shell through DFT computations of [Li(Anion)n]((n-1)-) clusters, DFT-MD simulations of isolated Li(+) in small ionic liquid systems, and PFF-MD simulations with high Li-doping levels in large ionic liquid systems. At low levels of Li-salt doping, highly stable solvation shells having two to three anions are seen in both [pyr14][TFSI] and [pyr13][FSI], whereas solvation shells with four anions dominate in [EMIM][BF4]. At higher levels of doping, we find the formation of complex Li-network structures that increase the frequency of four anion-coordinated solvation shells. A comparison of computational and experimental Raman spectra for a wide range of [Li(Anion)n]((n-1)-) clusters shows that our proposed structures are consistent with experiment. We then compute the ion diffusion coefficients and find measures from small-cell DFT-MD simulations to be the correct order of magnitude, but influenced by small system size and short simulation length. Correcting for these errors with complementary PFF-MD simulations, we find DFT-MD measures to be in close agreement with experiment. Finally, we compute electrochemical windows from DFT computations on isolated ions, interacting cation/anion pairs, and liquid-phase systems with Li-doping. For the molecular-level computations, we generally find the difference between ionization energy and electron affinity from isolated ions and interacting cation/anion pairs to provide upper and lower bounds, respectively, to experiment. In the liquid phase, we find the difference between the lowest unoccupied and highest occupied electronic levels in pure and hybrid functionals to provide lower and upper bounds, respectively, to experiment. Li-doping in the liquid-phase systems results in electrochemical windows little changed from the neat systems.
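
    The ion diffusion coefficients referred to above are conventionally extracted from the Einstein relation, D = lim_{t->inf} MSD(t) / (6t); a hedged sketch of that post-processing step on an assumed unwrapped trajectory array (not the authors' DFT-MD or PFF-MD pipeline):

        # Hedged sketch: Li+ diffusion coefficient from the Einstein relation
        # D = MSD(t) / (6 t), fitted over the long-time part of the trajectory.
        # `traj` is an assumed unwrapped trajectory, shape (n_frames, n_ions, 3), in Angstrom.
        import numpy as np

        rng = np.random.default_rng(2)
        dt_fs = 50.0                                        # time between frames, fs (assumed)
        traj = np.cumsum(rng.normal(0, 0.05, (4000, 16, 3)), axis=0)   # fake random-walk data

        disp = traj - traj[0]                               # displacement from the first frame
        msd = (disp**2).sum(axis=2).mean(axis=1)            # MSD(t), averaged over ions, A^2
        t = np.arange(len(msd)) * dt_fs                     # fs

        # Linear fit over the second half of the run, where MSD is diffusive
        slope = np.polyfit(t[len(t)//2:], msd[len(t)//2:], 1)[0]   # A^2 / fs
        D = slope / 6.0 * 1e-16 / 1e-15                     # convert A^2/fs -> cm^2/s (factor 0.1)
        print(f"D = {D:.3e} cm^2/s")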

  17. Modeling, Simulation and Analysis of Public Key Infrastructure

    NASA Technical Reports Server (NTRS)

    Liu, Yuan-Kwei; Tuey, Richard; Ma, Paul (Technical Monitor)

    1998-01-01

    Security is an essential part of network communication. Advances in cryptography have provided solutions to many network security requirements. Public Key Infrastructure (PKI) is the foundation of cryptography applications. The main objective of this research is to design a model to simulate a reliable, scalable, manageable, and high-performance public key infrastructure. We build a model to simulate the NASA public key infrastructure using SimProcess and MATLAB software. The simulation spans from the top level all the way down to the computations needed for encryption, decryption, digital signatures, and a secure web server. The secure web server application could be utilized in wireless communications. The results of the simulation are analyzed and confirmed using queueing theory.
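
    The queueing-theory cross-check mentioned at the end can be illustrated with a standard M/M/1 estimate for a single signing server; the arrival and service rates below are assumed and this is not the SimProcess/MATLAB model itself:

        # Hedged M/M/1 sketch for a single certificate-signing server:
        # arrival rate lam (requests/s) and service rate mu (signatures/s) are assumed.
        def mm1(lam, mu):
            rho = lam / mu                       # server utilization, must be < 1
            L = rho / (1 - rho)                  # mean number of requests in the system
            W = 1 / (mu - lam)                   # mean response time (queueing + service), s
            Wq = rho / (mu - lam)                # mean time waiting in the queue, s
            return rho, L, W, Wq

        for lam in (20.0, 40.0, 45.0):
            rho, L, W, Wq = mm1(lam, mu=50.0)
            print(f"lam={lam:4.0f}/s  utilization={rho:.0%}  mean response={W*1000:.1f} ms")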

  18. An application of sedimentation simulation in Tahe oilfield

    NASA Astrophysics Data System (ADS)

    Tingting, He; Lei, Zhao; Xin, Tan; Dongxu, He

    2017-12-01

    A braided river delta developed in the Triassic lower oil formation in block 9 of the Tahe oilfield, but its sedimentary evolution process is unclear. Using sedimentation simulation technology, the sedimentation process and distribution of the braided river delta are studied based on geological parameters including sequence stratigraphic division, initial sedimentation environment, relative lake level change and accommodation change, source supply, and sedimentary transport pattern. The simulation results show that the error between simulated and actual strata thickness is small and that the single-well analysis from the simulation is highly consistent with the actual analysis, which indicates that the model is reliable. The study area records a retrogradational braided river delta evolution, which provides a favorable basis for fine reservoir description and prediction.

  19. A Coordinated Initialization Process for the Distributed Space Exploration Simulation (DSES)

    NASA Technical Reports Server (NTRS)

    Phillips, Robert; Dexter, Dan; Hasan, David; Crues, Edwin Z.

    2007-01-01

    This document describes the federate initialization process that was developed at the NASA Johnson Space Center with the HIIA Transfer Vehicle Flight Controller Trainer (HTV FCT) simulations and refined in the Distributed Space Exploration Simulation (DSES). These simulations use the High Level Architecture (HLA) IEEE 1516 to provide the communication and coordination between the distributed parts of the simulation. The purpose of the paper is to describe a generic initialization sequence that can be used to create a federate that can: 1. properly initialize all HLA objects, object instances, interactions, and time management; 2. check for the presence of all federates; 3. coordinate startup with other federates; and 4. robustly initialize and share initial object instance data with other federates.

  20. Defense Modeling and Simulation Initiative

    DTIC Science & Technology

    1992-05-01

    project solicitation and priority ranking process, and reviewing policy issues. The activities of the DMSO and MSWG are also supported by a series of... issues have been raised for discussion, including: promulgation of standards for the interoperability of models and simulations; modeling and...have been completed or will be completed in the near term. The policy issues should be defined at a high level in the near term, although their

  1. National nutrition planning in developing countries via gaming-simulation.

    PubMed

    Duke, R D; Cary, R

    1977-01-01

    A nutrition game designed for the Food Policy and Nutrition Division of the Food and Agriculture Organization of the United Nations to aid in planning national nutrition education programs in Third World countries is described. The Simulated Nutrition System Game allows high-level ministerial staff in developing countries to discuss, via a common language created by SNUS I, the issues, problems, and complexities of national nutrition programs.

  2. Atmospheric effects on METSAT data

    NASA Technical Reports Server (NTRS)

    Johnson, W. R.

    1983-01-01

    When using the J. V. Dave dataset, two channels of simulated METSAT advanced very high resolution radiometer (AVHRR) data compare favorably with actual data. Simulated NOAA-6 and NOAA-7 AVHRR data are presented as radiance profiles of reflected solar energy through the atmosphere with three different aerosol levels. Effects of the atmosphere on the data are presented as functions of satellite view angle or pixel position on the scanline. Vegetative index simulations are also profiled.

  3. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks

    PubMed Central

    Naveros, Francisco; Garrido, Jesus A.; Carrillo, Richard R.; Ros, Eduardo; Luque, Niceto R.

    2017-01-01

    Modeling and simulating the neural structures which make up our central neural system is instrumental for deciphering the computational neural cues beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from neural to behavioral levels. This paper focuses on overcoming the simulation problems (accuracy and performance) derived from using higher levels of mathematical complexity at a neural level. This study proposes different techniques for simulating neural models that hold incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranged from low to high neural complexity). The studied techniques are classified into two main families depending on how the neural-model dynamic evaluation is computed: the event-driven or the time-driven families. Whilst event-driven techniques pre-compile and store the neural dynamics within look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation time. We propose two modifications for the event-driven family: a look-up table recombination to better cope with the incremental neural complexity together with a better handling of the synchronous input activity. Regarding the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running in CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications which constitute the main contribution of this study systematically outperform the traditional event- and time-driven techniques under increasing levels of neural complexity. PMID:28223930
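
    The contrast between the two families can be made concrete for the simplest (LIF) case; a hedged sketch with assumed parameters, comparing a fixed-step (time-driven) Euler update with an event-driven update that jumps analytically between input spikes (the paper's look-up-table and bi-fixed-step methods are more elaborate):

        # Hedged sketch: time-driven vs event-driven update of one LIF neuron
        # (assumed parameters; subthreshold dynamics only, no threshold crossing shown).
        import math

        tau, E_L, w = 20.0, -65.0, 2.0          # ms, mV, mV jump per input spike (assumed)
        input_spikes = [5.0, 12.0, 13.0, 40.0]  # ms

        # Time-driven: fixed-step Euler integration across the whole simulated interval
        def time_driven(dt=0.1, T=50.0):
            V, k = E_L, 0
            for step in range(int(T / dt)):
                t = step * dt
                V += (-(V - E_L) / tau) * dt
                if k < len(input_spikes) and t >= input_spikes[k]:
                    V, k = V + w, k + 1
            return V

        # Event-driven: advance directly from event to event with the exact exponential decay
        def event_driven(T=50.0):
            V, t = E_L, 0.0
            for ts in input_spikes + [T]:
                V = E_L + (V - E_L) * math.exp(-(ts - t) / tau)   # exact decay over the gap
                t = ts
                if ts < T:
                    V += w
            return V

        print(time_driven(), event_driven())    # nearly identical membrane potentials at t = 50 ms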

  4. Comparison of hybrid and pure Monte Carlo shower generators on an event by event basis

    NASA Astrophysics Data System (ADS)

    Allen, J.; Drescher, H.-J.; Farrar, G.

    SENECA is a hybrid air shower simulation written by H. Drescher that utilizes both Monte Carlo simulation and cascade equations. By using the cascade equations only in the high-energy portion of the shower, where they are extremely accurate, SENECA is able to utilize the advantages in speed from the cascade equations yet still produce complete, three-dimensional particle distributions at ground level. We present a comparison, on an event-by-event basis, of SENECA and CORSIKA, a well-trusted MC simulation. By using the same first interaction in both SENECA and CORSIKA, the effect of the cascade equations can be studied within a single shower, rather than as averages over many showers. Our study shows that for showers produced in this manner, SENECA agrees with CORSIKA to a very high accuracy as to densities, energies, and timing information for individual species of ground-level particles from both iron and proton primaries with energies between 1 EeV and 100 EeV. Used properly, SENECA produces ground particle distributions virtually indistinguishable from those of CORSIKA in a fraction of the time. For example, for a shower induced by a 40 EeV proton simulated with 10^-6 thinning, SENECA is 10 times faster than CORSIKA.
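
    The hybrid idea, analytic treatment at high energy and Monte Carlo below a hand-over threshold, can be illustrated with a Heitler-style toy model (assumed, highly simplified splitting physics; not SENECA's cascade equations or CORSIKA):

        # Hedged toy of a hybrid shower scheme: above E_switch the cascade is treated
        # analytically (Heitler model: particle number doubles per splitting length),
        # below it each sub-shower is followed by explicit Monte Carlo splitting.
        # Assumed, highly simplified physics; not SENECA's cascade equations.
        import random

        E0, E_switch, E_stop = 1e9, 1e6, 1e3     # primary, hand-over and cut-off energies (GeV, assumed)

        # Analytic high-energy part: after n doublings, N = 2**n particles of energy E0 / 2**n.
        n_split = 0
        while E0 / 2 ** (n_split + 1) >= E_switch:
            n_split += 1
        sub_showers = 2 ** n_split
        E_sub = E0 / sub_showers

        # Monte Carlo low-energy part: follow one sub-shower and scale by their number.
        def mc_particles(E):
            """Count particles reaching E_stop via random (asymmetric) splittings."""
            if E < E_stop:
                return 1
            f = random.uniform(0.3, 0.7)         # unequal energy sharing (assumed)
            return mc_particles(E * f) + mc_particles(E * (1 - f))

        random.seed(3)
        total = sub_showers * mc_particles(E_sub)
        print(f"{sub_showers} analytic sub-showers of {E_sub:.2e} GeV, ~{total:.3e} ground particles")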

  5. Highly improved staggered quarks on the lattice with applications to charm physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Follana, E.; Davies, C.; Wong, K.

    2007-03-01

    We use perturbative Symanzik improvement to create a new staggered-quark action (HISQ) that has greatly reduced one-loop taste-exchange errors, no tree-level order a^2 errors, and no tree-level order (am)^4 errors to leading order in the quark's velocity v/c. We demonstrate with simulations that the resulting action has taste-exchange interactions that are 3-4 times smaller than the widely used ASQTAD action. We show how to bound errors due to taste exchange by comparing ASQTAD and HISQ simulations, and demonstrate with simulations that such errors are likely no more than 1% when HISQ is used for light quarks at lattice spacings of 1/10 fm or less. The suppression of (am)^4 errors also makes HISQ the most accurate discretization currently available for simulating c quarks. We demonstrate this in a new analysis of the ψ-η_c mass splitting using the HISQ action on lattices where am_c = 0.43 and 0.66, with full-QCD gluon configurations (from MILC). We obtain a result of 111(5) MeV which compares well with the experiment. We discuss applications of this formalism to D physics and present our first high-precision results for D_s mesons.

  6. PuReMD-GPU: A reactive molecular dynamics simulation package for GPUs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kylasa, S.B., E-mail: skylasa@purdue.edu; Aktulga, H.M., E-mail: hmaktulga@lbl.gov; Grama, A.Y., E-mail: ayg@cs.purdue.edu

    2014-09-01

    We present an efficient and highly accurate GP-GPU implementation of our community code, PuReMD, for reactive molecular dynamics simulations using the ReaxFF force field. PuReMD and its incorporation into LAMMPS (Reax/C) are used by a large number of research groups worldwide for simulating diverse systems ranging from biomembranes to explosives (RDX) at atomistic level of detail. The sub-femtosecond time-steps associated with ReaxFF strongly motivate significant improvements to per-timestep simulation time through effective use of GPUs. This paper presents, in detail, the design and implementation of PuReMD-GPU, which enables ReaxFF simulations on GPUs, as well as various performance optimization techniques we developed to obtain high performance on state-of-the-art hardware. Comprehensive experiments on model systems (bulk water and amorphous silica) are presented to quantify the performance improvements achieved by PuReMD-GPU and to verify its accuracy. In particular, our experiments show up to 16× improvement in runtime compared to our highly optimized CPU-only single-core ReaxFF implementation. PuReMD-GPU is a unique production code, and is currently available on request from the authors.

  7. Unsteady 3D flow simulations in cranial arterial tree

    NASA Astrophysics Data System (ADS)

    Grinberg, Leopold; Anor, Tomer; Madsen, Joseph; Karniadakis, George

    2008-11-01

    High resolution unsteady 3D flow simulations in major cranial arteries have been performed. Two cases were considered: 1) a healthy volunteer with a complete Circle of Willis (CoW); and 2) a patient with hydrocephalus and an incomplete CoW. Computation was performed on 3344 processors of the new half petaflop supercomputer in TACC. Two new numerical approaches were developed and implemented: 1) a new two-level domain decomposition method, which couples continuous and discontinuous Galerkin discretization of the computational domain; and 2) a new type of outflow boundary conditions, which imposes, in an accurate and computationally efficient manner, clinically measured flow rates. In the first simulation, a geometric model of 65 cranial arteries was reconstructed. Our simulation reveals a high degree of asymmetry in the flow at the left and right parts of the CoW and the presence of swirling flow in most of the CoW arteries. In the second simulation, one of the main findings was a high pressure drop at the right anterior communicating artery (PCA). Due to the incompleteness of the CoW and the pressure drop at the PCA, the right internal carotid artery supplies blood to most regions of the brain.

  8. Development of Partially-Coherent Wavefront Propagation Simulation Methods for 3rd and 4th Generation Synchrotron Radiation Sources.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chubar O.; Berman, L; Chu, Y.S.

    2012-04-04

    Partially-coherent wavefront propagation calculations have proven to be feasible and very beneficial in the design of beamlines for 3rd and 4th generation Synchrotron Radiation (SR) sources. These types of calculations use the framework of classical electrodynamics for the description, on the same accuracy level, of the emission by relativistic electrons moving in magnetic fields of accelerators, and the propagation of the emitted radiation wavefronts through beamline optical elements. This enables accurate prediction of performance characteristics for beamlines exploiting high SR brightness and/or high spectral flux. Detailed analysis of radiation degree of coherence, offered by the partially-coherent wavefront propagation method, is of paramount importance for modern storage-ring based SR sources, which, thanks to extremely small sub-nanometer-level electron beam emittances, produce substantial portions of coherent flux in X-ray spectral range. We describe the general approach to partially-coherent SR wavefront propagation simulations and present examples of such simulations performed using 'Synchrotron Radiation Workshop' (SRW) code for the parameters of hard X-ray undulator based beamlines at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory. These examples illustrate general characteristics of partially-coherent undulator radiation beams in low-emittance SR sources, and demonstrate advantages of applying high-accuracy physical-optics simulations to the optimization and performance prediction of X-ray optical beamlines in these new sources.

  9. Simulation model of a twin-tail, high performance airplane

    NASA Technical Reports Server (NTRS)

    Buttrill, Carey S.; Arbuckle, P. Douglas; Hoffler, Keith D.

    1992-01-01

    The mathematical model and associated computer program to simulate a twin-tailed, high-performance fighter airplane (McDonnell Douglas F/A-18) are described. The simulation program is written in the Advanced Continuous Simulation Language. The simulation math model includes the nonlinear six degree-of-freedom rigid-body equations, an engine model, sensors, and first-order actuators with rate and position limiting. A simplified form of the F/A-18 digital control laws (version 8.3.3) is implemented. The simulated control law includes only inner-loop augmentation in the up-and-away flight mode. The aerodynamic forces and moments are calculated from a wind-tunnel-derived database using table look-ups with linear interpolation. The aerodynamic database has an angle-of-attack range of -10 to +90 degrees and a sideslip range of -20 to +20 degrees. The effects of elastic deformation are incorporated in a quasi-static-elastic manner. Elastic degrees of freedom are not actively simulated. In the engine model, the throttle-commanded steady-state thrust level and the dynamic response characteristics of the engine are based on airflow rate as determined from a table look-up. Afterburner dynamics are switched in at a threshold based on the engine airflow and commanded thrust.
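
    Two of the building blocks named above are easy to illustrate: a first-order actuator with rate and position limiting, and an aerodynamic table look-up with linear interpolation. The numbers below are invented, not values from the F/A-18 database or the ACSL program:

        # Hedged sketch of two simulation building blocks mentioned in the abstract:
        # (1) a first-order actuator with rate and position limits,
        # (2) linear interpolation into a small (invented) aerodynamic table.
        import numpy as np

        def actuator_step(pos, cmd, dt, tau=0.05, rate_lim=60.0, pos_lim=30.0):
            """First-order lag toward `cmd` (deg), limited to rate_lim deg/s and +/- pos_lim deg."""
            rate = (cmd - pos) / tau
            rate = max(-rate_lim, min(rate_lim, rate))
            return max(-pos_lim, min(pos_lim, pos + rate * dt))

        alpha_grid = np.array([-10.0, 0.0, 10.0, 20.0, 30.0])   # deg (invented breakpoints)
        cl_table = np.array([-0.4, 0.2, 0.9, 1.3, 1.1])         # lift coefficient (invented)

        def lift_coefficient(alpha):
            return np.interp(alpha, alpha_grid, cl_table)        # table look-up, linear interpolation

        pos, dt = 0.0, 0.01
        for _ in range(50):                                      # 0.5 s of a 25-deg step command
            pos = actuator_step(pos, 25.0, dt)
        print(f"surface after 0.5 s: {pos:.1f} deg, CL(12 deg) = {lift_coefficient(12.0):.2f}")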

  10. Bringing good teaching cases "to life": a simulator-based medical education service.

    PubMed

    Gordon, James A; Oriol, Nancy E; Cooper, Jeffrey B

    2004-01-01

    Realistic medical simulation has expanded worldwide over the last decade. Such technology is playing an increasing role in medical education not merely because simulator sessions are enjoyable, but because they can provide an enhanced environment for experiential learning and reflective thought. High-fidelity patient simulators allow students of all levels to "practice" medicine without risk, providing a natural framework for the integration of basic and clinical science in a safe environment. Often described as "flight simulation for doctors," the rationale, utility, and range of medical simulations have been described elsewhere, yet the challenges of integrating this technology into the medical school curriculum have received little attention. The authors report how Harvard Medical School established an on-campus simulator program for students in 2001, building on the work of the Center for Medical Simulation in Boston. As an overarching structure for the process, faculty and residents developed a simulator-based "medical education service"-like any other medical teaching service, but designed exclusively to help students learn on the simulator alongside a clinician-mentor, on demand. Initial evaluations among both preclinical and clinical students suggest that simulation is highly accepted and increasingly demanded. For some learners, simulation may allow complex information to be understood and retained more efficiently than can occur with traditional methods. Moreover, the process outlined here suggests that simulation can be integrated into existing curricula of almost any medical school or teaching hospital in an efficient and cost-effective manner.

  11. CFD Modeling of Swirl and Nonswirl Gas Injections into Liquid Baths Using Top Submerged Lances

    NASA Astrophysics Data System (ADS)

    Huda, Nazmul; Naser, J.; Brooks, G.; Reuter, M. A.; Matusewicz, R. W.

    2010-02-01

    Fluid flow phenomena in a cylindrical bath stirred by top submerged lance (TSL) gas injection were investigated using the computational fluid dynamics (CFD) modeling technique for an isothermal air-water system. The multiphase flow simulation, based on the Euler-Euler approach, elucidated the effect of swirl and nonswirl flow inside the bath. The effects of the lance submergence level and the air flow rate were also investigated. The simulation results for the velocity fields and the generation of turbulence in the bath were validated against existing experimental data from the previous water model study by Morsi et al.[1] The model was extended to measure the degree of splash generation for different liquid densities at certain heights above the free surface. The simulation results showed that the two-thirds lance submergence level provided better mixing and high liquid velocities for the generation of turbulence inside the water bath. However, it is also responsible for generating more splashing in the bath compared with the one-third lance submergence level. An approach generally used in heating, ventilation, and air conditioning (HVAC) system simulations was applied to predict the convective mixing phenomena. The simulation results for the air-water system showed that mean convective mixing for swirl flow is more than twice that of nonswirl flow in close proximity to the lance. A semiempirical equation was proposed from the results of the present simulation to estimate the vertical penetration distance of the air jet injected through the annulus of the lance into the cylindrical vessel of the model, which can be expressed as L_va = 0.275(d_o - d_i)Fr_m^0.4745. More work still needs to be done to predict the detailed process kinetics in a real furnace by considering nonisothermal high-temperature systems with chemical reactions.
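
    The proposed correlation can be evaluated directly; a hedged sketch in which d_o and d_i are taken to be the outer and inner lance diameters and Fr_m the modified Froude number of the injected gas (illustrative values, not the study's operating conditions):

        # Hedged sketch: evaluate the proposed correlation for the vertical penetration
        # distance of the annular air jet, L_va = 0.275 * (d_o - d_i) * Fr_m ** 0.4745.
        # Lance diameters and Froude numbers below are illustrative, not the study's values.
        def penetration_distance(d_outer, d_inner, froude_m):
            return 0.275 * (d_outer - d_inner) * froude_m ** 0.4745

        d_outer, d_inner = 0.040, 0.025        # lance outer/inner diameter, m (assumed)
        for froude_m in (50.0, 200.0, 800.0):
            L_va = penetration_distance(d_outer, d_inner, froude_m)
            print(f"Fr_m={froude_m:6.0f}  L_va={L_va*1000:6.1f} mm")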

  12. Turbofan Engine Simulated in a Graphical Simulation Environment

    NASA Technical Reports Server (NTRS)

    Parker, Khary I.; Guo, Ten-Huei

    2004-01-01

    Recently, there has been an increase in the development of intelligent engine technology with advanced active component control. The computer engine models used in these control studies are component-level models (CLM): models that link individual component models of state-space and nonlinear algebraic equations, written in a computer language such as Fortran. The difficulty faced in performing control studies on Fortran-based models is that Fortran is not supported with control design and analysis tools, so there is no means for implementing real-time control. It is desirable to have a simulation environment that is straightforward, has modular graphical components, and allows easy access to health, control, and engine parameters through a graphical user interface. Such a tool should also provide the ability to convert a control design into real-time code, helping to make it an extremely powerful tool in control and diagnostic system development. The Controls and Dynamics Technologies Branch at the NASA Glenn Research Center has developed and demonstrated a flexible, generic turbofan engine simulation platform that can meet these objectives, known as the Modular Aero-Propulsion System Simulation (MAPSS). MAPSS is a Simulink-based implementation of a Fortran-based, modern, high-pressure-ratio, dual-spool, low-bypass, military-type variable-cycle engine with a digital controller. Simulink (The MathWorks, Natick, MA) is a computer-aided control design and simulation package that allows the graphical representation of dynamic systems in block diagram form. MAPSS is a nonlinear, non-real-time system composed of controller and actuator dynamics (CAD) and component-level model (CLM) modules; its inputs and outputs include Mach number, power lever angle, altitude, ambient temperature change, and afterburner fuel flow versus time, along with the CAD and CLM interface signals and initial conditions. The controller in the CAD module emulates the functionality of a digital controller, which has a typical update rate of 50 Hz. The CLM module simulates the dynamics of the engine components and uses an update rate of 2500 Hz, which is needed to iterate to balance mass and energy among system components. The actuators in the CAD module use the same sampling rate as those in the CLM. MAPSS was validated via open-loop and closed-loop comparisons with the Fortran simulation, examining three states of the model: low-pressure spool speed, high-pressure spool speed, and the average metal temperature measured from the combustor to the high-pressure turbine. In steady state, the error between the simulations is less than 1 percent. During a transient, the difference between the simulations is due to a correction in MAPSS that prevents the gas flow in the bypass duct inlet from flowing forward instead of toward the aft end, which occurs in the Fortran simulation for power lever angles greater than 35 degrees.
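
    The two quoted update rates (50 Hz controller, 2500 Hz component-level model) imply a multirate loop in which the controller command is held for 50 plant steps; a hedged sketch with a toy first-order 'spool' and invented PI gains, not the MAPSS engine model:

        # Hedged multirate sketch: a 50 Hz digital controller wrapped around a
        # 2500 Hz plant model, i.e. the control command is held for 50 plant steps.
        # The first-order "spool" plant and PI gains are invented, not MAPSS internals.
        dt_plant = 1.0 / 2500.0
        steps_per_control = 50                       # 2500 Hz / 50 Hz

        speed, fuel, integ = 0.0, 0.0, 0.0           # normalized spool speed, fuel flow, integrator
        setpoint, kp, ki, tau = 1.0, 0.8, 2.0, 0.5   # invented controller gains and plant lag

        for step in range(2500):                     # simulate 1 second
            if step % steps_per_control == 0:        # controller runs at 50 Hz
                error = setpoint - speed
                integ += error / 50.0
                fuel = kp * error + ki * integ       # held constant between controller updates
            # plant runs at 2500 Hz: first-order response of spool speed to fuel flow
            speed += (fuel - speed) / tau * dt_plant

        print(f"spool speed after 1 s: {speed:.3f} (setpoint {setpoint})")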

  13. A near uniform basin-wide sea level fluctuation over the Japan/East Sea: A semienclosed sea with multiple straits

    NASA Astrophysics Data System (ADS)

    Kim, Seung-Bum; Fukumori, Ichiro

    2008-06-01

    Sea level of the Japan/East Sea observed by the TOPEX/Poseidon (T/P) satellite altimeter is analyzed using a 1/4°-resolution ocean general circulation model. A significant fraction of the Japan/East Sea sea level variability is found to be spatially uniform with periods ranging from 20 d to a year. The model simulation is consistent with T/P records in terms of the basin-wide sea level fluctuation's spectral energy and coherence. The simulation indicates that the changes are barotropic in nature and controlled, notably at high frequencies, by the net mass transport through the straits of the Japan/East Sea driven by winds in the vicinity of the Korea/Tsushima and Soya Straits. A series of barotropic simulations suggest that the sea level fluctuations are the result of a dynamic balance at the straits among near-strait winds, friction, and geostrophic control. The basin-wide sea level response is a linear superposition of changes due to winds near the individual straits. In particular, a basin-wide sea level response can be established by winds near either one of the straits alone. For the specific geometry and winds, winds near the Soya Strait have a larger impact on the Japan/East Sea mean sea level than those near the Korea/Tsushima Strait.

  14. A comparison between electromechanical and pneumatic-controlled knee simulators for the investigation of wear of total knee replacements.

    PubMed

    Abdelgaied, Abdellatif; Fisher, John; Jennings, Louise M

    2017-07-01

    More robust preclinical experimental wear simulation methods are required in order to simulate a wider range of activities, observed in different patient populations such as younger more active patients, as well as to fully meet and be capable of going well beyond the existing requirements of the relevant international standards. A new six-station electromechanically driven simulator (Simulation Solutions, UK) with five fully independently controlled axes of articulation for each station, capable of replicating deep knee bending as well as other adverse conditions, which can be operated in either force or displacement control with improved input kinematic following, has been developed to meet these requirements. This study investigated the wear of a fixed-bearing total knee replacement using this electromechanically driven fully independent knee simulator and compared it to previous data from a predominantly pneumatically controlled simulator in which each station was not fully independently controlled. In addition, the kinematic performance and the repeatability of the simulators have been investigated and compared to the international standard requirements. The wear rates from the electromechanical and pneumatic knee simulators were not significantly different, with wear rates of 2.6 ± 0.9 and 2.7 ± 0.9 mm3/million cycles (MC; mean ± 95% confidence interval, p = 0.99) and 5.4 ± 1.4 and 6.7 ± 1.5 mm3/MC (mean ± 95% confidence interval, p = 0.54) from the electromechanical and pneumatic simulators under intermediate levels (maximum 5 mm) and high levels (maximum 10 mm) of anterior-posterior displacements, respectively. However, the output kinematic profiles of the control system, which drive the motion of the simulator, followed the input kinematic profiles more closely on the electromechanical simulator than the pneumatic simulator. In addition, the electromechanical simulator was capable of following kinematic and loading input cycles within the tolerances of the international standard requirements (ISO 14243-3). The new-generation electromechanical knee simulator with fully independent control has the potential to be used for a much wider range of kinematic conditions, including high-flexion and other severe conditions, due to its improved capability and performance in comparison to the previously used pneumatic-controlled simulators.

  15. A comparison between electromechanical and pneumatic-controlled knee simulators for the investigation of wear of total knee replacements

    PubMed Central

    Abdelgaied, Abdellatif; Fisher, John; Jennings, Louise M

    2017-01-01

    More robust preclinical experimental wear simulation methods are required in order to simulate a wider range of activities, observed in different patient populations such as younger more active patients, as well as to fully meet and be capable of going well beyond the existing requirements of the relevant international standards. A new six-station electromechanically driven simulator (Simulation Solutions, UK) with five fully independently controlled axes of articulation for each station, capable of replicating deep knee bending as well as other adverse conditions, which can be operated in either force or displacement control with improved input kinematic following, has been developed to meet these requirements. This study investigated the wear of a fixed-bearing total knee replacement using this electromechanically driven fully independent knee simulator and compared it to previous data from a predominantly pneumatically controlled simulator in which each station was not fully independently controlled. In addition, the kinematic performance and the repeatability of the simulators have been investigated and compared to the international standard requirements. The wear rates from the electromechanical and pneumatic knee simulators were not significantly different, with wear rates of 2.6 ± 0.9 and 2.7 ± 0.9 mm3/million cycles (MC; mean ± 95% confidence interval, p = 0.99) and 5.4 ± 1.4 and 6.7 ± 1.5 mm3/MC (mean ± 95 confidence interval, p = 0.54) from the electromechanical and pneumatic simulators under intermediate levels (maximum 5 mm) and high levels (maximum 10 mm) of anterior–posterior displacements, respectively. However, the output kinematic profiles of the control system, which drive the motion of the simulator, followed the input kinematic profiles more closely on the electromechanical simulator than the pneumatic simulator. In addition, the electromechanical simulator was capable of following kinematic and loading input cycles within the tolerances of the international standard requirements (ISO 14243-3). The new-generation electromechanical knee simulator with fully independent control has the potential to be used for a much wider range of kinematic conditions, including high-flexion and other severe conditions, due to its improved capability and performance in comparison to the previously used pneumatic-controlled simulators. PMID:28661228

  16. Simulation for learning and teaching procedural skills: the state of the science.

    PubMed

    Nestel, Debra; Groom, Jeffrey; Eikeland-Husebø, Sissel; O'Donnell, John M

    2011-08-01

    Simulation is increasingly used to support learning of procedural skills. Our panel was tasked with summarizing the "best evidence." We addressed the following question: To what extent does simulation support learning and teaching in procedural skills? We conducted a literature search from 2000 to 2010 using Medline, CINAHL, ERIC, and PSYCHINFO databases. Inclusion criteria were established and then data extracted from abstracts according to several categories. Although secondary sources of literature were sourced from key informants and participants at the "Research Consensus Summit: State of the Science," they were not included in the data extraction process but were used to inform discussion. Eighty-one of 1,575 abstracts met inclusion criteria. The uses of simulation for learning and teaching procedural skills were diverse. The most commonly reported simulator type was manikins (n = 17), followed by simulated patients (n = 14), anatomic simulators (eg, part-task) (n = 12), and others. For research design, most abstracts (n = 52) were at Level IV of the National Health and Medical Research Council classification (ie, case series, posttest, or pretest/posttest, with no control group, narrative reviews, and editorials). The most frequent Best Evidence Medical Education ranking was for conclusions probable (n = 37). Using the modified Kirkpatrick scale for impact of educational intervention, the most frequent classification was for modification of knowledge and/or skills (Level 2b) (n = 52). Abstracts assessed skills (n = 47), knowledge (n = 32), and attitude (n = 15) with the majority demonstrating improvements after simulation-based interventions. Studies focused on immediate gains and skills assessments were usually conducted in simulation. The current state of the science finds that simulation usually leads to improved knowledge and skills. Learners and instructors express high levels of satisfaction with the method. While most studies focus on short-term gains attained in the simulation setting, a small number support the transfer of simulation learning to clinical practice. Further study is needed to optimize the alignment of learner, instructor, simulator, setting, and simulation for learning and teaching procedural skills. Instructional design and educational theory, contextualization, transferability, accessibility, and scalability must all be considered in simulation-based education programs. More consistently, robust research designs are required to strengthen the evidence.

  17. Development of an Integrated Process, Modeling and Simulation Platform for Performance-Based Design of Low-Energy and High IEQ Buildings

    ERIC Educational Resources Information Center

    Chen, Yixing

    2013-01-01

    The objective of this study was to develop a "Virtual Design Studio (VDS)": a software platform for integrated, coordinated and optimized design of green building systems with low energy consumption, high indoor environmental quality (IEQ), and high level of sustainability. The VDS is intended to assist collaborating architects,…

  18. The use of real-time, hardware-in-the-loop simulation in the design and development of the new Hughes HS601 spacecraft attitude control system

    NASA Technical Reports Server (NTRS)

    Slafer, Loren I.

    1989-01-01

    Real-time simulation and hardware-in-the-loop testing are being used extensively in all phases of the design, development, and testing of the attitude control system (ACS) for the new Hughes HS601 satellite bus. Real-time, hardware-in-the-loop simulation, integrated with traditional analysis and pure simulation activities, is shown to provide a highly efficient and productive overall development program. Implementation of high-fidelity simulations of the satellite dynamics and control system algorithms, capable of real-time execution (using Applied Dynamics International's System 100), provides a tool which can be integrated with the critical flight microprocessor to create a mixed simulation test (MST). The MST creates a highly accurate, detailed simulated on-orbit test environment, capable of open- and closed-loop ACS testing, in which the ACS design can be validated. The MST is shown to provide a valuable extension of traditional test methods. A description of the MST configuration is presented, including the spacecraft dynamics simulation model, sensor and actuator emulators, and the test support system. Overall system performance parameters are presented. MST applications are discussed, including supporting ACS design, developing on-orbit system performance predictions, flight software development and qualification testing (augmenting traditional software-based testing), mission planning, and a cost-effective subsystem-level acceptance test. The MST is shown to provide an ideal tool with which the ACS designer can fly the spacecraft on the ground.

  19. Millimeter-Wave Wireless Power Transfer Technology for Space Applications

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Goutam; Manohara, Harish; Mojarradi, Mohammad M.; Vo, Tuan A.; Mojarradi, Hadi; Bae, Sam Y.; Marzwell, Neville

    2008-01-01

    In this paper we present a new compact, scalable, and low-cost technology for efficiently receiving power using RF waves at 94 GHz. This technology employs a highly innovative array of slot antennas integrated on a substrate composed of gold (Au), silicon (Si), and silicon dioxide (SiO2) layers. The length of the slots and the spacing between them are optimized for a highly efficient beam through a 3-D electromagnetic simulation process. Antenna simulation results show a good beam profile with very low side lobe levels and better than 93% antenna efficiency.

  20. A Prototype Scintillating-Fibre Tracker for the Cosmic-ray Muon Tomography of Legacy Nuclear Waste Containers

    NASA Astrophysics Data System (ADS)

    Kaiser, R.; Clarkson, A.; Hamilton, D. J.; Hoek, M.; Ireland, D. G.; Johnston, J. R.; Keri, T.; Lumsden, S.; Mahon, D. F.; McKinnon, B.; Murray, M.; Nutbeam-Tuffs, S.; Shearer, C.; Staines, C.; Yang, G.; Zimmerman, C.

    2014-03-01

    Cosmic-ray muons are highly penetrating charged particles observed at sea level with a flux of approximately 1 cm^-2 min^-1. They interact with matter primarily through Coulomb scattering, which can be exploited in muon tomography to image objects within industrial nuclear waste containers. This paper presents the prototype scintillating-fibre detector developed for this application at the University of Glasgow. Experimental results taken with test objects are shown in comparison to results from GEANT4 simulations. These results verify the simulation and show discrimination between the low-, medium- and high-Z materials imaged.
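
    The Z discrimination rests on multiple Coulomb scattering; a hedged sketch using the standard Highland formula for the RMS scattering angle, with approximate textbook radiation lengths for representative low-, medium- and high-Z materials (detector response and reconstruction are not modelled):

        # Hedged sketch: RMS multiple-scattering angle of a cosmic-ray muon from the
        # Highland formula, theta0 = (13.6 MeV / (beta*c*p)) * sqrt(x/X0) * (1 + 0.038*ln(x/X0)).
        # Radiation lengths below are approximate textbook values; detector effects ignored.
        from math import sqrt, log

        def theta0_mrad(p_mev, thickness_cm, rad_length_cm, beta=1.0):
            t = thickness_cm / rad_length_cm                      # thickness in radiation lengths
            return 13.6 / (beta * p_mev) * sqrt(t) * (1 + 0.038 * log(t)) * 1000.0

        p = 3000.0                       # ~3 GeV/c, a typical sea-level muon momentum (assumed)
        materials = {"concrete (low Z)": 11.6, "iron (medium Z)": 1.76, "uranium (high Z)": 0.32}
        for name, X0 in materials.items():
            print(f"{name:18s}: theta0 ~ {theta0_mrad(p, 10.0, X0):5.1f} mrad through 10 cm")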

  1. Ablation from High Velocity Clouds: A Source for Low Velocity Ionized Gas

    NASA Astrophysics Data System (ADS)

    Shelton, Robin L.; Henley, D. B.; Kwak, K.

    2012-05-01

    High velocity clouds shed material as they move through the Galaxy. This material mixes with the Galactic interstellar medium, resulting in plasma whose temperature and ionization levels are intermediate between those of the cloud and those of the Galaxy. As time passes, the mixed material slows to the velocity of the ambient gas. This raises the possibility that initially warm (T ~ 10^3 K), poorly ionized clouds moving through hot (T ~ 10^6 K), very highly ionized ambient gas could lead to mixed gas that harbors significant numbers of high ions (O+5, N+4, and C+3) and thus helps to explain the large numbers of low-velocity high ions seen on high latitude lines of sight through the Galactic halo. We have used a series of detailed FLASH simulations in order to track the hydrodynamics of warm clouds embedded in hot Galactic halo gas. These simulations tracked the ablated material as it mixed and slowed to low velocities. By following the ionization levels of the gas in a time-dependent fashion, we determined that the mixed material is rich in O+5, N+4, and C+3 ions and continues to contain these ions for some time after slowing to low velocities. Combining our simulation results with estimates of the high velocity cloud infall rate leads to the finding that the mixed gas can account for 1/3 of the normal-velocity O+5 column density found on high latitude lines of sight. It accounts for lesser fractions of the N+4 and C+3 column densities. We will discuss our high velocity cloud results as part of a composite halo model that also includes cooling Galactic fountain gas, isolated supernova remnants, and ionizing photons.

  2. Nonlinear gyrokinetic simulations of the I-mode high confinement regime and comparisons with experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, A. E., E-mail: whitea@mit.edu; Howard, N. T.; Creely, A. J.

    2015-05-15

    For the first time, nonlinear gyrokinetic simulations of I-mode plasmas are performed and compared with experiment. I-mode is a high confinement regime, featuring energy confinement similar to H-mode, but without enhanced particle and impurity particle confinement [D. G. Whyte et al., Nucl. Fusion 50, 105005 (2010)]. As a consequence of the separation between heat and particle transport, I-mode exhibits several favorable characteristics compared to H-mode. The nonlinear gyrokinetic code GYRO [J. Candy and R. E. Waltz, J Comput. Phys. 186, 545 (2003)] is used to explore the effects of E × B shear and profile stiffness in I-mode and compare with L-mode. The nonlinear GYRO simulations show that I-mode core ion temperature and electron temperature profiles are more stiff than L-mode core plasmas. Scans of the input E × B shear in GYRO simulations show that E × B shearing of turbulence is a stronger effect in the core of I-mode than L-mode. The nonlinear simulations match the observed reductions in long wavelength density fluctuation levels across the L-I transition but underestimate the reduction of long wavelength electron temperature fluctuation levels. The comparisons between experiment and gyrokinetic simulations for I-mode suggest that increased E × B shearing of turbulence combined with increased profile stiffness are responsible for the reductions in core turbulence observed in the experiment, and that I-mode resembles H-mode plasmas more than L-mode plasmas with regards to marginal stability and temperature profile stiffness.

  3. The Oceanographic Multipurpose Software Environment (OMUSE v1.0)

    NASA Astrophysics Data System (ADS)

    Pelupessy, Inti; van Werkhoven, Ben; van Elteren, Arjen; Viebahn, Jan; Candy, Adam; Portegies Zwart, Simon; Dijkstra, Henk

    2017-08-01

    In this paper we present the Oceanographic Multipurpose Software Environment (OMUSE). OMUSE aims to provide a homogeneous environment for existing or newly developed numerical ocean simulation codes, simplifying their use and deployment. In this way, numerical experiments that combine ocean models representing different physics or spanning different ranges of physical scales can be easily designed. Rapid development of simulation models is made possible through the creation of simple high-level scripts. The low-level core of the abstraction in OMUSE is designed to deploy these simulations efficiently on heterogeneous high-performance computing resources. Cross-verification of simulation models with different codes and numerical methods is facilitated by the unified interface that OMUSE provides. Reproducibility in numerical experiments is fostered by allowing complex numerical experiments to be expressed in portable scripts that conform to a common OMUSE interface. Here, we present the design of OMUSE as well as the modules and model components currently included, which range from a simple conceptual quasi-geostrophic solver to the global circulation model POP (Parallel Ocean Program). The uniform access to the codes' simulation state and the extensive automation of data transfer and conversion operations aids the implementation of model couplings. We discuss the types of couplings that can be implemented using OMUSE. We also present example applications that demonstrate the straightforward model initialization and the concurrent use of data analysis tools on a running model. We give examples of multiscale and multiphysics simulations by embedding a regional ocean model into a global ocean model and by coupling a surface wave propagation model with a coastal circulation model.
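    The unified-interface idea above lends itself to a short illustration. The Python sketch below is hypothetical: the class and method names (OceanModelInterface, evolve_to, and so on) are illustrative assumptions, not the actual OMUSE API; it only shows how a common interface lets one script drive and couple two models concurrently.

    ```python
    # Hypothetical sketch of a unified model interface in the spirit of the
    # environment described above. Class and method names are illustrative
    # assumptions, NOT the actual OMUSE API.

    class OceanModelInterface:
        """Common interface every wrapped ocean code is assumed to expose."""
        def initialize(self, grid_shape): raise NotImplementedError
        def get_state(self): raise NotImplementedError
        def set_forcing(self, field): raise NotImplementedError
        def evolve_to(self, time): raise NotImplementedError

    class ToyModel(OceanModelInterface):
        """Stand-in for a real code (e.g., a conceptual quasi-geostrophic solver)."""
        def initialize(self, grid_shape):
            self.time = 0.0
            self.state = [[0.0] * grid_shape[1] for _ in range(grid_shape[0])]
        def get_state(self):
            return self.state
        def set_forcing(self, field):
            self.forcing = field          # a real code would interpolate the field
        def evolve_to(self, time):
            self.time = time              # a real code would advance its dynamics

    def couple(outer, inner, end_time, exchange_interval):
        """Drive two models concurrently, exchanging state at fixed intervals."""
        t = 0.0
        while t < end_time:
            t += exchange_interval
            outer.evolve_to(t)
            inner.set_forcing(outer.get_state())   # one-way nesting for brevity
            inner.evolve_to(t)

    global_model, regional_model = ToyModel(), ToyModel()
    global_model.initialize((64, 64))
    regional_model.initialize((128, 128))
    couple(global_model, regional_model, end_time=10.0, exchange_interval=1.0)
    ```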

  4. Organizational culture shapes the adoption and incorporation of simulation into nursing curricula: a grounded theory study.

    PubMed

    Taplay, Karyn; Jack, Susan M; Baxter, Pamela; Eva, Kevin; Martin, Lynn

    2014-01-01

    Purpose. To create a substantive mid-range theory explaining how the organizational cultures of undergraduate nursing programs shape the adoption and incorporation of mid-to high-level technical fidelity simulators as a teaching strategy within curricula. Method. A constructivist grounded theory was used to guide this study which was conducted in Ontario, Canada, during 2011-12. Semistructured interviews (n = 43) with participants that included nursing administrators, nursing faculty, and simulation leaders across multiple programs (n = 13) informed this study. Additionally, key documents (n = 67) were reviewed. Purposeful and theoretical sampling was used and data were collected and analyzed simultaneously. Data were compared among and between sites. Findings. The organizational elements that shape simulation in nursing (OESSN) model depicts five key organizational factors at the nursing program level that shaped the adoption and incorporation of simulation: (1) leaders working in tandem, (2) information exchange, (3) physical locale, (4) shared motivators, and (5) scaffolding to manage change. Conclusions. The OESSN model provides an explanation of the organizational factors that contributed to the adoption and incorporation of simulation into nursing curricula. Nursing programs that use the OESSN model may experience a more rapid or broad uptake of simulation when organizational factors that impact adoption and incorporation are considered and planned for.

  5. Organizational Culture Shapes the Adoption and Incorporation of Simulation into Nursing Curricula: A Grounded Theory Study

    PubMed Central

    Jack, Susan M.; Eva, Kevin; Martin, Lynn

    2014-01-01

    Purpose. To create a substantive mid-range theory explaining how the organizational cultures of undergraduate nursing programs shape the adoption and incorporation of mid-to high-level technical fidelity simulators as a teaching strategy within curricula. Method. A constructivist grounded theory was used to guide this study which was conducted in Ontario, Canada, during 2011-12. Semistructured interviews (n = 43) with participants that included nursing administrators, nursing faculty, and simulation leaders across multiple programs (n = 13) informed this study. Additionally, key documents (n = 67) were reviewed. Purposeful and theoretical sampling was used and data were collected and analyzed simultaneously. Data were compared among and between sites. Findings. The organizational elements that shape simulation in nursing (OESSN) model depicts five key organizational factors at the nursing program level that shaped the adoption and incorporation of simulation: (1) leaders working in tandem, (2) information exchange, (3) physical locale, (4) shared motivators, and (5) scaffolding to manage change. Conclusions. The OESSN model provides an explanation of the organizational factors that contributed to the adoption and incorporation of simulation into nursing curricula. Nursing programs that use the OESSN model may experience a more rapid or broad uptake of simulation when organizational factors that impact adoption and incorporation are considered and planned for. PMID:24818018

  6. Multiple Point Statistics algorithm based on direct sampling and multi-resolution images

    NASA Astrophysics Data System (ADS)

    Julien, S.; Renard, P.; Chugunova, T.

    2017-12-01

    Multiple Point Statistics (MPS) has been popular in the Earth Sciences for more than a decade because these methods can generate random fields that reproduce the highly complex spatial features of a conceptual model, the training image, whereas classical geostatistical techniques based on two-point statistics (covariance or variogram) fail to generate realistic models. Among MPS methods, direct sampling borrows patterns from the training image to populate a simulation grid. The grid is filled sequentially by visiting its nodes in a random order; the patterns, which contain a fixed number of nodes, become narrower as the simulation proceeds and the grid becomes more densely informed. Hence, large-scale structures are captured at the beginning of the simulation and small-scale ones at the end. However, MPS may mix spatial characteristics that are distinguishable at different scales in the training image and thereby lose the spatial arrangement of the different structures. To overcome this limitation, we propose to perform MPS simulation using a decomposition of the training image into a set of images at multiple resolutions. Convolving the training image with a Gaussian kernel yields a lower-resolution image, and iterating this process builds a pyramid of images showing fewer details at each level, much as is done in image processing to reduce the storage size of a photograph. Direct sampling is then used to simulate the coarsest resolution level and subsequently each finer level, up to the finest resolution, conditioned on the level one rank coarser. This scheme helps reproduce the spatial structures of the training image at every scale and thus generates more realistic models. We illustrate the method with aerial and satellite photographs and natural textures; such images often display characteristic structures at different scales and are well suited to MPS simulation techniques.
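    A minimal sketch of the multi-resolution idea, assuming a 2D continuous training image stored as a NumPy array. The Gaussian pyramid follows the description above, while the direct-sampling pass itself is reduced to a placeholder (random pattern borrowing) so that only the coarse-to-fine control flow is shown; all function names are illustrative.

    ```python
    # Coarse-to-fine MPS sketch: Gaussian pyramid of the training image, then
    # one simulation pass per level. The direct-sampling step is a placeholder.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def build_pyramid(training_image, n_levels, sigma=1.0):
        """Gaussian pyramid: each level is smoothed and subsampled by 2."""
        pyramid = [training_image]
        for _ in range(n_levels - 1):
            smoothed = gaussian_filter(pyramid[-1], sigma)
            pyramid.append(smoothed[::2, ::2])
        return pyramid  # pyramid[0] = finest, pyramid[-1] = coarsest

    def direct_sampling_level(training_level, shape, conditioning=None):
        """Placeholder for one direct-sampling pass at a single resolution;
        a real implementation would match data events against the training
        level and honor the conditioning from the coarser level."""
        rng = np.random.default_rng(0)
        rows = rng.integers(0, training_level.shape[0], size=shape)
        cols = rng.integers(0, training_level.shape[1], size=shape)
        return training_level[rows, cols]

    def multires_simulation(training_image, shape, n_levels=3):
        pyramid = build_pyramid(training_image, n_levels)
        realization = None
        # Coarsest level first; each finer level is conditioned on the previous one.
        for level in reversed(range(n_levels)):
            level_shape = (shape[0] >> level, shape[1] >> level)
            realization = direct_sampling_level(pyramid[level], level_shape,
                                                conditioning=realization)
        return realization

    ti = np.random.rand(256, 256)            # stand-in for a training image
    sim = multires_simulation(ti, (128, 128))
    ```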

  7. Experience with a three-axis side-located controller during a static and centrifuge simulation of the piloted launch of a manned multistage vehicle

    NASA Technical Reports Server (NTRS)

    Andrews, William H.; Holleman, Euclid C.

    1960-01-01

    An investigation was conducted to determine a human pilot's ability to control a multistage vehicle through the launch trajectory. The simulation was performed statically and dynamically by utilizing a human centrifuge. An interesting byproduct of the program was the three-axis side-located controller incorporated for pilot control inputs. This method of control proved to be acceptable for the successful completion of the tracking task during the simulation. There was no apparent effect of acceleration on the mechanical operation of the controller, but the pilot's control feel deteriorated as his dexterity decreased at high levels of acceleration. The application of control in a specific control mode was not difficult. However, coordination of more than one mode was difficult, and, in many instances, resulted in inadvertent control inputs. The acceptable control harmony at an acceleration level of 1 g became unacceptable at higher acceleration levels. Proper control-force harmony for a particular control task appears to be more critical for a three-axis controller than for conventional controllers. During simulations in which the pilot wore a pressure suit, the nature of the suit gloves further aggravated this condition.

  8. Examining Reuse in LaSRS++-Based Projects

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.

    2001-01-01

    NASA Langley Research Center (LaRC) developed the Langley Standard Real-Time Simulation in C++ (LaSRS++) to consolidate all software development for its simulation facilities under one common framework. A common framework promised a decrease in the total development effort for a new simulation by encouraging software reuse. To judge the success of LaSRS++ in this regard, reuse metrics were extracted from 11 aircraft models. Three methods that employ static analysis of the code were used to identify the reusable components. For the method that provides the best estimate, reuse levels fall between 66% and 95% indicating a high degree of reuse. Additional metrics provide insight into the extent of the foundation that LaSRS++ provides to new simulation projects. When creating variants of an aircraft, LaRC developers use object-oriented design to manage the aircraft as a reusable resource. Variants modify the aircraft for a research project or embody an alternate configuration of the aircraft. The variants inherit from the aircraft model. The variants use polymorphism to extend or redefine aircraft behaviors to meet the research requirements or to match the alternate configuration. Reuse level metrics were extracted from 10 variants. Reuse levels of aircraft by variants were 60% - 99%.
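    The reuse levels quoted above are fractions of reused code over total code. The short sketch below shows that calculation on made-up component sizes; the numbers are placeholders, not measurements from the LaSRS++ projects.

    ```python
    # Reuse-level metric: reused lines of code as a fraction of total lines.
    # Component names and line counts are hypothetical placeholders.
    def reuse_level(reused_loc, project_specific_loc):
        total = reused_loc + project_specific_loc
        return reused_loc / total if total else 0.0

    components = {"aircraft_model_A": (42_000, 6_000),
                  "aircraft_variant_B": (9_000, 1_500)}
    for name, (reused, new) in components.items():
        print(f"{name}: {100 * reuse_level(reused, new):.1f}% reuse")
    ```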

  9. Practice Makes Perfect: Correlations Between Prior Experience in High-level Athletics and Robotic Surgical Performance Do Not Persist After Task Repetition.

    PubMed

    Shee, Kevin; Ghali, Fady M; Hyams, Elias S

    Robotic surgical skill development is central to training in urology as well as in other surgical disciplines. Here, we describe a pilot study assessing the relationships between robotic surgery simulator performance and 3 categories of activities, namely, videogames, musical instruments, and athletics. A questionnaire was administered to preclinical medical students for general demographic information and prior experiences in surgery, videogames, musical instruments, and athletics. For follow-up performance studies, we used the Matchboard Level 1 and 2 modules on the da Vinci Skills Simulator, and recorded overall score, time to complete, economy of motion, workspace range, instrument collisions, instruments out of view, and drops. Task 1 was run once, whereas task 2 was run 3 times. All performance studies on the da Vinci Surgical Skills Simulator took place in the Simulation Center at Dartmouth-Hitchcock Medical Center. All participants were medical students at the Geisel School of Medicine. After excluding students with prior hands-on experience in surgery, a total of 30 students completed the study. We found a significant correlation between athletic skill level and performance for both task 1 (p = 0.0002) and task 2 (p = 0.0009). No significant correlations were found for videogame or musical instrument skill level. Students with experience in certain athletics (e.g., volleyball, tennis, and baseball) tended to perform better than students with experience in other athletics (e.g., track and field). For task 2, which was run 3 times, this association did not persist after the third repetition due to significant improvements in students with low-level athletic skill (levels 0-2). Our study suggests that prior experience in high-level athletics, but not videogames or musical instruments, significantly influences surgical proficiency in robot-naive students. Furthermore, our study suggests that practice through task repetition can overcome initial differences that may be related to a background in athletics. These novel relationships may have broader implications for the future recruitment and training of robotic surgeons and may warrant further investigation. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  10. Experimental investigation of control/display augmentation effects in a compensatory tracking task

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay; Schmidt, David K.

    1988-01-01

    The effects of control/display augmentation on human performance and workload have been investigated for closed-loop, continuous-tracking tasks by a real-time, man-in-the-loop simulation study. The experimental results obtained indicate that only limited improvement in actual tracking performance is obtainable through display augmentation alone; with a very high level of display augmentation, tracking error will actually deteriorate. Tracking performance improves when status information is furnished for reasonable levels of display quickening; again, very high quickening levels lead to tracking error deterioration due to the incompatibility between the status information and the quickened signal.

  11. Astronomical Near-neighbor Detection with a Four-quadrant Phase Mask (FQPM) Coronagraph

    NASA Technical Reports Server (NTRS)

    Haguenauer, Pierre; Serabyn, Eugene; Mennesson, Bertrand; Wallace, James K.; Gappinger, Robert O.; Troy, Mitchell; Bloemhof, Eric E.; Moore, Jim; Koresko, Chris D.

    2006-01-01

    Direct detection of planets around nearby stars requires the development of high-contrast imaging techniques, because of their very different respective fluxes. We thus investigated the innovative coronagraphic approach based on the use of a four-quadrant phase mask (FQPM). Simulations showed that, combined with high-level wavefront correction on an unobscured off-axis section of a large telescope, this method allows high-contrast imaging very close to stars, with detection capability superior to that of a traditional coronagraph. A FQPM instrument was thus built to test the feasibility of near-neighbor observations with our new off-axis approach on a ground-based telescope. In June 2005, we deployed our instrument to the Palomar 200-inch telescope, using existing facilities as much as possible for rapid implementation. In these initial observations, using data processing techniques specific to FQPM coronagraphs, we reached extinction levels of the order of 200:1. Here we discuss our simulations and on-sky results obtained so far.

  12. Effect of stable stratification on dispersion within urban street canyons: A large-eddy simulation

    NASA Astrophysics Data System (ADS)

    Li, Xian-Xiang; Britter, Rex; Norford, Leslie K.

    2016-11-01

    This study employs a validated large-eddy simulation (LES) code with high tempo-spatial resolution to investigate the effect of a stably stratified roughness sublayer (RSL) on scalar transport within an urban street canyon. The major effect of stable stratification on the flow and turbulence inside the street canyon is that the flow slows down in both streamwise and vertical directions, a stagnant area near the street level emerges, and the vertical transport of momentum is weakened. Consequently, the transfer of heat between the street canyon and overlying atmosphere also gets weaker. The pollutant emitted from the street level 'pools' within the lower street canyon, and more pollutant accumulates within the street canyon with increasing stability. Under stable stratification, the dominant mechanism for pollutant transport within the street canyon has changed from ejections (flow carries high-concentration pollutant upward) to unorganized motions (flow carries high-concentration pollutant downward), which is responsible for the much lower dispersion efficiency under stable stratifications.
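    The distinction drawn above between ejections and other motions comes from a quadrant decomposition of the turbulent scalar flux w'c'. The sketch below illustrates that decomposition on synthetic velocity and concentration series; it is a generic illustration of the analysis technique, not the LES post-processing itself.

    ```python
    # Quadrant analysis of the turbulent scalar flux w'c' on synthetic data.
    import numpy as np

    def quadrant_fractions(w, c):
        """Fraction of the total flux carried by each (w', c') quadrant."""
        wp, cp = w - w.mean(), c - c.mean()
        flux = wp * cp
        total = flux.sum()
        quadrants = {
            "ejection (w'>0, c'>0)":          (wp > 0) & (cp > 0),
            "sweep (w'<0, c'<0)":             (wp < 0) & (cp < 0),
            "downward, high c (w'<0, c'>0)":  (wp < 0) & (cp > 0),
            "upward, low c (w'>0, c'<0)":     (wp > 0) & (cp < 0),
        }
        return {name: flux[mask].sum() / total for name, mask in quadrants.items()}

    rng = np.random.default_rng(1)
    w = rng.normal(size=10_000)                # vertical velocity samples
    c = 0.5 * w + rng.normal(size=10_000)      # correlated concentration samples
    print(quadrant_fractions(w, c))
    ```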

  13. High-Fidelity Manikin-Based Simulation: A Study of Implications for Interprofessional Healthcare Practitioner Education at the Associate Degree Level of Study

    ERIC Educational Resources Information Center

    Fowler, Luster

    2013-01-01

    Healthcare practitioner training programs, specifically at the associate degree level of study, have historically focused practitioner training efforts on discipline-specific programming and curricula. However, these institutions have now begun to examine the utility and efficacy of incorporating interprofessional experiences into their programs.…

  14. Fate and Transport Modeling of Selected Chlorinated Organic Compounds at Operable Unit 1, U.S. Naval Air Station, Jacksonville, Florida

    USGS Publications Warehouse

    Davis, J. Hal

    2007-01-01

    The U.S. Naval Air Station occupies 3,800 acres adjacent to the St. Johns River in Jacksonville, Florida. The Station was placed on the U.S. Environmental Protection Agency's National Priorities List in December 1989 and is participating in the U.S. Department of Defense Installation Restoration Program, which serves to identify and remediate environmental contamination. One contaminated site, the old landfill, was designated as Operable Unit 1 (OU1) in 1989. The major source of ground-water contamination was from the disposal of waste oil and solvents into open pits, which began in the 1940s. Several remedial measures were implemented at this site to prevent the spread of contamination. Recovery trenches were installed in 1995 to collect free product. In 1998, some of the contamination was consolidated to the center of the old landfill and covered by an impermeable cap. Currently, Operable Unit 1 is being reevaluated as part of a 5-year review process to determine if the remedial actions were effective. Solute transport modeling indicated that the concentration of contaminants would have reached its maximum extent by the 1970s, after which the concentration levels would have generally declined because the pits would have ceased releasing high levels of contaminants. In the southern part of the site, monitoring well MW-19, which had some of the highest levels of contamination, showed decreases for measured and simulated concentrations of trichloroethene (TCE) and dichloroethene (DCE) from 1992 to present. Two upgradient disposal pits were simulated to have ceased releasing high levels of contamination in 1979, which consequently caused a drop in simulated concentrations. Monitoring well MW-100 had the highest levels of contamination of any well directly adjacent to a creek. Solute transport modeling substantially overestimated the concentrations of TCE, DCE, and vinyl chloride (VC) in this well. The reason for this overestimation is not clear, however, it indicates that the model will be conservative when used to predict concentration levels and the time required for the contamination to move through the system. Monitoring well MW-97 had the highest levels of contamination in the central part of the site. The levels decreased for both the measured and simulated values of TCE, DCE, and VC from 1999 to present. Simulating the source area as ceasing to release high levels of contamination in 1979 caused the drop in concentration, which began in the 1990s at this well. Monitoring well MW-89 had the highest levels of contamination in the northern part of the site. In order to match the low levels of contamination in wells MW-12 and MW-93, the pit was simulated as ceasing to release contamination in 1970; however, the installation of a trench in 1995 could have caused the source area to release additional contamination from 1995 to 1998. The effect of the additional dissolution was a spike in contamination at MW-89, beginning in about 1996 and continuing until the present time. Results from the last several sampling events indicate that the TCE and DCE levels could be decreasing, but VC shows no apparent trend. Several more years of sampling are needed to determine if these trends are continuing. Based on the solute transport modeling predictions, TCE, DCE, and VC will have migrated to the vicinity of creeks that drain ground water from the aquifer by 2010, and only relatively low levels will remain in the aquifer by 2015. 
Because the creeks represent the point where the contaminated ground water comes into contact with the environment, future contamination levels are a concern. The concentration of chlorinated solvents in the creek water has always been relatively low. Because the model shows that concentrations of TCE, DCE, and VC are declining in the aquifer, contamination levels in the creeks also are anticipated to decline.

  15. Simulation Data Management - Requirements and Design Specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clay, Robert L.; Friedman-Hill, Ernest J.; Gibson, Marcus J.

    Simulation Data Management (SDM), the ability to securely organize, archive, and share analysis models and the artifacts used to create them, is a fundamental requirement for modern engineering analysis based on computational simulation. We have worked separately to provide secure, network SDM services to engineers and scientists at our respective laboratories for over a decade. We propose to leverage our experience and lessons learned to help develop and deploy a next-generation SDM service as part of a multi-laboratory team. This service will be portable across multiple sites and platforms, and will be accessible via a range of command-line tools and well-documented APIs. In this document, we’ll review our high-level and low-level requirements for such a system, review one existing system, and briefly discuss our proposed implementation.

  16. Raptor responses to low-level jet aircraft and sonic booms

    USGS Publications Warehouse

    Ellis, David H.; Ellis, Catherine H.; Mindell, David P.

    1991-01-01

    We estimated effects of low-level military jet aircraft and mid- to high-altitude sonic booms (actual and simulated) on nesting peregrine falcons (Falco peregrinus) and seven other raptors by observing their responses to test stimuli, determining nesting success for the test year, and evaluating site reoccupancy rates for the year following the tests. Frequent and nearby jet aircraft passes: (1) sometimes noticeably alarmed birds, (2) occasionally caused birds to fly from perches or eyries, (3) most often evoked only minimal responses, and (4) were never associated with reproductive failure. Similarly, responses to real and simulated mid- to high-altitude sonic booms were often minimal and never appeared productivity limiting. Eighteen (95%) of 19 nest sites subjected to low-level jet flights and/or simulated sonic booms in 1980 fledged young during that year. Eighteen (95%) of 19 sites disturbed in 1980 were reoccupied by pairs or lone birds of the same species in 1981.We subjected four pairs of prairie falcons (Falco mexicanus) to low-level aircraft at ad libitum levels during the courtship and incubation phases when adults were most likely to abandon: all four eyries fledged young. From heart rate (HR) data taken via a telemetering egg at another prairie falcon eyrie, we determined that stimulus-induced HR alterations were comparable to rate changes for birds settling to incubate following flight.While encouraging, our findings cannot be taken as conclusive evidence that jet flights and/or sonic booms will have no long-term negative effects for other raptor species or for other areas. In addition, we did not experiment with totally naive wild adults, rotary-winged aircraft, or low-level sonic booms.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bachan, John

    Chisel is a new open-source hardware construction language developed at UC Berkeley that supports advanced hardware design using highly parameterized generators and layered domain-specific hardware languages. Chisel is embedded in the Scala programming language, which raises the level of hardware design abstraction by providing concepts including object orientation, functional programming, parameterized types, and type inference. From the same source, Chisel can generate a high-speed C++-based cycle-accurate software simulator, or low-level Verilog designed to pass on to standard ASIC or FPGA tools for synthesis and place and route.

  18. IONSIV(R) IE-911 Performance in Savannah River Site Radioactive Waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, D.D.

    2001-06-04

    This report describes cesium sorption from high-level radioactive waste solutions onto IONSIV(R) IE-911 at ambient temperature. Researchers characterized six radioactive waste samples from five high-level waste tanks in the Savannah River Site tank farm, diluted the wastes to 5.6 M Na+, and made equilibrium and kinetic measurements of cesium sorption. The equilibrium measurements were compared to ZAM (Zheng, Anthony, and Martin) model predictions. The kinetic measurements were compared to simulant solutions whose column performance has been measured.

  19. Uncertainty and feasibility of dynamical downscaling for modeling tropical cyclones for storm surge simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Zhaoqing; Taraphdar, Sourav; Wang, Taiping

    This paper presents a modeling study conducted to evaluate the uncertainty of a regional model in simulating hurricane wind and pressure fields, and the feasibility of driving coastal storm surge simulation using an ensemble of regional model outputs produced by 18 combinations of three convection schemes and six microphysics parameterizations, using Hurricane Katrina as a test case. Simulated wind and pressure fields were compared to observed H*Wind data for Hurricane Katrina and simulated storm surge was compared to observed high-water marks on the northern coast of the Gulf of Mexico. The ensemble modeling analysis demonstrated that the regional model was able to reproduce the characteristics of Hurricane Katrina with reasonable accuracy and can be used to drive the coastal ocean model for simulating coastal storm surge. Results indicated that the regional model is sensitive to both convection and microphysics parameterizations that simulate moist processes closely linked to the tropical cyclone dynamics that influence hurricane development and intensification. The Zhang and McFarlane (ZM) convection scheme and the Lim and Hong (WDM6) microphysics parameterization are the most skillful in simulating Hurricane Katrina maximum wind speed and central pressure, among the three convection and the six microphysics parameterizations. Error statistics of simulated maximum water levels were calculated for a baseline simulation with H*Wind forcing and the 18 ensemble simulations driven by the regional model outputs. The storm surge model produced the overall best results in simulating the maximum water levels using wind and pressure fields generated with the ZM convection scheme and the WDM6 microphysics parameterization.
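    The ranking of ensemble members above rests on simple error statistics of simulated maximum water levels against observed high-water marks. The sketch below shows such a calculation; the member labels and numerical values are synthetic placeholders, not the Katrina data.

    ```python
    # Error statistics (bias, RMSE, max absolute error) of simulated maximum
    # water levels against observed high-water marks, on placeholder data.
    import numpy as np

    def error_stats(simulated, observed):
        err = simulated - observed
        return {"bias": err.mean(),
                "rmse": np.sqrt((err ** 2).mean()),
                "max_abs": np.abs(err).max()}

    observed = np.array([3.2, 4.1, 5.6, 6.3, 4.8])   # high-water marks (m), synthetic
    ensemble = {"ZM_WDM6_member":  observed + np.array([0.1, -0.2, 0.0, 0.3, -0.1]),
                "another_member":  observed + np.array([0.6, -0.8, 0.9, 1.1, -0.5])}
    for member, sim in ensemble.items():
        print(member, error_stats(sim, observed))
    ```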

  20. Variability in the Use of Simulation for Procedural Training in Radiology Residency: Opportunities for Improvement.

    PubMed

    Matalon, Shanna A; Chikarmane, Sona A; Yeh, Eren D; Smith, Stacy E; Mayo-Smith, William W; Giess, Catherine S

    2018-03-19

    Increased attention to quality and safety has led to a re-evaluation of the classic apprenticeship model for procedural training. Many have proposed simulation as a supplementary teaching tool. The purpose of this study was to assess radiology resident exposure to procedural training and procedural simulation. An IRB-exempt online survey was distributed to current radiology residents in the United States by e-mail. Survey results were summarized using frequency and percentages. Chi-square tests were used for statistical analysis where appropriate. A total of 353 current residents completed the survey. 37% (n = 129/353) of respondents had never used procedure simulation. Of the residents who had used simulation, most did not do so until after having already performed procedures on patients (59%, n = 132/223). The presence of a dedicated simulation center was reported by over half of residents (56%, n = 196/353) and was associated with prior simulation experience (P = 0.007). Residents who had not had procedural simulation were somewhat likely or highly likely (3 and 4 on a 4-point Likert-scale) to participate if it were available (81%, n = 104/129). Simulation training was associated with higher comfort levels in performing procedures (P < 0.001). Although procedural simulation training is associated with higher comfort levels when performing procedures, there is variable use in radiology resident training and its use is not currently optimized. Given the increased emphasis on patient safety, these results suggest the need to increase procedural simulation use during residency, including an earlier introduction to simulation before patient exposure. Copyright © 2018 Elsevier Inc. All rights reserved.
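    The association reported above (for example, between having a dedicated simulation center and prior simulation experience) is the kind of result a chi-square test on a 2x2 contingency table yields. A minimal example with hypothetical counts, not the survey's actual data:

    ```python
    # Chi-square test of association on a hypothetical 2x2 table.
    from scipy.stats import chi2_contingency

    #                       prior simulation   no prior simulation
    table = [[80, 40],    # dedicated simulation center
             [50, 60]]    # no dedicated simulation center
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
    ```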

  1. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education: effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines.

    PubMed

    McGaghie, William C; Siddall, Viva J; Mazmanian, Paul E; Myers, Janet

    2009-03-01

    Simulation technology is widely used in undergraduate and graduate medical education as well as for personnel training and evaluation in other healthcare professions. Simulation provides safe and effective opportunities for learners at all levels to practice and acquire clinical skills needed for patient care. A growing body of research evidence documents the utility of simulation technology for educating healthcare professionals. However, simulation has not been widely endorsed or used for continuing medical education (CME). This article reviews and evaluates evidence from studies on simulation technology in undergraduate and graduate medical education and addresses its implications for CME. The Agency for Healthcare Research and Quality Evidence Report suggests that simulation training is effective, especially for psychomotor and communication skills, but that the strength of the evidence is low. In another review, the Best Evidence Medical Education collaboration supported the use of simulation technology, focusing on high-fidelity medical simulations under specific conditions. Other studies enumerate best practices that include mastery learning, deliberate practice, and recognition and attention to cultural barriers within the medical profession that present obstacles to wider use of this technology. Simulation technology is a powerful tool for the education of physicians and other healthcare professionals at all levels. Its educational effectiveness depends on informed use for trainees, including providing feedback, engaging learners in deliberate practice, integrating simulation into an overall curriculum, as well as on the instruction and competence of faculty in its use. Medical simulation complements, but does not replace, educational activities based on real patient-care experiences.

  2. Mesoscale modeling of smoke transport over Central Africa: influences of trade winds, subtropical high, ITCZ and vertical statistics

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Wang, J.; Hyer, E. J.; Ichoku, C. M.

    2012-12-01

    A fully-coupled meteorology-chemistry-aerosol model, the Weather Research and Forecasting model with Chemistry (WRF-Chem), is used to simulate the transport of smoke aerosol over Central Africa during February 2008. Smoke emissions used in this study are specified from the Fire Locating and Modeling of Burning Emissions (FLAMBE) database derived from Moderate Resolution Imaging Spectroradiometer (MODIS) fire products. Model performance is evaluated using MODIS true color images, measured Aerosol Optical Depth (AOD) from space-borne MODIS (550 nm) and ground-based AERONET (500 nm), and Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) level 1 and 2 products. The simulated smoke transport is in good agreement with the validation data. Analysis of three smoke events shows that, near the surface, smoke is confined to a narrow belt between the Equator and 10°N by the interplay of the trade winds, the subtropical high, and the ITCZ. At the 700 hPa level, smoke expands farther meridionally. Topography blocks smoke transport to the southeast of the study area because of the high mountains located near the Great Rift Valley region. The simulation with an injection height of 650 m is consistent with CALIOP measurements. The particular phenomenon of aerosol above cloud is studied statistically from CALIOP observations. The total percentage of aerosol above cloud is about 5%.

  3. Modeling of Passive Acoustic Liners from High Fidelity Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Ferrari, Marcello do Areal Souto

    Noise reduction in aviation has been an important focus of study in the last few decades. One common solution is installing acoustic liners on the internal walls of the engines. However, laboratory measurements of liners are expensive and time consuming. The present work proposes a nonlinear physics-based time domain model to predict the acoustic behavior of a given liner in a defined flow condition. The parameters of the model are defined by analysis of accurate numerical solutions of the flow obtained from a high-fidelity numerical code. The length of the cavity is taken into account by an analytical procedure that treats internal reflections within the cavity. Vortices and jets originating from internal flow separations are confirmed to be important mechanisms of sound absorption, which defines the overall efficiency of the liner. Numerical simulations at different frequencies, geometries, and sound pressure levels are studied in detail to define the model parameters. Comparisons with high-fidelity numerical simulations show that the proposed model is accurate, robust, and can be used to define a boundary condition simulating a liner in a high-fidelity code.

  4. Mechanical Modeling and Computer Simulation of Protein Folding

    ERIC Educational Resources Information Center

    Prigozhin, Maxim B.; Scott, Gregory E.; Denos, Sharlene

    2014-01-01

    In this activity, science education and modern technology are bridged to teach students at the high school and undergraduate levels about protein folding and to strengthen their model building skills. Students are guided from a textbook picture of a protein as a rigid crystal structure to a more realistic view: proteins are highly dynamic…

  5. High pressure inactivation of HAV within oysters: comparison of shucked oysters with whole in shell meats

    USDA-ARS?s Scientific Manuscript database

    High pressure inactivation of hepatitis A virus (HAV) within oysters bioaccumulated under simulated natural conditions to levels >10^6 PFU/oyster has been evaluated. Five-minute treatments at 20 °C were administered at 350, 375, and 400 megapascals (MPa). Shucked and whole-in-shell oysters were directly...

  6. Collaborating CPU and GPU for large-scale high-order CFD simulations with complex grids on the TianHe-1A supercomputer

    NASA Astrophysics Data System (ADS)

    Xu, Chuanfu; Deng, Xiaogang; Zhang, Lilun; Fang, Jianbin; Wang, Guangxue; Jiang, Yi; Cao, Wei; Che, Yonggang; Wang, Yongxian; Wang, Zhenghua; Liu, Wei; Cheng, Xinghua

    2014-12-01

    Programming and optimizing complex, real-world CFD codes on current many-core accelerated HPC systems is very challenging, especially when collaborating CPUs and accelerators to fully tap the potential of heterogeneous systems. In this paper, with a tri-level hybrid and heterogeneous programming model using MPI + OpenMP + CUDA, we port and optimize our high-order multi-block structured CFD software HOSTA on the GPU-accelerated TianHe-1A supercomputer. HOSTA adopts two self-developed high-order compact finite difference schemes WCNS and HDCS that can simulate flows with complex geometries. We present a dual-level parallelization scheme for efficient multi-block computation on GPUs and perform particular kernel optimizations for high-order CFD schemes. The GPU-only approach achieves a speedup of about 1.3 when comparing one Tesla M2050 GPU with two Xeon X5670 CPUs. To achieve a greater speedup, we collaborate CPU and GPU for HOSTA instead of using a naive GPU-only approach. We present a novel scheme to balance the loads between the store-poor GPU and the store-rich CPU. Taking CPU and GPU load balance into account, we improve the maximum simulation problem size per TianHe-1A node for HOSTA by 2.3×; meanwhile, the collaborative approach can improve the performance by around 45% compared to the GPU-only approach. Further, to scale HOSTA on TianHe-1A, we propose a gather/scatter optimization to minimize PCI-e data transfer times for ghost and singularity data of 3D grid blocks, and overlap the collaborative computation and communication as far as possible using some advanced CUDA and MPI features. Scalability tests show that HOSTA can achieve a parallel efficiency of above 60% on 1024 TianHe-1A nodes. With our method, we have successfully simulated an EET high-lift airfoil configuration containing 800M cells and China's large civil airplane configuration containing 150M cells. To the best of our knowledge, these are the largest-scale CPU-GPU collaborative simulations that solve realistic CFD problems with both complex configurations and high-order schemes.
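    The CPU/GPU collaboration above hinges on splitting grid cells so that both sides finish a time step at roughly the same time. The sketch below shows one simple way to compute a static split from measured throughputs; it is an illustrative simplification, not the paper's load-balancing scheme, and the throughput numbers are assumed values.

    ```python
    # Static CPU/GPU work split proportional to measured per-device throughput.
    # Throughput values are assumed placeholders, not HOSTA/TianHe-1A measurements.
    def split_cells(total_cells, gpu_cells_per_sec, cpu_cells_per_sec):
        """Partition work so both devices finish a step at about the same time."""
        gpu_share = gpu_cells_per_sec / (gpu_cells_per_sec + cpu_cells_per_sec)
        gpu_cells = int(round(total_cells * gpu_share))
        return gpu_cells, total_cells - gpu_cells

    gpu_cells, cpu_cells = split_cells(total_cells=10_000_000,
                                       gpu_cells_per_sec=5.2e6,
                                       cpu_cells_per_sec=4.0e6)
    print(gpu_cells, cpu_cells)
    ```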

  7. Collaborating CPU and GPU for large-scale high-order CFD simulations with complex grids on the TianHe-1A supercomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Chuanfu, E-mail: xuchuanfu@nudt.edu.cn; Deng, Xiaogang; Zhang, Lilun

    Programming and optimizing complex, real-world CFD codes on current many-core accelerated HPC systems is very challenging, especially when collaborating CPUs and accelerators to fully tap the potential of heterogeneous systems. In this paper, with a tri-level hybrid and heterogeneous programming model using MPI + OpenMP + CUDA, we port and optimize our high-order multi-block structured CFD software HOSTA on the GPU-accelerated TianHe-1A supercomputer. HOSTA adopts two self-developed high-order compact finite difference schemes WCNS and HDCS that can simulate flows with complex geometries. We present a dual-level parallelization scheme for efficient multi-block computation on GPUs and perform particular kernel optimizations for high-order CFD schemes. The GPU-only approach achieves a speedup of about 1.3 when comparing one Tesla M2050 GPU with two Xeon X5670 CPUs. To achieve a greater speedup, we collaborate CPU and GPU for HOSTA instead of using a naive GPU-only approach. We present a novel scheme to balance the loads between the store-poor GPU and the store-rich CPU. Taking CPU and GPU load balance into account, we improve the maximum simulation problem size per TianHe-1A node for HOSTA by 2.3×; meanwhile, the collaborative approach can improve the performance by around 45% compared to the GPU-only approach. Further, to scale HOSTA on TianHe-1A, we propose a gather/scatter optimization to minimize PCI-e data transfer times for ghost and singularity data of 3D grid blocks, and overlap the collaborative computation and communication as far as possible using some advanced CUDA and MPI features. Scalability tests show that HOSTA can achieve a parallel efficiency of above 60% on 1024 TianHe-1A nodes. With our method, we have successfully simulated an EET high-lift airfoil configuration containing 800M cells and China's large civil airplane configuration containing 150M cells. To the best of our knowledge, these are the largest-scale CPU–GPU collaborative simulations that solve realistic CFD problems with both complex configurations and high-order schemes.

  8. Simulation of inclined air showers

    NASA Astrophysics Data System (ADS)

    Dorofeev, Alexei V.

    The purpose of this research is simulation of Horizontal Air Showers (HAS) - Extensive Air Showers (EAS), where the cascade of particles is initiated by a primary particle with Ultra High Energy, entering the atmosphere of the Earth at zenith angles more than 70°. Particles from these HAS are detected at the ground level by the Surface Detector part of the Auger Observatory. Existing simulation models (most of them are Monte-Carlo) have limitations which come from the fact that one can't follow each and every particle and interaction in the EAS. The proposed model is a semi-analytic solution to the cascade equations, which incorporates probability functions for the most advanced hadronic interaction models available today--UrQMD for the low-energy region and NEXUS for the high energy region.

  9. Power Hardware-in-the-Loop Testing of a Smart Distribution System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendoza Carrillo, Ismael; Breaden, Craig; Medley, Paige

    This paper presents the results of the third and final phase of the National Renewable Energy Lab (NREL) INTEGRATE demonstration: Smart Distribution. For this demonstration, high penetrations of solar PV and wind energy systems were simulated in a power hardware-in-the-loop set-up using a smart distribution test feeder. Simulated and real DERs were controlled by a real-time control platform, which manages grid constraints under high clean energy deployment levels. The power HIL testing, conducted at NREL's ESIF smart power lab, demonstrated how dynamically managing DER increases the grid's hosting capacity by leveraging active network management's (ANM) safe and reliable control framework. Results are presented for how ANM's real-time monitoring, automation, and control can be used to manage multiple DERs and multiple constraints associated with high penetrations of DER on a distribution grid. The project also successfully demonstrated the importance of escalating control actions given how ANM enables operation of grid equipment closer to their actual physical limit in the presence of very high levels of intermittent DER.

  10. Plant-Level Modeling and Simulation of Used Nuclear Fuel Dissolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Almeida, Valmor F.

    2012-09-07

    Plant-level modeling and simulation of a used nuclear fuel prototype dissolver is presented. Emphasis is given to developing a modeling and simulation approach to be explored by other processes involved in the recycle of used fuel. The commonality concepts presented in a previous communication were used to create a model and realize its software module. An initial model was established based on a theory of chemical thermomechanical network transport outlined previously. A software module prototype was developed with the required external behavior and internal mathematical structure. Results obtained demonstrate the generality of the design approach and establish an extensible mathematical model with its corresponding software module for a wide range of dissolvers. Scale-up numerical tests were made varying the type of used fuel (breeder and light-water reactors) and the capacity of dissolution (0.5 t/d to 1.7 t/d). These tests were motivated by user requirements in the area of nuclear materials safeguards. A computer module written in high-level programming languages (MATLAB and Octave) was developed, tested, and provided as open-source code (MATLAB) for integration into the Separations and Safeguards Performance Model application in development at Sandia National Laboratories. The modeling approach presented here is intended to serve as a template for a rational modeling of all plant-level modules. This will facilitate the practical application of the commonality features underlying the unifying network transport theory proposed recently. In addition, by example, this model describes, explicitly, the needed data from sub-scale models, and logical extensions for future model development. For example, from thermodynamics, an off-line simulation of molecular dynamics could quantify partial molar volumes for the species in the liquid phase; this simulation is currently at reach for high-performance computing. From fluid mechanics, a hold-up capacity function is needed for the dissolver device; this simulation is currently at reach for computational fluid mechanics given the existing CAD geometry. From chemical transport phenomena, a simulation of the particle-scale dissolution front is needed to derive an improved solid dissolution kinetics law by predicting the local surface area change; an example was provided in this report. In addition, the associated reaction mechanisms for dissolution are presently largely untested and simplified, hence even a parallel experimental program in reaction kinetics is needed to support modeling and simulation efforts. Last but not least, a simple account of finite rates of solid feed and transfer can be readily introduced via a coupled delayed model. These are some of the theoretical benefits of a rational plant-level modeling approach which guides the development of smaller length and time scale modeling. Practical and other theoretical benefits have been presented in a previous report.
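    As an illustration of the particle-scale input the report calls for, the sketch below integrates a toy surface-area-dependent dissolution-rate law (shrinking-particle form). The rate constant and initial mass are arbitrary assumed values, and the model is a simplified stand-in, not the report's network-transport formulation.

    ```python
    # Toy shrinking-particle dissolution law: dm/dt = -k * A(m), A ~ m^(2/3).
    # Parameter values are arbitrary assumptions for illustration only.
    from scipy.integrate import solve_ivp

    def shrinking_particle(t, y, k, m0):
        m = max(y[0], 0.0)
        area_factor = (m / m0) ** (2.0 / 3.0)   # surface area scales as m^(2/3)
        return [-k * area_factor]

    m0, k = 1.0, 0.05                           # initial mass (kg), rate constant (kg/s)
    sol = solve_ivp(shrinking_particle, (0.0, 60.0), [m0], args=(k, m0))
    print(f"mass remaining after 60 s: {sol.y[0, -1]:.3f} kg")
    ```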

  11. Effect of censoring trace-level water-quality data on trend-detection capability

    USGS Publications Warehouse

    Gilliom, R.J.; Hirsch, R.M.; Gilroy, E.J.

    1984-01-01

    Monte Carlo experiments were used to evaluate whether trace-level water-quality data that are routinely censored (not reported) contain valuable information for trend detection. Measurements are commonly censored if they fall below a level associated with some minimum acceptable level of reliability (detection limit). Trace-level organic data were simulated with best- and worst-case estimates of measurement uncertainty, various concentrations and degrees of linear trend, and different censoring rules. The resulting classes of data were subjected to a nonparametric statistical test for trend. For all classes of data evaluated, trends were detected more effectively in uncensored data than in censored data, even when the censored data were highly unreliable. Thus, censoring data at any concentration level may eliminate valuable information. Whether or not valuable information for trend analysis is, in fact, eliminated by censoring of actual rather than simulated data depends on whether the analytical process is in statistical control and bias is predictable for a particular type of chemical analysis.
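    The Monte Carlo idea above can be illustrated in a few lines: simulate a weak upward trend with noise, censor values below a detection limit, and apply a nonparametric trend test (here Kendall's tau against time, equivalent in spirit to a Mann-Kendall test) to both versions. All parameter values are illustrative assumptions.

    ```python
    # Effect of censoring on nonparametric trend detection, on synthetic data.
    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(42)
    n, detection_limit = 40, 1.0
    time = np.arange(n)
    conc = 0.8 + 0.02 * time + rng.normal(scale=0.4, size=n)  # weak upward trend

    def trend_pvalue(values, t):
        """Kendall's tau of values against time, a Mann-Kendall-style trend test."""
        tau, p = kendalltau(t, values)
        return p

    # Censoring: values below the detection limit are replaced by a common
    # substitute, turning them into ties and discarding part of the trend signal.
    censored = np.where(conc < detection_limit, detection_limit / 2.0, conc)

    print("p-value, uncensored:", trend_pvalue(conc, time))
    print("p-value, censored:  ", trend_pvalue(censored, time))
    ```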

  12. Simulations of transient membrane behavior in cells subjected to a high-intensity ultrashort electric pulse.

    PubMed

    Hu, Q; Viswanadham, S; Joshi, R P; Schoenbach, K H; Beebe, S J; Blackmore, P F

    2005-03-01

    A molecular dynamics (MD) scheme is combined with a distributed circuit model for a self-consistent analysis of the transient membrane response for cells subjected to an ultrashort (nanosecond) high-intensity (approximately 0.01-V/nm spatially averaged field) voltage pulse. The dynamical, stochastic, many-body aspects are treated at the molecular level by resorting to a coarse-grained representation of the membrane lipid molecules. Coupling the Smoluchowski equation to the distributed electrical model for current flow provides the time-dependent transmembrane fields for the MD simulations. A good match between the simulation results and available experimental data is obtained. Predictions include pore formation times of about 5-6 ns. It is also shown that the pore formation process would tend to begin from the anodic side of an electrically stressed membrane. Furthermore, the present simulations demonstrate that ions could facilitate pore formation. This could be of practical importance and have direct relevance to the recent observations of calcium release from the endoplasmic reticulum in cells subjected to such ultrashort, high-intensity pulses.
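    For orientation, the transient transmembrane voltage during such a pulse can be estimated with a first-order (Schwan-type) RC charging model, a far simpler stand-in for the coupled MD/circuit model of the paper. The cell and medium parameters below are typical assumed values, not those of the study.

    ```python
    # First-order estimate of transmembrane-voltage charging during a ns pulse:
    # V_m(pole) = 1.5 * E * a * (1 - exp(-t/tau)),  tau = a * C_m * (rho_i + rho_e/2).
    import numpy as np

    a     = 5e-6     # cell radius (m), assumed
    C_m   = 1e-2     # membrane capacitance (F/m^2), typical value
    rho_i = 1.0      # cytoplasm resistivity (ohm*m), assumed
    rho_e = 1.0      # external medium resistivity (ohm*m), assumed
    E     = 1e7      # applied field (V/m), ~0.01 V/nm as quoted in the abstract

    tau = a * C_m * (rho_i + rho_e / 2.0)             # charging time constant (s)
    t = np.linspace(0.0, 10e-9, 6)                    # first 10 ns of the pulse
    V_pole = 1.5 * E * a * (1.0 - np.exp(-t / tau))   # voltage at the cell pole

    for ti, vi in zip(t, V_pole):
        print(f"t = {ti * 1e9:4.1f} ns   V_m ~ {vi:5.2f} V")
    ```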

  13. 3D Simulations of NIF Wetted Foam Experiments to Understand the Transition from 2D to 3D Implosion Behavior

    NASA Astrophysics Data System (ADS)

    Haines, Brian; Olson, Richard; Yi, Austin; Zylstra, Alex; Peterson, Robert; Bradley, Paul; Shah, Rahul; Wilson, Doug; Kline, John; Leeper, Ramon; Batha, Steve

    2017-10-01

    The high convergence ratio (CR) of layered Inertial Confinement Fusion capsule implosions contributes to high performance in 1D simulations yet makes them more susceptible to hydrodynamic instabilities, contributing to the development of 3D flows. The wetted foam platform is an approach to hot spot ignition that achieves low-to-moderate convergence ratios in layered implosions on the NIF that are unobtainable using an ice layer. Detailed high-resolution modeling of these experiments in 2D and 3D, including all known asymmetries, demonstrates that 2D hydrodynamics explain capsule performance at CR 12 but become less suitable as the CR increases. Mechanisms for this behavior and detailed comparisons of simulations to experiments on NIF will be presented. To evaluate the tradeoff between increased instability and improved 1D performance, we present a full-scale wetted foam capsule design with 17

  14. Sources and pathways of the upscale effects on the Southern Hemisphere jet in MPAS-CAM4 variable-resolution simulations

    DOE PAGES

    Sakaguchi, Koichi; Lu, Jian; Leung, L. Ruby; ...

    2016-10-22

    Impacts of regional grid refinement on large-scale circulations (“upscale effects”) were detected in a previous study that used the Model for Prediction Across Scales-Atmosphere coupled to the physics parameterizations of the Community Atmosphere Model version 4. The strongest upscale effect was identified in the Southern Hemisphere jet during austral winter. This study examines the detailed underlying processes by comparing two simulations at quasi-uniform resolutions of 30 and 120 km to three variable-resolution simulations in which the horizontal grids are regionally refined to 30 km in North America, South America, or Asia from 120 km elsewhere. In all the variable-resolution simulations, precipitation increases in convective areas inside the high-resolution domains, as in the reference quasi-uniform high-resolution simulation. With grid refinement encompassing the tropical Americas, the increased condensational heating expands the local divergent circulations (Hadley cell) meridionally such that their descending branch is shifted poleward, which also pushes the baroclinically unstable regions, momentum flux convergence, and the eddy-driven jet poleward. This teleconnection pathway is not found in the reference high-resolution simulation due to a strong resolution sensitivity of cloud radiative forcing that dominates the aforementioned teleconnection signals. The regional refinement over Asia enhances Rossby wave sources and strengthens the upper level southerly flow, both facilitating the cross-equatorial propagation of stationary waves. Evidence indicates that this teleconnection pathway is also found in the reference high-resolution simulation. Lastly, the result underlines the intricate diagnoses needed to understand the upscale effects in global variable-resolution simulations, with implications for science investigations using the computationally efficient modeling framework.

  15. Perceived Barriers to the Use of High-Fidelity Hands-On Simulation Training for Contrast Reaction Management: Why Programs are Not Using It.

    PubMed

    Chinnugounder, Sankar; Hippe, Daniel S; Maximin, Suresh; O'Malley, Ryan B; Wang, Carolyn L

    2015-01-01

    Although subjective and objective benefits of high-fidelity simulation have been reported in medicine, there has been slow adoption in radiology. The purpose of our study was to identify the perceived barriers to the use of high-fidelity hands-on simulation for contrast reaction management training. An IRB-exempt 32-question online survey was sent to 179 non-military radiology residency program directors listed in the Fellowship and Residency Electronic Interactive Database Access system (FREIDA). Survey questions included the type of contrast reaction management (CRM) training, cost, time commitment of residents and faculty, and the reasons for not using simulation training. Responses from the survey were summarized as count (percentage), mean ± standard deviation (SD), or median (range). 84 (47%) of 179 programs responded, of which 88% offered CRM training. Most (72%) conducted the CRM training annually while only 4% conducted it more frequently. Didactic lecture was the most frequently used training modality (97%), followed by high-fidelity simulation (HFS) (30%) and computer-based simulation (CBS) (19%); 5.5% used both HFS and CBS. Of the 51 programs that offer CRM training but do not use HFS, the most common reason reported was insufficient availability (41%). Other reported reasons included cost (33%), no access to simulation centers (33%), lack of trained faculty (27%) and time constraints (27%). Although high-fidelity hands-on simulation training is the best way to reproduce real-life contrast reaction scenarios, many institutions do not provide this training due to constraints such as cost, lack of access or insufficient availability of simulation labs, and lack of trained faculty. As a specialty, radiology needs to better address these barriers at both an institutional and national level. Copyright © 2015 Mosby, Inc. All rights reserved.

  16. Sources and pathways of the upscale effects on the Southern Hemisphere jet in MPAS-CAM4 variable-resolution simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakaguchi, Koichi; Lu, Jian; Leung, L. Ruby

    Impacts of regional grid refinement on large-scale circulations (“upscale effects”) were detected in a previous study that used the Model for Prediction Across Scales-Atmosphere coupled to the physics parameterizations of the Community Atmosphere Model version 4. The strongest upscale effect was identified in the Southern Hemisphere jet during austral winter. This study examines the detailed underlying processes by comparing two simulations at quasi-uniform resolutions of 30 and 120 km to three variable-resolution simulations in which the horizontal grids are regionally refined to 30 km in North America, South America, or Asia from 120 km elsewhere. In all the variable-resolution simulations, precipitation increases in convective areas inside the high-resolution domains, as in the reference quasi-uniform high-resolution simulation. With grid refinement encompassing the tropical Americas, the increased condensational heating expands the local divergent circulations (Hadley cell) meridionally such that their descending branch is shifted poleward, which also pushes the baroclinically unstable regions, momentum flux convergence, and the eddy-driven jet poleward. This teleconnection pathway is not found in the reference high-resolution simulation due to a strong resolution sensitivity of cloud radiative forcing that dominates the aforementioned teleconnection signals. The regional refinement over Asia enhances Rossby wave sources and strengthens the upper level southerly flow, both facilitating the cross-equatorial propagation of stationary waves. Evidence indicates that this teleconnection pathway is also found in the reference high-resolution simulation. Lastly, the result underlines the intricate diagnoses needed to understand the upscale effects in global variable-resolution simulations, with implications for science investigations using the computationally efficient modeling framework.

  17. Large liquid rocket engine transient performance simulation system

    NASA Technical Reports Server (NTRS)

    Mason, J. R.; Southwick, R. D.

    1991-01-01

    A simulation system, ROCETS, was designed and developed to allow cost-effective computer predictions of liquid rocket engine transient performance. The system allows a user to generate a simulation of any rocket engine configuration using component modules stored in a library through high-level input commands. The system library currently contains 24 component modules, 57 sub-modules and maps, and 33 system routines and utilities. FORTRAN models from other sources can be operated in the system upon inclusion of interface information on comment cards. Operation of the simulation is simplified for the user by run, execution, and output processors. The simulation system makes available steady-state trim balance, transient operation, and linear partial generation. The system utilizes a modern equation solver for efficient operation of the simulations. Transient integration methods include integral and differential forms for the trapezoidal, first order Gear, and second order Gear corrector equations. A detailed technology test bed engine (TTBE) model was generated to be used as the acceptance test of the simulation system. The general level of model detail was that reflected in the Space Shuttle Main Engine DTM. The model successfully obtained steady-state balance in main stage operation and simulated throttle transients, including engine starts and shutdown. A NASA FORTRAN control model was obtained, ROCETS interface installed in comment cards, and operated with the TTBE model in closed-loop transient mode.
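
    For readers unfamiliar with the corrector equations named above, the following minimal sketch (not the ROCETS implementation, and written in Python rather than FORTRAN) shows the trapezoidal, first-order Gear (backward Euler), and second-order Gear (BDF2) corrector forms applied to a scalar test ODE; the test problem and the simple Newton iteration are illustrative assumptions.

```python
# Hedged sketch: generic implicit corrector equations of the kind listed in the
# abstract (trapezoidal, 1st-order Gear / backward Euler, 2nd-order Gear / BDF2),
# applied to a scalar test ODE dy/dt = f(t, y). Not the ROCETS implementation.

def f(t, y):
    # illustrative mildly stiff test problem
    return -50.0 * (y - 1.0)

def solve_corrector(residual, y_guess, iters=20, tol=1e-12):
    """Solve the implicit corrector equation residual(y) = 0 by a finite-difference Newton iteration."""
    y = y_guess
    for _ in range(iters):
        r = residual(y)
        dr = (residual(y + 1e-8) - r) / 1e-8
        y_next = y - r / dr
        if abs(y_next - y) < tol:
            return y_next
        y = y_next
    return y

def step_trapezoidal(t, y, h):
    return solve_corrector(lambda yn: yn - y - 0.5 * h * (f(t, y) + f(t + h, yn)), y)

def step_gear1(t, y, h):  # backward Euler
    return solve_corrector(lambda yn: yn - y - h * f(t + h, yn), y)

def step_gear2(t, y, y_prev, h):  # BDF2
    return solve_corrector(
        lambda yn: yn - (4.0 / 3.0) * y + (1.0 / 3.0) * y_prev - (2.0 * h / 3.0) * f(t + h, yn), y)

y0, h = 0.0, 0.02
y1 = step_gear1(0.0, y0, h)                      # start-up step for BDF2
print(step_trapezoidal(0.0, y0, h), y1, step_gear2(h, y1, y0, h))
```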

  18. Multi-level Simulation of a Real Time Vibration Monitoring System Component

    NASA Technical Reports Server (NTRS)

    Robertson, Bryan A.; Wilkerson, Delisa

    2005-01-01

    This paper describes the development of a custom built Digital Signal Processing (DSP) printed circuit board designed to implement the Advanced Real Time Vibration Monitoring Subsystem proposed by Marshall Space Flight Center (MSFC) Transportation Directorate in 2000 for the Space Shuttle Main Engine Advanced Health Management System (AHMS). This Real Time Vibration Monitoring System (RTVMS) is being developed for ground use as part of the AHMS Health Management Computer-Integrated Rack Assembly (HMC-IRA). The HMC-IRA RTVMS design contains five DSPs which are highly interconnected through individual communication ports, shared memory, and a unique communication router that allows all the DSPs to receive digitized data from two multi-channel analog boards simultaneously. This paper will briefly cover the overall board design but will focus primarily on the state-of-the-art simulation environment within which this board was developed. This 16-layer board with over 1800 components and an additional mezzanine card has been an extremely challenging design. Utilization of a Mentor Graphics simulation environment provided the unique board and system level simulation capability to ascertain any timing or functional concerns before production. By combining VHDL, Synopsys Software and Hardware Models, and the Mentor Design Capture Environment, multiple simulations were developed to verify the RTVMS design. This multi-level simulation allowed the designers to achieve complete operability without error the first time the RTVMS printed circuit board was powered. The HMC-IRA design has completed all engineering and deliverable unit testing.

  19. Multi-level Simulation of a Real Time Vibration Monitoring System Component

    NASA Technical Reports Server (NTRS)

    Robertson, Bryan; Wilkerson, DeLisa

    2004-01-01

    This paper describes the development of a custom built Digital Signal Processing (DSP) printed circuit board designed to implement the Advanced Real Time Vibration Monitoring Subsystem proposed by MSFC Transportation Directorate in 2000 for the Space Shuttle Main Engine Advanced Health Management System (AHMS). This Real Time Vibration Monitoring System (RTVMS) is being developed for ground use as part of the AHMS Health Management Computer-Integrated Rack Assembly (HMC-IRA). The HMC-IRA RTVMS design contains five DSPs which are highly interconnected through individual communication ports, shared memory, and a unique communication router that allows all the DSPs to receive digitized data from two multi-channel analog boards simultaneously. This paper will briefly cover the overall board design but will focus primarily on the state-of-the-art simulation environment within which this board was developed. This 16-layer board with over 1800 components and an additional mezzanine card has been an extremely challenging design. Utilization of a Mentor Graphics simulation environment provided the unique board and system level simulation capability to ascertain any timing or functional concerns before production. By combining VHDL, Synopsys Software and Hardware Models, and the Mentor Design Capture Environment, multiple simulations were developed to verify the RTVMS design. This multi-level simulation allowed the designers to achieve complete operability without error the first time the RTVMS printed circuit board was powered. The HMC-IRA design has completed all engineering unit testing and the deliverable unit is currently under development.

  20. Improved cloud parameterization for Arctic climate simulations based on satellite data

    NASA Astrophysics Data System (ADS)

    Klaus, Daniel; Dethloff, Klaus; Dorn, Wolfgang; Rinke, Annette

    2015-04-01

    The defective representation of Arctic cloud processes and properties remains a crucial problem in climate modelling and in reanalysis products. Satellite-based cloud observations (MODIS and CPR/CALIOP) and single-column model simulations (HIRHAM5-SCM) were exploited to evaluate and improve the simulated Arctic cloud cover of the atmospheric regional climate model HIRHAM5. The ECMWF reanalysis dataset 'ERA-Interim' (ERAint) was used for the model initialization, the lateral boundary forcing as well as the dynamical relaxation inside the pan-Arctic domain. HIRHAM5 has a horizontal resolution of 0.25° and uses 40 pressure-based and terrain-following vertical levels. In comparison with the satellite observations, the HIRHAM5 control run (HH5ctrl) systematically overestimates total cloud cover, but to a lesser extent than ERAint. The underestimation of high- and mid-level clouds is strongly outweighed by the overestimation of low-level clouds. Numerous sensitivity studies with HIRHAM5-SCM suggest combining (1) parameter tuning that enables a more efficient Bergeron-Findeisen process with (2) an extension of the prognostic-statistical (PS) cloud scheme that enables the use of negatively skewed beta distributions. This improved model setup was then used in a corresponding HIRHAM5 sensitivity run (HH5sens). While the simulated high- and mid-level cloud cover is improved only to a limited extent, the large overestimation of low-level clouds can be systematically and significantly reduced, especially over sea ice. Consequently, the multi-year annual mean area average of total cloud cover over sea ice is almost 14% lower than in HH5ctrl. Overall, HH5sens slightly underestimates the observed total cloud cover but shows a halved multi-year annual mean bias of 2.2% relative to CPR/CALIOP at all latitudes north of 60° N. Importantly, HH5sens produces a more realistic ratio between the cloud water and ice content. The considerably improved cloud simulation manifests itself in a more correct radiative transfer and a better energy budget in the atmospheric boundary layer, and also results in a more realistic surface energy budget associated with more reasonable turbulent fluxes. All this mitigates the positive temperature, relative humidity, and horizontal wind speed biases in the lower model levels.

  1. Hierarchical coarse-graining strategy for protein-membrane systems to access mesoscopic scales

    PubMed Central

    Ayton, Gary S.; Lyman, Edward

    2014-01-01

    An overall multiscale simulation strategy for large scale coarse-grain simulations of membrane protein systems is presented. The protein is modeled as a heterogeneous elastic network, while the lipids are modeled using the hybrid analytic-systematic (HAS) methodology, where in both cases atomistic level information obtained from molecular dynamics simulation is used to parameterize the model. A feature of this approach is that from the outset liposome length scales are employed in the simulation (i.e., on the order of ½ a million lipids plus protein). A route to develop highly coarse-grained models from molecular-scale information is proposed and results for N-BAR domain protein remodeling of a liposome are presented. PMID:20158037

  2. Design of 3D simulation engine for oilfield safety training

    NASA Astrophysics Data System (ADS)

    Li, Hua-Ming; Kang, Bao-Sheng

    2015-03-01

    Aiming at the demand for rapid custom development of 3D simulation systems for oilfield safety training, this paper designs and implements a 3D simulation engine based on a script-driven method, a multi-layer structure, pre-defined entity objects, and high-level tools such as a scene editor, script editor, and program loader. A scripting language has been defined to control the system's progress, events, and operating results. A training teacher can use this engine to edit 3D virtual scenes, set the properties of entity objects, define the logic script of a task, and produce a 3D simulation training system without any programming skills. Through expanding entity classes, this engine can be quickly applied to other virtual training areas.
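
    As a rough illustration of the script-driven idea described above (the paper's actual scripting syntax is not given), the sketch below treats the training logic as data interpreted by a small engine loop; all entity names, properties, and events are hypothetical.

```python
# Hedged sketch of a "script-driven" training task: the logic is expressed as
# data (a Python dict standing in for the paper's scripting language) and a
# small interpreter applies it each frame. Entities and events are invented.
task_script = {
    "entities": [
        {"name": "valve_A", "type": "Valve", "position": (10.0, 0.0, 3.5), "open": False},
        {"name": "alarm_1", "type": "Alarm", "position": (12.0, 2.0, 4.0), "active": False},
    ],
    "rules": [
        # (trigger event, list of (entity, attribute, new value)) pairs
        ("gas_leak_detected", [("alarm_1", "active", True)]),
        ("valve_A_closed",    [("alarm_1", "active", False)]),
    ],
}

def run_step(script, state, events):
    """Apply the scripted rules for the events raised this frame."""
    for trigger, actions in script["rules"]:
        if trigger in events:
            for entity, attr, value in actions:
                state[entity][attr] = value
    return state

state = {e["name"]: dict(e) for e in task_script["entities"]}
state = run_step(task_script, state, events={"gas_leak_detected"})
print(state["alarm_1"]["active"])   # True: the scripted rule fired
```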

  3. Undergraduate interprofessional education using high-fidelity paediatric simulation.

    PubMed

    Stewart, Moira; Kennedy, Neil; Cuene-Grandidier, Hazel

    2010-06-01

    High-fidelity simulation is becoming increasingly important in the delivery of teaching and learning to health care professionals within a safe environment. Its use in an interprofessional context and at undergraduate level has the potential to facilitate the learning of good communication and teamworking, in addition to clinical knowledge and skills. Interprofessional teaching and learning workshops using high-fidelity paediatric simulation were developed and delivered to undergraduate medical and nursing students at Queen's University Belfast. Learning outcomes common to both professions, and essential in the clinical management of sick children, included basic competencies, communication and teamworking skills. Quantitative and qualitative evaluation was undertaken using published questionnaires. Quantitative results - the 32-item questionnaire was analysed for reliability using spss. Responses were positive for both groups of students across four domains - acquisition of knowledge and skills, communication and teamworking, professional identity and role awareness, and attitudes to shared learning. Qualitative results - thematic content analysis was used to analyse open-ended responses. Students from both groups commented that an interprofessional education (IPE) approach to paediatric simulation improved clinical and practice-based skills, and provided a safe learning environment. Students commented that there should be more interprofessional and simulation learning opportunities. High-fidelity paediatric simulation, used in an interprofessional context, has the potential to meet the requirements of undergraduate medical and nursing curricula. Further research is needed into the long-term benefits for patient care, and its generalisability to other areas within health care teaching and learning. © Blackwell Publishing Ltd 2010.

  4. Sensitivity Analysis for CO2 Retrieval using GOSAT-2 FTS-2 Simulator

    NASA Astrophysics Data System (ADS)

    Kamei, Akihide; Yoshida, Yukio; Dupuy, Eric; Yokota, Yasuhiro; Hiraki, Kaduo; Matsunaga, Tsuneo

    2015-04-01

    The Greenhouse Gases Observing Satellite (GOSAT), launched in 2009, is the world's first satellite dedicated to global greenhouse gases observation. GOSAT-2, the successor mission to GOSAT, is scheduled for launch in early 2018. The Fourier Transform Spectrometer-2 (FTS-2) is the primary sensor onboard GOSAT-2. It observes infrared light reflected and emitted from the Earth's surface and atmosphere. The FTS-2 obtains high resolution spectra using three bands in the near to short-wavelength infrared (SWIR) region and two bands in the thermal infrared (TIR) region. Column amounts and vertical profiles of carbon dioxide (CO2) and methane (CH4) are retrieved from the radiance spectra obtained with the SWIR and TIR bands, respectively. Further, compared to the FTS onboard the GOSAT, the FTS-2 has several improvements: 1) added spectral coverage in the SWIR region for carbon monoxide (CO) retrieval, 2) increased signal-to-noise ratio (SNR) for all bands, 3) extended range of along-track pointing angles for sunglint observations, 4) intelligent pointing to avoid cloud contamination. Since 2012, we have been developing a simulator software to simulate the spectral radiance data that will be acquired by the GOSAT-2 FTS-2. The purpose of the GOSAT-2 FTS-2 simulator is to analyze/optimize data with respect to the sensor specification, the parameters for Level 1 processing, and the improvement of the Level 2 algorithms. The GOSAT-2 FTS-2 simulator includes the six components: 1) overall control, 2) sensor carrying platform, 3) spectral radiance calculation, 4) Fourier Transform module, 5) Level 1B (L1B) processing, and 6) L1B data output. It has been installed on the GOSAT Research Computation Facility (GOSAT RCF), which is a high-performance and energy-efficient supercomputer. More realistic and faster simulations have been made possible by the improvement of the details of sensor characteristics, the sophistication of the data processing and algorithms, the addition of the various observing modes including calibration observation, the use of surface and atmospheric ancillary data for radiative transfer calculation, and the speed-up and parallelization of the radiative transfer code. We will summarize the current status and the future plans in the development of the GOSAT-2 FTS-2 simulator. We will also demonstrate the reproduction of GOSAT FTS L1B data and present the sensitivity analysis relating to the engineering parameters, the aerosols and clouds, and so on, on the Level 1 processing for CO2 retrieval using latest version of the GOSAT-2 FTS-2 simulator.

  5. Preliminary Numerical Simulation of IR Structure Development in a Hypothetical Uranium Release.

    DTIC Science & Technology

    1981-11-16

    Report documentation keywords: IR structure; power spectrum; uranium release; parallax effects; numerical simulation; PHARO code; isophots; LWIR. Abstract (fragments): ...release at 200 km altitude. Of interest is the LWIR emission from uranium oxide ions, induced by sunlight and earthshine. Assuming a one-level fluid... defense systems of long wave infrared (LWIR) emissions from metallic oxides in the debris from a high altitude nuclear explosion (HANE) is an...

  6. Modelling and Simulation in the Design Process of Armored Vehicles

    DTIC Science & Technology

    2003-03-01

    Abstract (fragments): ...trackway conditions is a demanding optimization task. Basically, a high level of ride comfort requires soft suspension tuning, whereas driving safety relies... The maximum off-road speed is generally limited by traction, input torque, driving safety and ride comfort. When obstacles are to be negotiated, the... wheel travel was defined during the mobility simulation runs. (Figure 14 caption: Ramp 1.5 m at 40 kph; virtual and physical prototype.) ...Driving safety and ride...

  7. The theoretical simulation on electrostatic distribution of 1st proximity region in proximity focusing low-light-level image intensifier

    NASA Astrophysics Data System (ADS)

    Zhang, Liandong; Bai, Xiaofeng; Song, De; Fu, Shencheng; Li, Ye; Duanmu, Qingduo

    2015-03-01

    Low-light-level night vision technology amplifies a low-light-level signal until it is bright enough to be seen by the naked eye, using photons and photoelectrons as the information carriers. Only after the micro-channel plate was invented did high performance and miniaturization of low-light-level night vision devices become possible. The device considered here is a double-proximity focusing low-light-level image intensifier, which places a micro-channel plate close to both the photocathode and the phosphor screen. The advantages of proximity focusing low-light-level night vision include small size, light weight, low power consumption, no distortion, fast response, and wide dynamic range. The micro-channel plate (with metal electrodes on both faces), the photocathode, and the phosphor screen are placed parallel to one another. When the image intensifier operates, a voltage is applied between the photocathode and the input of the micro-channel plate. Electrons emitted from the photocathode by incident photons move towards the micro-channel plate under the electric field in the 1st proximity focusing region and are then multiplied inside the micro-channels. The trajectories of the emitted electrons can be calculated and simulated once the electrostatic field and equipotential-line distributions in the 1st proximity focusing region are determined, and from these the resolution of the image tube can be derived. However, these distributions are complex because of the large number of micro-channels in the micro-channel plate. This paper simulates the electrostatic distribution of the 1st proximity region in a double-proximity focusing low-light-level image intensifier with the finite element analysis software Ansoft Maxwell 3D. The electrostatic field distributions of the 1st proximity region are compared as the micro-channel plate's pore size, spacing, and inclination angle are varied. We believe that the electron beam trajectories in the 1st proximity region can be better simulated once the electrostatic fields are simulated.
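
    The following is a minimal sketch of the electron-locus calculation mentioned above for the idealized case of a uniform field across the 1st proximity gap; the paper's finite element simulation resolves the non-uniform field near the micro-channels, and all numbers here are illustrative assumptions.

```python
# Hedged sketch: electron transit in the 1st proximity region for a *uniform*
# field between photocathode and MCP input. The paper's Ansoft Maxwell 3D model
# resolves the non-uniform field near the micro-channels; every number below is
# an illustrative assumption.
import math

Q_E = 1.602e-19   # electron charge [C]
M_E = 9.109e-31   # electron mass [kg]

def transit_and_spread(gap_m, voltage_v, init_energy_ev, emission_angle_rad):
    """Return transit time [s] and lateral displacement [m] at the MCP input."""
    a = Q_E * voltage_v / (gap_m * M_E)                  # uniform-field acceleration
    v0 = math.sqrt(2.0 * init_energy_ev * Q_E / M_E)     # speed from emission energy
    v_long = v0 * math.cos(emission_angle_rad)
    v_trans = v0 * math.sin(emission_angle_rad)
    # solve gap = v_long*t + 0.5*a*t**2 for the transit time t
    t = (-v_long + math.sqrt(v_long ** 2 + 2.0 * a * gap_m)) / a
    return t, v_trans * t

t, dx = transit_and_spread(gap_m=0.2e-3, voltage_v=200.0,
                           init_energy_ev=0.2, emission_angle_rad=math.pi / 4)
print(f"transit ~{t * 1e12:.0f} ps, lateral spread ~{dx * 1e6:.1f} um")
```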

  8. High Resolution Aerospace Applications using the NASA Columbia Supercomputer

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri J.; Aftosmis, Michael J.; Berger, Marsha

    2005-01-01

    This paper focuses on the parallel performance of two high-performance aerodynamic simulation packages on the newly installed NASA Columbia supercomputer. These packages include both a high-fidelity, unstructured, Reynolds-averaged Navier-Stokes solver, and a fully-automated inviscid flow package for cut-cell Cartesian grids. The complementary combination of these two simulation codes enables high-fidelity characterization of aerospace vehicle design performance over the entire flight envelope through extensive parametric analysis and detailed simulation of critical regions of the flight envelope. Both packages are industrial-level codes designed for complex geometry and incorporate customized multigrid solution algorithms. The performance of these codes on Columbia is examined using both MPI and OpenMP and using both the NUMAlink and InfiniBand interconnect fabrics. Numerical results demonstrate good scalability on up to 2016 CPUs using the NUMAlink4 interconnect, with measured computational rates in the vicinity of 3 TFLOP/s, while InfiniBand showed some performance degradation at high CPU counts, particularly with multigrid. Nonetheless, the results are encouraging enough to indicate that larger test cases using combined MPI/OpenMP communication should scale well on even more processors.

  9. An innovative approach for modeling and simulation of an automated industrial robotic arm operated electro-pneumatically

    NASA Astrophysics Data System (ADS)

    Popa, L.; Popa, V.

    2017-08-01

    The article focuses on modeling an electro-pneumatically operated automated industrial robotic arm and on simulating its operation. The graphic language FBD (Function Block Diagram) is used to program the robotic arm on a Zelio Logic automation controller. The innovative modeling and simulation procedures address specific problems in the development of a new type of technical product in the field of robotics. New applications of a Programmable Logic Controller (PLC) were thus identified, as a specialized computer performing control functions of a high level of complexity.

  10. Delegate, Collaborate, or Consult? A Capstone Simulation for Senior Nursing Students.

    PubMed

    Nowell, Lorelli S

    2016-01-01

    Clinical experiences are educational and fulfilling for both students and faculty; however, challenges arise in providing students with a variety of experiences where the leadership skills of prioritizing, collaborating, consulting, and delegating care can be developed. This article reports on a capstone simulation created to develop and sustain the prioritization, organization, and delegation skills of fourth year nursing students. Through the introduction of a multipatient simulation prior to graduation, nursing students will have a better understanding of the high-level leadership skills practicing registered nurses must possess in today's demanding health care environment.

  11. Obstetric team simulation program challenges.

    PubMed

    Bullough, A S; Wagner, S; Boland, T; Waters, T P; Kim, K; Adams, W

    2016-12-01

    To describe the challenges associated with the development and assessment of an obstetric emergency team simulation program. The goal was to develop a hybrid, in-situ and high fidelity obstetric emergency team simulation program that incorporated weekly simulation sessions on the labor and delivery unit, and quarterly, education protected sessions in the simulation center. All simulation sessions were video-recorded and reviewed. Labor and delivery unit and simulation center. Medical staff covering labor and delivery, anesthesiology and obstetric residents and obstetric nurses. Assessments included an on-line knowledge multiple-choice questionnaire about the simulation scenarios. This was completed prior to the initial in-situ simulation session and repeated 3 months later, the Clinical Teamwork Scale with inter-rater reliability, participant confidence surveys and subjective participant satisfaction. A web-based curriculum comprising modules on communication skills, team challenges, and team obstetric emergency scenarios was also developed. Over 4 months, only 6 labor and delivery unit in-situ sessions out of a possible 14 sessions were carried out. Four high-fidelity sessions were performed in 2 quarterly education protected meetings in the simulation center. Information technology difficulties led to the completion of only 18 pre/post web-based multiple-choice questionnaires. These test results showed no significant improvement in raw score performance from pre-test to post-test (P=.27). During Clinical Teamwork Scale live and video assessment, trained raters and program faculty were in agreement only 31% and 28% of the time, respectively (Kendall's W=.31, P<.001 and W=.28, P<.001). Participant confidence surveys overall revealed confidence significantly increased (P<.05), from pre-scenario briefing to after post-scenario debriefing. Program feedback indicates a high level of participant satisfaction and improved confidence yet further program refinement is required. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. A review of the curriculum development process of simulation-based educational intervention studies in Korea.

    PubMed

    Lee, Ju-Young; Lee, Soon Hee; Kim, Jung-Hee

    2018-05-01

    Despite the increase in simulators at nursing schools and the high expectations regarding simulation for nursing education, the unique features of integrating simulation-based education into the curriculum are unclear. The purpose of this study was to assess the curriculum development process of simulation-based educational interventions in nursing in Korea. Integrative review of literature used. Korean Studies Information Services System (KISS), Korean Medical Database (KMbase), KoreaMed, Research Information Sharing Service (RISS), and National Digital Library (NDL). Comprehensive databases were searched for records without a time limit (until December 2016), using terms such as "nursing," "simulation," and "education." A total of 1006 studies were screened. According to the model for simulation-based curriculum development (Khamis et al., 2016), the quality of reporting on the curriculum development was reviewed. A total of 125 papers were included in this review. In three studies, simulation scenarios were made from easy to difficulty levels, and none of the studies presented the level of learners' proficiency. Only 17.6% of the studies reported faculty development or preparation. The inter-rater reliability was presented in performance test by 24 studies and two studies evaluated the long-term effects of simulation education although there was no statistically significant change in terms of publication years. These findings suggest that educators and researchers should pay more attention to the educational strategies to integrate simulation into nursing education. It could contribute to guiding educators and researchers to develop a simulation-based curriculum and improve the quality of nursing education research. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. A kinematic/kinetic hybrid airplane simulator model : draft.

    DOT National Transportation Integrated Search

    2008-01-01

    A kinematics-based flight model, for normal flight regimes, currently uses precise flight data to achieve a high level of aircraft realism. However, it was desired to further increase the model's accuracy, without a substantial increase in ...

  14. A kinematic/kinetic hybrid airplane simulator model.

    DOT National Transportation Integrated Search

    2008-01-01

    A kinematics-based flight model, for normal flight regimes, currently uses precise flight data to achieve a high level of aircraft realism. However, it was desired to further increase the model's accuracy, without a substantial increase in ...

  15. Voltage-Load Sensitivity Matrix Based Demand Response for Voltage Control in High Solar Penetration Distribution Feeders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Xiangqi; Wang, Jiyu; Mulcahy, David

    This paper presents a voltage-load sensitivity matrix (VLSM) based voltage control method to deploy demand response resources for controlling voltage in high solar penetration distribution feeders. The IEEE 123-bus system in OpenDSS is used for testing the performance of the preliminary VLSM-based voltage control approach. A load disaggregation process is applied to disaggregate the total load profile at the feeder head to each load node along the feeder so that loads are modeled at residential house level. Measured solar generation profiles are used in the simulation to model the impact of solar power on distribution feeder voltage profiles. Different case studies involving various PV penetration levels and installation locations have been performed. Simulation results show that the VLSM algorithm performance meets the voltage control requirements and is an effective voltage control strategy.
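
    A minimal sketch of the core VLSM relation may help: a sensitivity matrix S maps nodal load changes to nodal voltage changes (ΔV ≈ S·ΔP), so the demand-response adjustment needed to clear a voltage violation can be estimated by a least-squares solve. The 3-node matrix below is invented for illustration and is not the paper's IEEE 123-bus data, nor necessarily its exact control algorithm.

```python
# Hedged sketch of the VLSM relation dV ~= S @ dP. The 3x3 sensitivity matrix
# (p.u. volt per kW; negative because added load depresses voltage) and the
# measured voltages are invented for illustration only.
import numpy as np

S = np.array([[-0.004, -0.001,  0.000],
              [-0.001, -0.005, -0.002],
              [ 0.000, -0.002, -0.006]])

v_measured = np.array([0.99, 1.06, 1.07])        # p.u.; nodes 2-3 exceed a 1.05 limit
v_target = np.clip(v_measured, 0.95, 1.05)       # pull violations back to the limit
dv_needed = v_target - v_measured

# Least-squares estimate of the load change (kW) at each node that produces the
# required voltage change; positive entries mean added demand-response load,
# negative entries mean load reduction.
dp, *_ = np.linalg.lstsq(S, dv_needed, rcond=None)
print(np.round(dp, 1))
```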

  16. A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales

    PubMed Central

    Ayton, Gary S.; Voth, Gregory A.

    2009-01-01

    A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic as one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center of mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well known Gay-Berne ellipsoid of revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an “aggressive” CG methodology designed to model multi-component biological membranes at very large length and timescales. PMID:19281167

  17. Dissipative particle dynamics of diffusion-NMR requires high Schmidt-numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Azhar, Mueed; Greiner, Andreas; Korvink, Jan G., E-mail: jan.korvink@kit.edu, E-mail: david.kauzlaric@imtek.uni-freiburg.de

    We present an efficient mesoscale model to simulate the diffusion measurement with nuclear magnetic resonance (NMR). On the level of mesoscopic thermal motion of fluid particles, we couple the Bloch equations with dissipative particle dynamics (DPD). Thereby we establish a physically consistent scaling relation between the diffusion constant measured for DPD-particles and the diffusion constant of a real fluid. The latter is based on a splitting into a centre-of-mass contribution represented by DPD, and an internal contribution which is not resolved in the DPD-level of description. As a consequence, simulating the centre-of-mass contribution with DPD requires high Schmidt numbers. After a verification for fundamental pulse sequences, we apply the NMR-DPD method to NMR diffusion measurements of anisotropic fluids, and of fluids restricted by walls of microfluidic channels. For the latter, the free diffusion and the localisation regime are considered.

  18. Corrosion behavior of Alloy 690 and Alloy 693 in simulated nuclear high level waste medium

    NASA Astrophysics Data System (ADS)

    Samantaroy, Pradeep Kumar; Suresh, Girija; Paul, Ranita; Kamachi Mudali, U.; Raj, Baldev

    2011-11-01

    Nickel based alloys are candidate materials for the storage of high level waste (HLW) generated from reprocessing of spent nuclear fuel. In the present investigation Alloy 690 and Alloy 693 are assessed by the potentiodynamic anodic polarization technique for their corrosion behavior in 3 M HNO3, 3 M HNO3 containing simulated HLW, and in chloride medium. Both alloys were found to possess good corrosion resistance in both media at ambient conditions. Microstructural examination was carried out by SEM for both alloys after electrolytic etching. Compositional analysis of the passive film formed on the alloys in 3 M HNO3 and 3 M HNO3 with HLW was carried out by XPS. The surfaces of Alloy 690 and Alloy 693 both consist of a thin oxide layer of Ni, Cr, and Fe formed under passivation in both media. The results of the investigation are presented in the paper.

  19. Viscoplasticity of simulated high-level radioactive waste glass containing platinum group metal particles

    NASA Astrophysics Data System (ADS)

    Uruga, Kazuyoshi; Usami, Tsuyoshi; Tsukada, Takeshi; Komamine, Satoshi; Ochi, Eiji

    2014-09-01

    The shear rate dependency of the viscosity of three simulated high-level radioactive waste glasses containing 0, 1.2 and 4.5 wt% platinum group metals (PGMs) was examined over a temperature range of 1173-1473 K with a rotating viscometer. The shear stress at zero shear rate, i.e. the yield stress, was also measured by the capillary method. The glass containing no PGM behaved as a shear-rate-independent Newtonian fluid. In contrast, the apparent viscosity of the glasses containing PGMs increased with decreasing shear rate, and nonzero yield stresses were detected for both of these glasses. The viscosity and yield stress of the glass containing 4.5 wt% PGMs were roughly one to two orders of magnitude greater than those of the glass containing 1.2 wt% PGMs. These viscoplastic properties were numerically expressed by the Casson equation.
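
    For reference, the Casson model referred to above is commonly written as follows (a standard form, not reproduced from the paper):

```latex
% Casson model: tau is shear stress, \dot{\gamma} the shear rate, tau_0 the
% yield stress, and eta the Casson (plastic) viscosity.
\sqrt{\tau} = \sqrt{\tau_0} + \sqrt{\eta\,\dot{\gamma}},
\qquad
\eta_{\mathrm{app}}(\dot{\gamma}) = \frac{\tau}{\dot{\gamma}}
  = \left(\sqrt{\eta} + \sqrt{\tau_0/\dot{\gamma}}\right)^{2}
```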

  20. Implementing a Cardiac Skills Orientation and Simulation Program.

    PubMed

    Hemingway, Maureen W; Osgood, Patrice; Mannion, Mildred

    2018-02-01

    Patients with cardiac morbidities admitted for cardiac surgical procedures require perioperative nurses with a high level of complex nursing skills. Orienting new cardiac team members takes commitment and perseverance in light of variable staffing levels, high-acuity patient populations, an active cardiac surgical schedule, and the unpredictability of scheduling patients undergoing cardiac transplantation. At an academic medical center in Boston, these issues presented opportunities to orient new staff members to the scrub person role, but hampered efforts to provide active learning opportunities in a safe environment. As a result, facility personnel created a program to increase new staff members' skills, confidence, and proficiency, while also increasing the number of staff members who were proficient at scrubbing complex cardiac procedures. To address the safe learning requirement, personnel designed a simulation program to provide scrubbing experience, decrease orientees' supervision time, and increase staff members' confidence in performing the scrub person role. © AORN, Inc, 2018.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Langer, Steven H.; Karlin, Ian; Marinak, Marty M.

    HYDRA is used to simulate a variety of experiments carried out at the National Ignition Facility (NIF) [4] and other high energy density physics facilities. HYDRA has packages to simulate radiation transfer, atomic physics, hydrodynamics, laser propagation, and a number of other physics effects. HYDRA has over one million lines of code and includes both MPI and thread-level (OpenMP and pthreads) parallelism. This paper measures the performance characteristics of HYDRA using hardware counters on an IBM BlueGene/Q system. We report key ratios such as bytes/instruction and memory bandwidth for several different physics packages. The total number of bytes read and written per time step is also reported. We show that none of the packages which use significant time are memory bandwidth limited on a Blue Gene/Q. HYDRA currently issues very few SIMD instructions. The pressure on memory bandwidth will increase if high levels of SIMD instructions can be achieved.

  2. Exhaust emission calibration of two J-58 afterburning turbojet engines at simulated high-altitude, supersonic flight conditions

    NASA Technical Reports Server (NTRS)

    Holdeman, J. D.

    1976-01-01

    Emissions of total oxides of nitrogen, nitric oxide, unburned hydrocarbons, carbon monoxide, and carbon dioxide from two J-58 afterburning turbojet engines at simulated high-altitude flight conditions are reported. Test conditions included flight speeds from Mach 2 to 3 at altitudes from 16.0 to 23.5 km. For each flight condition exhaust measurements were made for four or five power levels, from maximum power without afterburning through maximum afterburning. The data show that exhaust emissions vary with flight speed, altitude, power level, and radial position across the exhaust. Oxides of nitrogen emissions decreased with increasing altitude and increased with increasing flight speed. Oxides of nitrogen emission indices with afterburning were less than half the value without afterburning. Carbon monoxide and hydrocarbon emissions increased with increasing altitude and decreased with increasing flight speed. Emissions of these species were substantially higher with afterburning than without.

  3. Estimating Driving Performance Based on EEG Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Wu, Ruei-Cheng; Jung, Tzyy-Ping; Liang, Sheng-Fu; Huang, Teng-Yi

    2005-12-01

    The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by drivers' drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents caused by drowsiness is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectrum, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
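
    A schematic sketch of the estimation chain described above (log subband power spectrum, dimensionality reduction, linear regression) is given below using synthetic data; the channel counts, frequency bands, and epoch lengths are illustrative assumptions, and the correlation-based feature selection step is omitted.

```python
# Hedged sketch of the described pipeline: log sub-band power spectra -> PCA ->
# linear regression onto driving performance (lane deviation). Synthetic data
# stand in for the EEG and driving-simulator recordings.
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
fs, n_epochs, n_channels, epoch_len = 250, 200, 4, 2 * 250   # 2-s epochs (illustrative)
eeg = rng.standard_normal((n_epochs, n_channels, epoch_len))
lane_deviation = rng.standard_normal(n_epochs)                # proxy "driving performance"

def log_band_power(epoch, bands=((1, 4), (4, 8), (8, 13), (13, 30))):
    """Log power in canonical EEG bands for every channel of one epoch."""
    feats = []
    for ch in epoch:
        f, pxx = welch(ch, fs=fs, nperseg=128)
        feats += [np.log(pxx[(f >= lo) & (f < hi)].mean()) for lo, hi in bands]
    return feats

X = np.array([log_band_power(e) for e in eeg])
X_red = PCA(n_components=5).fit_transform(X)       # dimensionality reduction
model = LinearRegression().fit(X_red, lane_deviation)
print("R^2 on training data:", model.score(X_red, lane_deviation))
```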

  4. Efficient high-quality volume rendering of SPH data.

    PubMed

    Fraedrich, Roland; Auer, Stefan; Westermann, Rüdiger

    2010-01-01

    High quality volume rendering of SPH data requires a complex order-dependent resampling of particle quantities along the view rays. In this paper we present an efficient approach to perform this task using a novel view-space discretization of the simulation domain. Our method draws upon recent work on GPU-based particle voxelization for the efficient resampling of particles into uniform grids. We propose a new technique that leverages a perspective grid to adaptively discretize the view-volume, giving rise to a continuous level-of-detail sampling structure and reducing memory requirements compared to a uniform grid. In combination with a level-of-detail representation of the particle set, the perspective grid allows effectively reducing the amount of primitives to be processed at run-time. We demonstrate the quality and performance of our method for the rendering of fluid and gas dynamics SPH simulations consisting of many millions of particles.
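
    As a rough sketch of the perspective-grid idea (not the paper's GPU voxelization), the function below maps a view-space point to a cell whose depth extent grows with distance from the camera, so that cell size roughly tracks the on-screen footprint; the frustum and resolution parameters are illustrative assumptions.

```python
# Hedged sketch of a "perspective grid": lateral indices are uniform in screen
# space, while the depth index is uniform in log(depth), so slice thickness
# grows with distance from the camera. Resolution numbers are illustrative.
import numpy as np

def perspective_cell(p_view, fov_y=np.radians(45.0), aspect=16.0 / 9.0,
                     z_near=0.1, z_far=100.0, res=(256, 144, 128)):
    """Map a view-space point (x, y, z < 0 in front of the camera) to cell indices."""
    x, y, z = p_view
    depth = -z
    ty = np.tan(fov_y / 2.0) * depth           # half-extent of the frustum at this depth
    tx = ty * aspect
    i = int((x / tx * 0.5 + 0.5) * res[0])     # screen-space horizontal index
    j = int((y / ty * 0.5 + 0.5) * res[1])     # screen-space vertical index
    k = int(np.log(depth / z_near) / np.log(z_far / z_near) * res[2])   # log-spaced depth slice
    return i, j, k

print(perspective_cell(np.array([0.5, 0.2, -3.0])))
```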

  5. Hydroecological impacts of climate change modelled for a lowland UK wetland

    NASA Astrophysics Data System (ADS)

    House, Andrew; Acreman, Mike; Sorensen, James; Thompson, Julian

    2015-04-01

    Conservation management of wetlands often rests on modifying hydrological functions to establish or maintain desired flora and fauna. Hence the ability to predict the impacts of climate change is highly beneficial. Here, the physically based, distributed model MIKE SHE was used to simulate hydrology for the Lambourn Observatory at Boxford, UK. This comprises a 10 ha lowland riparian wetland protected for conservation, where the degree of variability in the peat, gravel and chalk geology has clouded hydrological understanding. Notably, a weathered layer on the chalk aquifer surface seals it from overlying deposits, yet is highly spatially heterogeneous. Long-term monitoring yielded observations of groundwater and surface water levels for model calibration and validation. Simulated results were consistent with observed data and reproduced the effects of seasonal fluctuations and in-channel macrophyte growth. The adjacent river and subsidiary channel were found to act as head boundaries, exerting a general control on water levels across the site. Discrete areas of groundwater upwellings caused raised water levels at distinct locations within the wetland. These were concurrent to regions where the weathered chalk layer is absent. To assess impacts of climate change, outputs from the UK Climate Projections 2009 ensemble of global climate models for the 2080s are used to obtain monthly percentage changes in climate variables. Changes in groundwater levels were taken from a regional model of the Chalk aquifer. Values of precipitation and evapotranspiration were seen to increase, whilst groundwater levels decreased, resulting in the greater dominance of precipitation. The discrete areas of groundwater upwelling were seen to diminish or disappear. Simulated water levels were linked to specific requirements of wetland plants using water table depth zone diagrams. Increasing depth of winter and summer groundwater levels leads to a loss of Glyceria maxima and Phragmites australis, principal habitat for the endangered Vertigo moulinsiana. Further, the reduced influx of base-rich groundwater and increased dominance of high pH rain-fed waters alters the acidity of the soil. This leads to changes in species composition, with potential reductions in Carex paniculata, Caltha palustris and Typha latifolia.

  6. Improved understanding of the recombination rate at inverted p+ silicon surfaces

    NASA Astrophysics Data System (ADS)

    To, Alexander; Ma, Fajun; Hoex, Bram

    2017-08-01

    The effect of positive fixed charge on the recombination rate at SiNx-passivated p+ surfaces is studied in this work. It is shown that a high positive fixed charge on a low defect density, passivated doped surface can result in a near injection-level-independent lifetime in a certain injection level range. This behaviour is modelled with advanced computer simulations using Sentaurus TCAD, which replicate the measurement conditions during a photoconductance-based effective minority carrier lifetime measurement. The resulting simulations show that the shape of the injection level dependent lifetime is a result of the surface recombination rate, which is non-linear because the surface moves into inversion with increasing injection level. As a result, the surface recombination rate switches from being limited by electrons to being limited by holes. Equations describing the surface saturation current density, J0s, during this regime are also derived in this work.

  7. An Initial Examination for Verifying Separation Algorithms by Simulation

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Neogi, Natasha; Herencia-Zapana, Heber

    2012-01-01

    An open question in algorithms for aircraft is what can be validated by simulation where the simulation shows that the probability of undesirable events is below some given level at some confidence level. The problem is including enough realism to be convincing while retaining enough efficiency to run the large number of trials needed for high confidence. The paper first proposes a goal based on the number of flights per year in several regions. The paper examines the probabilistic interpretation of this goal and computes the number of trials needed to establish it at an equivalent confidence level. Since any simulation is likely to consider the algorithms for only one type of event and there are several types of events, the paper examines under what conditions this separate consideration is valid. This paper is an initial effort, and as such, it considers separation maneuvers, which are elementary but include numerous aspects of aircraft behavior. The scenario includes decisions under uncertainty since the position of each aircraft is only known to the other by broadcasting where GPS believes each aircraft to be (ADS-B). Each aircraft operates under feedback control with perturbations. It is shown that a scenario three or four orders of magnitude more complex is feasible. The question of what can be validated by simulation remains open, but there is reason to be optimistic.
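
    One standard way to size such a Monte Carlo campaign, sketched below, is the zero-failure argument: if no undesirable event is observed in n independent trials, the claim P(event) < p is supported at confidence 1 − α once (1 − p)^n ≤ α. The target probabilities in the example are illustrative and are not the paper's figures.

```python
# Hedged sketch of the trial-count arithmetic behind "validation by simulation":
# with zero undesirable events observed in n independent trials, the claim
# P(event) < p holds at confidence 1 - alpha once (1 - p)**n <= alpha.
import math

def trials_needed(p, alpha=0.05):
    """Smallest n with (1 - p)**n <= alpha (zero observed failures)."""
    return math.ceil(math.log(alpha) / math.log(1.0 - p))

for p in (1e-6, 1e-7, 1e-9):
    print(f"P(event) < {p:.0e} at 95% confidence requires ~{trials_needed(p):.2e} trials")
```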

  8. Virtual Reality Compared with Bench-Top Simulation in the Acquisition of Arthroscopic Skill: A Randomized Controlled Trial.

    PubMed

    Banaszek, Daniel; You, Daniel; Chang, Justues; Pickell, Michael; Hesse, Daniel; Hopman, Wilma M; Borschneck, Daniel; Bardana, Davide

    2017-04-05

    Work-hour restrictions as set forth by the Accreditation Council for Graduate Medical Education (ACGME) and other governing bodies have forced training programs to seek out new learning tools to accelerate acquisition of both medical skills and knowledge. As a result, competency-based training has become an important part of residency training. The purpose of this study was to directly compare arthroscopic skill acquisition in both high-fidelity and low-fidelity simulator models and to assess skill transfer from either modality to a cadaveric specimen, simulating intraoperative conditions. Forty surgical novices (pre-clerkship-level medical students) voluntarily participated in this trial. Baseline demographic data, as well as data on arthroscopic knowledge and skill, were collected prior to training. Subjects were randomized to 5-week independent training sessions on a high-fidelity virtual reality arthroscopic simulator or on a bench-top arthroscopic setup, or to an untrained control group. Post-training, subjects were asked to perform a diagnostic arthroscopy on both simulators and in a simulated intraoperative environment on a cadaveric knee. A more difficult surprise task was also incorporated to evaluate skill transfer. Subjects were evaluated using the Global Rating Scale (GRS), the 14-point arthroscopic checklist, and a timer to determine procedural efficiency (time per task). Secondary outcomes focused on objective measures of virtual reality simulator motion analysis. Trainees on both simulators demonstrated a significant improvement (p < 0.05) in arthroscopic skills compared with baseline scores and untrained controls, both in and ex vivo. The virtual reality simulation group consistently outperformed the bench-top model group in the diagnostic arthroscopy crossover tests and in the simulated cadaveric setup. Furthermore, the virtual reality group demonstrated superior skill transfer in the surprise skill transfer task. Both high-fidelity and low-fidelity simulation trainings were effective in arthroscopic skill acquisition. High-fidelity virtual reality simulation was superior to bench-top simulation in the acquisition of arthroscopic skills, both in the laboratory and in vivo. Further clinical investigation is needed to interpret the importance of these results.

  9. High Resolution Orientation Imaging Microscopy

    DTIC Science & Technology

    2012-05-02

    Citations (fragments): ...Structure of In-Situ Deformations of Steel, TMS, San Diego, 2011; 13. Jay Basinger, David Fullwood, Brent Adams, EBSD Detail Extraction for Greater Spatial... Abstract (fragments): Its use has contributed to the development of new steels, aluminum alloys, high-Tc superconductors, electronic materials, lead-free solders, optical... Resolution: The simulated pattern method has been used to recover lattice tetragonality in high-strength low-alloy steels. Since the level of...

  10. Design Notebook for Naval Air Defense Simulation (NADS). Special Programs.

    DTIC Science & Technology

    1982-09-15

    Abstract (fragments): ...provides high level decision making and coordination among the elements of the defending force. A more detailed description of the command center... loiter, cruise, normal intercept, and high speed intercept. Appropriate fuel consumption rates are used for each speed. When on CAP station the... Stand-Off Jammer Aircraft: SW aircraft carry high power electronic transmitting equipment capable of jamming radars and communication channels from...

  11. Design of Accelerator Online Simulator Server Using Structured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, Guobao; /Brookhaven; Chu, Chungming

    2012-07-06

    Model based control plays an important role for a modern accelerator during beam commissioning, beam study, and even daily operation. With a realistic model, beam behaviour can be predicted and therefore effectively controlled. The approach used by most current high level application environments is to use a built-in simulation engine and feed a realistic model into that simulation engine. Instead of this traditional monolithic structure, a new approach using a client-server architecture is under development. An on-line simulator server is accessed via network accessible structured data. With this approach, a user can easily access multiple simulation codes. This paper describes the design, implementation, and current status of PVData, which defines the structured data, and PVAccess, which provides network access to the structured data.

  12. Assessing the accuracy of MISR and MISR-simulated cloud top heights using CloudSat- and CALIPSO-retrieved hydrometeor profiles

    NASA Astrophysics Data System (ADS)

    Hillman, Benjamin R.; Marchand, Roger T.; Ackerman, Thomas P.; Mace, Gerald G.; Benson, Sally

    2017-03-01

    Satellite retrievals of cloud properties are often used in the evaluation of global climate models, and in recent years satellite instrument simulators have been used to account for known retrieval biases in order to make more consistent comparisons between models and retrievals. Many of these simulators have seen little critical evaluation. Here we evaluate the Multiangle Imaging Spectroradiometer (MISR) simulator by using visible extinction profiles retrieved from a combination of CloudSat, CALIPSO, MODIS, and AMSR-E observations as inputs to the MISR simulator and comparing cloud top height statistics from the MISR simulator with those retrieved by MISR. Overall, we find that the occurrence of middle- and high-altitude topped clouds agrees well between MISR retrievals and the MISR-simulated output, with distributions of middle- and high-topped cloud cover typically agreeing to better than 5% in both zonal and regional averages. However, there are significant differences in the occurrence of low-topped clouds between MISR retrievals and MISR-simulated output that are due to differences in the detection of low-level clouds between MISR and the combined retrievals used to drive the MISR simulator, rather than due to errors in the MISR simulator cloud top height adjustment. This difference highlights the importance of sensor resolution and boundary layer cloud spatial structure in determining low-altitude cloud cover. The MISR-simulated and MISR-retrieved cloud optical depth also show systematic differences, which are also likely due in part to cloud spatial structure.

  13. Concurrent heterogeneous neural model simulation on real-time neuromimetic hardware.

    PubMed

    Rast, Alexander; Galluppi, Francesco; Davies, Sergio; Plana, Luis; Patterson, Cameron; Sharp, Thomas; Lester, David; Furber, Steve

    2011-11-01

    Dedicated hardware is becoming increasingly essential to simulate emerging very-large-scale neural models. Equally, however, it needs to be able to support multiple models of the neural dynamics, possibly operating simultaneously within the same system. This may be necessary either to simulate large models with heterogeneous neural types, or to simplify simulation and analysis of detailed, complex models in a large simulation by isolating the new model to a small subpopulation of a larger overall network. The SpiNNaker neuromimetic chip is a dedicated neural processor able to support such heterogeneous simulations. Implementing these models on-chip uses an integrated library-based tool chain incorporating the emerging PyNN interface that allows a modeller to input a high-level description and use an automated process to generate an on-chip simulation. Simulations using both LIF and Izhikevich models demonstrate the ability of the SpiNNaker system to generate and simulate heterogeneous networks on-chip, while illustrating, through the network-scale effects of wavefront synchronisation and burst gating, methods that can provide effective behavioural abstractions for large-scale hardware modelling. SpiNNaker's asynchronous virtual architecture permits greater scope for model exploration, with scalable levels of functional and temporal abstraction, than conventional (or neuromorphic) computing platforms. The complete system illustrates a potential path to understanding the neural model of computation, by building (and breaking) neural models at various scales, connecting the blocks, then comparing them against the biology: computational cognitive neuroscience. Copyright © 2011 Elsevier Ltd. All rights reserved.
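
    As a plain-Python reference for one of the neuron models mentioned above, the sketch below integrates the standard Izhikevich equations with regular-spiking parameters; on SpiNNaker the equivalent model is described through PyNN and executed on-chip, so this is only a host-side illustration, not the SpiNNaker code.

```python
# Hedged sketch: the standard Izhikevich neuron equations with regular-spiking
# parameters (a, b, c, d). Not the on-chip SpiNNaker implementation.

def izhikevich(I, dt=1.0, t_max=200.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Integrate v' = 0.04v^2 + 5v + 140 - u + I, u' = a(bv - u); return spike times [ms]."""
    v, u, spikes = c, b * c, []
    for step in range(int(t_max / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:               # spike: reset membrane and adapt recovery variable
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

print(izhikevich(I=10.0))           # a handful of regular spikes over 200 ms
```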

  14. Real-time cavity simulator-based low-level radio-frequency test bench and applications for accelerators

    NASA Astrophysics Data System (ADS)

    Qiu, Feng; Michizono, Shinichiro; Miura, Takako; Matsumoto, Toshihiro; Liu, Na; Wibowo, Sigit Basuki

    2018-03-01

    A low-level radio-frequency (LLRF) control system is required to regulate the rf field in the rf cavity used for beam acceleration. As the LLRF system is usually complex, testing of the basic functions or control algorithms of this system in real time and in advance of beam commissioning is strongly recommended. However, the equipment necessary to test the LLRF system, such as superconducting cavities and high-power rf sources, is very expensive; therefore, we have developed a field-programmable gate array (FPGA)-based cavity simulator as a substitute for real rf cavities. Digital models of the cavity and other rf systems are implemented in the FPGA. The main components include cavity baseband models for the fundamental and parasitic modes, a mechanical model of the Lorentz force detuning, and a model of the beam current. Furthermore, in our simulator, the disturbance model used to simulate the power-supply ripples and microphonics is also carefully considered. Based on the presented cavity simulator, we have established an LLRF system test bench that can be applied to different cavity operational conditions. The simulator performance has been verified by comparison with real cavities in KEK accelerators. In this paper, the development and implementation of this cavity simulator is presented first, and the LLRF test bench based on the presented simulator is constructed. The results are then compared with those for KEK accelerators. Finally, several LLRF applications of the cavity simulator are illustrated.
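
    The core of such a cavity simulator is a baseband (complex-envelope) cavity model; a minimal sketch is given below, with a first-order Lorentz-force-detuning state driven by the stored field. The equations are the generic textbook form and the parameter values are illustrative assumptions, not the KEK FPGA implementation.

```python
# Hedged sketch of a baseband cavity model: dV/dt = (-w_half + 1j*dw)*V +
# w_half*K*u, with a first-order Lorentz-force detuning state driven by |V|^2.
# All parameters are illustrative; the real simulator also models parasitic
# modes, beam loading, power-supply ripple and microphonics.
import numpy as np

def simulate_cavity(u_drive, dt=1e-6, f_half=200.0, k_lfd=-1.0, tau_m=1e-3, gain=1.0):
    """Step a driven cavity envelope V (arbitrary units) and its LFD detuning dw."""
    w_half = 2.0 * np.pi * f_half          # cavity half-bandwidth [rad/s]
    V, dw = 0.0 + 0.0j, 0.0                # envelope and Lorentz-force detuning [rad/s]
    out = np.empty(len(u_drive), dtype=complex)
    for n, u in enumerate(u_drive):
        V += dt * ((-w_half + 1j * dw) * V + w_half * gain * u)
        dw += dt * (-dw + 2.0 * np.pi * k_lfd * abs(V) ** 2) / tau_m   # 1st-order mechanical mode
        out[n] = V
    return out

trace = simulate_cavity(np.ones(5000))     # step response of the envelope over 5 ms
print(abs(trace[-1]))                      # settles near the drive level (~1.0)
```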

  15. Orion Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Hoelscher, Brian R.

    2007-01-01

    The Orion Entry, Descent, and Landing simulation was created over the past two years to serve as the primary Crew Exploration Vehicle guidance, navigation, and control (GN&C) design and analysis tool at the National Aeronautics and Space Administration (NASA). The Advanced NASA Technology Architecture for Exploration Studies (ANTARES) simulation is a six degree-of-freedom tool with a unique design architecture which has a high level of flexibility. This paper describes the decision history and motivations that guided the creation of this simulation tool. The capabilities of the models within ANTARES are presented in detail. Special attention is given to features of the highly flexible GN&C architecture and the details of the implemented GN&C algorithms. ANTARES provides a foundation simulation for the Orion Project that has already been successfully used for requirements analysis, system definition analysis, and preliminary GN&C design analysis. ANTARES will find useful application in engineering analysis, mission operations, crew training, avionics-in-the-loop testing, etc. This paper focuses on the entry simulation aspect of ANTARES, which is part of a bigger simulation package supporting the entire mission profile of the Orion vehicle. The unique aspects of entry GN&C design are covered, including how the simulation is being used for Monte Carlo dispersion analysis and for support of linear stability analysis. Sample simulation output from ANTARES is presented in an appendix.

  16. Evaluation of components, subsystems, and networks for high rate, high frequency space communications

    NASA Technical Reports Server (NTRS)

    Kerczewski, Robert J.; Ivancic, William D.; Zuzek, John E.

    1991-01-01

    The development of new space communications technologies by NASA has included both commercial applications and space science requirements. NASA's Systems Integration, Test and Evaluation (SITE) Space Communication System Simulator is a hardware based laboratory simulator for evaluating space communications technologies at the component, subsystem, system, and network level, geared toward high frequency, high data rate systems. The SITE facility is well-suited for evaluation of the new technologies required for the Space Exploration Initiative (SEI) and advanced commercial systems. Described here are the technology developments and evaluation requirements for current and planned commercial and space science programs. Also examined are the capabilities of SITE, the past, present and planned future configurations of the SITE facility, and applications of SITE to evaluation of SEI technology.

  17. Extending molecular simulation time scales: Parallel in time integrations for high-level quantum chemistry and complex force representations

    NASA Astrophysics Data System (ADS)

    Bylaska, Eric J.; Weare, Jonathan Q.; Weare, John H.

    2013-08-01

    Parallel in time simulation algorithms are presented and applied to conventional molecular dynamics (MD) and ab initio molecular dynamics (AIMD) models of realistic complexity. Assuming that a forward time integrator, f (e.g., Verlet algorithm), is available to propagate the system from time t_i (trajectory positions and velocities x_i = (r_i, v_i)) to time t_{i+1} (x_{i+1}) by x_{i+1} = f_i(x_i), the dynamics problem spanning an interval from t_0…t_M can be transformed into a root finding problem, F(X) = [x_i - f(x_{i-1})]_{i=1,…,M} = 0, for the trajectory variables. The root finding problem is solved using a variety of root finding techniques, including quasi-Newton and preconditioned quasi-Newton schemes that are all unconditionally convergent. The algorithms are parallelized by assigning a processor to each time-step entry in the columns of F(X). The relation of this approach to other recently proposed parallel in time methods is discussed, and the effectiveness of various approaches to solving the root finding problem is tested. We demonstrate that more efficient dynamical models based on simplified interactions or coarsening time-steps provide preconditioners for the root finding problem. However, for MD and AIMD simulations, such preconditioners are not required to obtain reasonable convergence and their cost must be considered in the performance of the algorithm. The parallel in time algorithms developed are tested by applying them to MD and AIMD simulations of size and complexity similar to those encountered in present day applications. These include a 1000 Si atom MD simulation using Stillinger-Weber potentials, and a HCl + 4H2O AIMD simulation at the MP2 level. The maximum speedup (serial execution time/parallel execution time) obtained by parallelizing the Stillinger-Weber MD simulation was nearly 3.0. For the AIMD MP2 simulations, the algorithms achieved speedups of up to 14.3. The parallel in time algorithms can be implemented in a distributed computing environment using very slow transmission control protocol/Internet protocol networks. Scripts written in Python that make calls to a precompiled quantum chemistry package (NWChem) are demonstrated to provide an actual speedup of 8.2 for a 2.5 ps AIMD simulation of HCl + 4H2O at the MP2/6-31G* level. Implemented in this way these algorithms can be used for long time high-level AIMD simulations at a modest cost using machines connected by very slow networks such as WiFi, or in different time zones connected by the Internet. The algorithms can also be used with programs that are already parallel. Using these algorithms, we are able to reduce the cost of a MP2/6-311++G(2d,2p) simulation that had reached its maximum possible speedup in the parallelization of the electronic structure calculation from 32 s/time step to 6.9 s/time step.
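
    To make the time-parallel formulation concrete, the sketch below builds the residual F(X) = [x_i - f(x_{i-1})]_{i=1,…,M} for a toy one-dimensional harmonic oscillator advanced by velocity Verlet and drives it to zero with a plain fixed-point sweep, the simplest stand-in for the quasi-Newton schemes discussed in the abstract. Each sweep evaluates f independently at every time step, which is where the parallelism over processors would come from; the oscillator, step count, and solver choice are all assumptions for illustration, not the authors' NWChem-coupled implementation.

    import numpy as np

    dt, M = 0.05, 200                              # step size and number of time steps

    def f(state):
        # One velocity-Verlet step for dx/dt = v, dv/dt = -x (unit mass and spring constant);
        # stands in for the expensive MD/AIMD force evaluation.
        x, v = state
        a = -x
        x_new = x + dt * v + 0.5 * dt ** 2 * a
        v_new = v + 0.5 * dt * (a - x_new)         # a_new = -x_new
        return np.array([x_new, v_new])

    x0 = np.array([1.0, 0.0])                      # initial condition x(0) = 1, v(0) = 0
    X  = np.tile(x0, (M, 1))                       # crude initial guess for the whole trajectory

    for sweep in range(M + 1):                     # fixed-point sweeps on F(X) = 0
        shifted = np.vstack([x0, X[:-1]])          # x_{i-1} for every i = 1, ..., M
        X_new = np.array([f(s) for s in shifted])  # embarrassingly parallel over time steps
        residual = np.max(np.abs(X_new - X))       # || F(X) ||_inf at the current iterate
        X = X_new
        if residual < 1e-12:
            break

    print(f"converged after {sweep + 1} sweeps, final residual {residual:.1e}")

    Because correct information propagates exactly one step per sweep, this naive iteration needs up to M sweeps and gains nothing by itself; the quasi-Newton updates (and, optionally, the coarse-model preconditioners) described in the abstract are what cut the iteration count enough for the time-parallel evaluation of f to pay off.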

  18. Extending molecular simulation time scales: Parallel in time integrations for high-level quantum chemistry and complex force representations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bylaska, Eric J.; Weare, Jonathan Q.; Weare, John H.

    2013-08-21

    Parallel in time simulation algorithms are presented and applied to conventional molecular dynamics (MD) and ab initio molecular dynamics (AIMD) models of realistic complexity. Assuming that a forward time integrator f (e.g., the Verlet algorithm) is available to propagate the system from time t_i (trajectory positions and velocities x_i = (r_i, v_i)) to time t_{i+1} (x_{i+1}) by x_{i+1} = f_i(x_i), the dynamics problem spanning an interval from t_0…t_M can be transformed into a root finding problem, F(X) = [x_i - f(x_{i-1})]_{i=1,…,M} = 0, for the trajectory variables. The root finding problem is solved using a variety of optimization techniques, including quasi-Newton and preconditioned quasi-Newton optimization schemes that are all unconditionally convergent. The algorithms are parallelized by assigning a processor to each time-step entry in the columns of F(X). The relation of this approach to other recently proposed parallel in time methods is discussed, and the effectiveness of various approaches to solving the root finding problem is tested. We demonstrate that more efficient dynamical models based on simplified interactions or coarsened time steps provide preconditioners for the root finding problem. However, for MD and AIMD simulations, such preconditioners are not required to obtain reasonable convergence, and their cost must be considered in the performance of the algorithm. The parallel in time algorithms developed are tested by applying them to MD and AIMD simulations of size and complexity similar to those encountered in present day applications. These include a 1000 Si atom MD simulation using Stillinger-Weber potentials, and a HCl+4H2O AIMD simulation at the MP2 level. The maximum speedup obtained by parallelizing the Stillinger-Weber MD simulation was nearly 3.0. For the AIMD MP2 simulations, the algorithms achieved speedups of up to 14.3. The parallel in time algorithms can be implemented in a distributed computing environment using very slow TCP/IP networks. Scripts written in Python that make calls to a precompiled quantum chemistry package (NWChem) are demonstrated to provide an actual speedup of 8.2 for a 2.5 ps AIMD simulation of HCl+4H2O at the MP2/6-31G* level. Implemented in this way, these algorithms can be used for long-time, high-level AIMD simulations at a modest cost using machines connected by very slow networks such as WiFi, or in different time zones connected by the Internet. The algorithms can also be used with programs that are already parallel. By using these algorithms, we are able to reduce the cost of an MP2/6-311++G(2d,2p) simulation that had reached its maximum possible speedup in the parallelization of the electronic structure calculation from 32 seconds per time step to 6.9 seconds per time step.

  19. Extending molecular simulation time scales: Parallel in time integrations for high-level quantum chemistry and complex force representations.

    PubMed

    Bylaska, Eric J; Weare, Jonathan Q; Weare, John H

    2013-08-21

    Parallel in time simulation algorithms are presented and applied to conventional molecular dynamics (MD) and ab initio molecular dynamics (AIMD) models of realistic complexity. Assuming that a forward time integrator, f (e.g., Verlet algorithm), is available to propagate the system from time t_i (trajectory positions and velocities x_i = (r_i, v_i)) to time t_{i+1} (x_{i+1}) by x_{i+1} = f_i(x_i), the dynamics problem spanning an interval from t_0…t_M can be transformed into a root finding problem, F(X) = [x_i - f(x_{i-1})]_{i=1,…,M} = 0, for the trajectory variables. The root finding problem is solved using a variety of root finding techniques, including quasi-Newton and preconditioned quasi-Newton schemes that are all unconditionally convergent. The algorithms are parallelized by assigning a processor to each time-step entry in the columns of F(X). The relation of this approach to other recently proposed parallel in time methods is discussed, and the effectiveness of various approaches to solving the root finding problem is tested. We demonstrate that more efficient dynamical models based on simplified interactions or coarsening time-steps provide preconditioners for the root finding problem. However, for MD and AIMD simulations, such preconditioners are not required to obtain reasonable convergence and their cost must be considered in the performance of the algorithm. The parallel in time algorithms developed are tested by applying them to MD and AIMD simulations of size and complexity similar to those encountered in present day applications. These include a 1000 Si atom MD simulation using Stillinger-Weber potentials, and a HCl + 4H2O AIMD simulation at the MP2 level. The maximum speedup (serial execution time/parallel execution time) obtained by parallelizing the Stillinger-Weber MD simulation was nearly 3.0. For the AIMD MP2 simulations, the algorithms achieved speedups of up to 14.3. The parallel in time algorithms can be implemented in a distributed computing environment using very slow transmission control protocol/Internet protocol networks. Scripts written in Python that make calls to a precompiled quantum chemistry package (NWChem) are demonstrated to provide an actual speedup of 8.2 for a 2.5 ps AIMD simulation of HCl + 4H2O at the MP2/6-31G* level. Implemented in this way these algorithms can be used for long time high-level AIMD simulations at a modest cost using machines connected by very slow networks such as WiFi, or in different time zones connected by the Internet. The algorithms can also be used with programs that are already parallel. Using these algorithms, we are able to reduce the cost of a MP2/6-311++G(2d,2p) simulation that had reached its maximum possible speedup in the parallelization of the electronic structure calculation from 32 s/time step to 6.9 s/time step.

  20. Cosmic Rays with Portable Geiger Counters: From Sea Level to Airplane Cruise Altitudes

    ERIC Educational Resources Information Center

    Blanco, Francesco; La Rocca, Paola; Riggi, Francesco

    2009-01-01

    Cosmic ray count rates with a set of portable Geiger counters were measured at different altitudes on the way to a mountain top and aboard an aircraft, between sea level and cruise altitude. Basic measurements may constitute an educational activity even with high school teams. For the understanding of the results obtained, simulations of extensive…
